
dhis2-devs team mailing list archive

Re: Creating Sync between Linode(External Server) and Local Server

 

We’ve set this up for clients: the DB move from one server to another is fully scripted and runs automatically.

Typically we have scripts that do a dump without analytics (e.g. pg_dump -T 'analytics*' -T 'completeness*' dhis2 | /usr/bin/gzip -c > /tmp/dhis2.backup.gz), so the backup to move is much smaller (since bandwidth is a consideration). We then transfer it securely to the new server using SSH key pairs, and a script there drops the existing DB (after backing it up) and imports the new one. We typically schedule this transfer from a bash script as a cron job, running nightly or during off-peak hours, since analytics also needs to be re-run on the local server once the DB has been moved.
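
A minimal sketch of such a script (the backup host, account name, and paths below are assumptions; adjust them for your own setup):

    #!/bin/bash
    # Dump the DB without the generated analytics/completeness tables,
    # compress it, and push it over SSH using the installed key pair.
    set -e
    BACKUP=/tmp/dhis2.backup.gz
    pg_dump -T 'analytics*' -T 'completeness*' dhis2 | gzip -c > "$BACKUP"
    scp "$BACKUP" dhis@backup.example.org:/var/backups/dhis2/   # key-based auth, no password prompt
    rm -f "$BACKUP"

A crontab entry such as "30 1 * * * /usr/local/bin/dhis2-transfer.sh" would then run it nightly at 01:30.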

There are many ways to script this: rsync can work, pg_dump can also back up on one machine and restore to another, scp, etc. Whichever you use, we highly recommend SSH key pairs to keep the transfer secure.
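
For instance, the dump can be streamed straight to the other machine over SSH in one step, with nothing written to local disk (host and path are again assumptions):

    pg_dump -T 'analytics*' -T 'completeness*' dhis2 | gzip -c \
      | ssh dhis@backup.example.org 'cat > /var/backups/dhis2/dhis2.backup.gz'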

Steffen Tengesdal
BAO Systems


> On Dec 18, 2014, at 7:13 AM, gerald thomas <gerald17006@xxxxxxxxx> wrote:
> 
> Bob,
> My suggestion:
> All local servers must be on the 2.15 WAR file; then we create an SFTP
> account on the cloud server, and we can use FileZilla from each local
> server to download the backup from the cloud server.
> I know it is crude, but it helps for now.
> What is your take, Bob?
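> 
> A rough sketch of that download scripted with sftp in batch mode rather
> than FileZilla (the account name and paths are my assumptions):
> 
>     # fetch last night's backup from the cloud server over SFTP
>     echo 'get /var/lib/dhis2/dhis/backups/dhis2.backup.gz /backups/' \
>       | sftp -b - sftpuser@cloud.example.org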
> 
> On 12/18/14, Bob Jolliffe <bobjolliffe@xxxxxxxxx> wrote:
>> Hi Gerald
>> 
>> We tested this when I was in Sierra Leone and we were finding serious
>> problems with bandwidth getting the data back to Sierra Leone.
>> 
>> So you are going to have to think carefully about when and how often to
>> synch.  Currently your database files are very small as you don't have much
>> data on your cloud server, but it will soon grow.  I suspect "at least
>> twice a day" is unrealistic.
>> 
>> The way I typically do it is to first create an account on the backup
>> server.  Make sure that the account running your dhis instance can login to
>> the backup server without a password by creating an ssh key pair and
>> installing the public key on the backup server account.  Then you can
>> simply rsync the backups directory (e.g. /var/lib/dhis2/dhis/backups) to
>> a directory on the backup server using cron.  In fact, if you look in
>> /usr/bin/dhis2-backup you will see that the commands are already there to
>> do this, just commented out.  This would synch with the backup server after
>> taking the nightly backup.
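>> 
>> As a minimal sketch of that setup (user, host, and schedule are
>> assumptions; the ready-made commands to uncomment are the ones in
>> /usr/bin/dhis2-backup):
>> 
>>     # one-time setup: create a key pair and install the public key
>>     # on the backup server account
>>     ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
>>     ssh-copy-id dhis@backup.example.org
>> 
>>     # crontab entry: push the backups directory nightly at 02:00,
>>     # after the nightly backup has been taken
>>     0 2 * * * rsync -az /var/lib/dhis2/dhis/backups/ dhis@backup.example.org:/var/backups/dhis2/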
>> 
>> This simple (and slightly lazy) setup has worked fine, and continues to
>> work, in a number of places.  But there are several reasons you might
>> want to do something different.
>> 
>> (i) you might want the backup server to pull the backups rather than have
>> production push them, particularly as the backup server might not be as
>> reliably online as the production server.  This would require a slight variation
>> on the above, but using the same principle of creating an ssh keypair and
>> letting rsync do the work.
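>> 
>> In that pull variant the cron job lives on the backup server instead;
>> roughly (host and paths again assumed):
>> 
>>     # on the backup server: pull the backups from production at 03:00
>>     0 3 * * * rsync -az dhis@production.example.org:/var/lib/dhis2/dhis/backups/ /var/backups/dhis2/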
>> 
>> (ii) rsync is a really great and simple tool, but it is sadly quite slow.
>> If you are bandwidth stressed and your database is growing it might not be
>> the best solution.  Works fine when bandwidth is not a critical issue.  The
>> trouble is it doesn't really take into account the incremental nature of
>> the data, i.e. you back up everything every time (besides the ephemeral tables
>> like analytics, aggregated, etc.).  In which case you need to start thinking
>> smarter and maybe a little bit more complicated.  One approach I have been
>> considering (but not yet tried) is to make a copy of the metadata export
>> every night and then just pull all the datavalues with a lastupdated
>> greater than the last time you pulled.  That is going to reduce the size of
>> the backup quite considerably.  In theory this is probably even possible to
>> do through the api rather than directly through psql, which might be fine if
>> you choose the time of day/night carefully.  I'd probably do it with psql
>> at the back end.
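>> 
>> To make that concrete, here is an untested sketch of the psql side
>> (datavalue and lastupdated are the actual table and column names; the
>> cutoff file and output paths are invented):
>> 
>>     # export only the data values changed since the last pull
>>     LAST=$(cat /var/backups/dhis2/last-pull 2>/dev/null || echo '1970-01-01')
>>     psql -d dhis2 -c "\copy (select * from datavalue where lastupdated > '$LAST') to '/tmp/datavalues.csv' csv header"
>>     date -u +'%Y-%m-%d %H:%M:%S' > /var/backups/dhis2/last-pull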
>> 
>> So there are a few options.  The first being the simplest and also the
>> crudest.  Any other thoughts?
>> 
>> Cheers
>> Bob
>> 
>> On 18 December 2014 at 05:07, gerald thomas <gerald17006@xxxxxxxxx> wrote:
>>> 
>>> Dear All,
>>> Sierra Leone wants to finally migrate to an online server (External
>>> server hosted outside the Ministry), but we would like to create a daily
>>> backup of that server locally in case anything goes wrong.
>>> My questions:
>>> 
>>> 1.  We need help with a script that can create a sync between the
>>> External Server and the Local Server (at least twice a day)
>>> 
>>> 2. Is there something we should know from past experiences about
>>> hosting servers on the cloud?
>>> 
>>> Please feel free to share anything and I will be grateful to learn new
>>> things about dhis2
>>> 
>>> --
>>> Regards,
>>> 
>>> Gerald
>>> 
>> 
> 
> 
> -- 
> Regards,
> 
> Gerald
> 
