
dhis2-devs team mailing list archive

Re: Creating Sync between Linode(External Server) and Local Server

 

Hi All,

I think there are two concerns being discussed here.

1) Making sure there is a reliable backup in case something goes wrong.

The first problem is fairly straightforward: create another instance in another region, with another provider, or locally, and then schedule a regular backup to that server. However, I don’t recommend that the local server actively run DHIS 2, because any changes made to that server will be lost on the next update from the cloud instance. Merging databases is a difficult problem and causes more headache than it is worth.

Depending on how far back you’d like your backups to go, this will start to consume a lot of disk space.
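
For illustration, a minimal sketch of what that scheduled copy could look like, with a simple age-based cleanup to keep disk use in check (the paths, account, host name and retention period below are examples only, not part of a standard DHIS 2 setup):

  #!/bin/sh
  # push-backup.sh -- a sketch only; paths, account and host are examples
  BACKUP_DIR=/var/lib/dhis2/dhis/backups            # where the nightly dumps are written
  REMOTE=backup@standby.example.org:/srv/dhis2-backups/
  # copy the most recent dump to the standby server (key-based ssh login assumed)
  LATEST=$(ls -t "$BACKUP_DIR"/*.sql.gz | head -n 1)
  scp "$LATEST" "$REMOTE"
  # keep disk use in check: drop local dumps older than 30 days
  find "$BACKUP_DIR" -name '*.sql.gz' -mtime +30 -delete

  # scheduled nightly via cron, e.g.:  30 3 * * * /usr/local/bin/push-backup.sh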

If the cloud server goes down you can be assured that your data is safe because you’ll have a copy of the database either on another cloud server or locally.

Incremental backups can be good for low bandwidth, but my concerns are restore time and the fact that a single corrupted increment can cause a lot of problems.

Some cloud providers also offer storage/backup solutions that can address this concern. 

2) Failover in the event the cloud server goes down.

This is a more complex problem. It can be addressed by having standby servers in different regions, which allows for failover in the event of an outage, but it has to be carefully planned and starts to get expensive, as you’ve essentially doubled or tripled the number of instances/servers you need available. It also requires a clear failover plan in addition to a clear plan for restoring the initial setup.

—
Executive summary 
1) Reliable backups are fairly straightforward and can be cost-effective.
2) Failover can be addressed, but it is a complex problem and starts to get expensive.

Lastly, and most importantly, test on a regular basis to make sure that you are able to restore from your backups in the event of a failure.
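
As a rough illustration, a periodic restore test can be as simple as loading the most recent dump into a scratch database and running a sanity query (the paths and names here are examples only):

  #!/bin/sh
  # verify-restore.sh -- a sketch of a periodic restore test; names and paths are examples
  LATEST=$(ls -t /srv/dhis2-backups/*.sql.gz | head -n 1)
  dropdb --if-exists dhis2_restore_test
  createdb dhis2_restore_test
  # load the dump into the scratch database
  gunzip -c "$LATEST" | psql -q dhis2_restore_test
  # simple sanity check: the datavalue table should contain rows
  psql -t -c "select count(*) from datavalue;" dhis2_restore_test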

Thanks,
Dan



Dan Cocos
BAO Systems
www.baosystems.com
T: +1 202-352-2671 | skype: dancocos

> On Dec 18, 2014, at 7:53 AM, Steffen Tengesdal <steffen@xxxxxxxxxxxxx> wrote:
> 
> Hi Gerald,
> 
> As Bob pointed out, FileZilla is a GUI tool and it does not support scheduling of downloads.  Your local server should not have a GUI on it if it is a production system.  Is your local host a Linux system?  If so, you can create a simple bash script on the localhost system that uses the sftp or scp command line to connect and download a backup.  A script for that would not be very complicated.
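> 
> A minimal sketch of such a script (the host, account and directory names are just examples):
> 
>   #!/bin/bash
>   # pull-backup.sh -- sketch only; host, account and directories are examples
>   REMOTE=dhis@cloud.example.org
>   REMOTE_DIR=/var/lib/dhis2/dhis/backups
>   LOCAL_DIR=/srv/dhis2-backups
>   # copy the nightly dumps over ssh (key-based login assumed, so no password prompt)
>   scp "$REMOTE:$REMOTE_DIR/*.sql.gz" "$LOCAL_DIR/"
> 
> Scheduled from cron on the local host, e.g. "0 4 * * * /usr/local/bin/pull-backup.sh", that gives you an unattended download.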
> 
> Steffen
> 
>> On Dec 18, 2014, at 7:47 AM, gerald thomas <gerald17006@xxxxxxxxx> wrote:
>> 
>> Bob,
>> My Suggestion:
>> All local servers must be on the 2.15 war file; then we create an SFTP
>> account on the cloud server, and we can use FileZilla from the local
>> server to download the backup from the cloud server.
>> I know it is crude, but it will help for now.
>> What is your take, Bob?
>> 
>> On 12/18/14, Bob Jolliffe <bobjolliffe@xxxxxxxxx> wrote:
>>> Hi Gerald
>>> 
>>> We tested this when I was in Sierra Leone and we were finding serious
>>> problems with bandwidth getting the data back to Sierra Leone.
>>> 
>>> So you are going to have to think carefully about when and how often to
>>> synch.  Currently your database files are very small, as you don't have much
>>> data on your cloud server, but they will soon grow.  I suspect "at least
>>> twice a day" is unrealistic.
>>> 
>>> The way I typically do it is to first create an account on the backup
>>> server.  Make sure that the account running your dhis instance can login to
>>> the backup server without a password by creating an ssh key pair and
>>> installing the public key on the backup server account.  Then you can
>>> simply rsync the backups directory (e.g. /var/lib/dhis2/dhis/backups) to
>>> a directory on the backup server using cron.   In fact if you look in
>>> /usr/bin/dhis2-backup you will see that the commands are already there to
>>> do this, just commented out.  This would synch with the backup server after
>>> taking the nightly backup.
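>>> 
>>> Roughly, with example names for the backup host and account, that setup looks like:
>>> 
>>>   # one-time setup on the production server, as the user running DHIS 2
>>>   ssh-keygen -t rsa                      # create the key pair (accept the defaults)
>>>   ssh-copy-id backup@backuphost.example  # install the public key on the backup account
>>> 
>>>   # nightly cron entry to push the backups directory
>>>   # (or uncomment the equivalent lines in /usr/bin/dhis2-backup)
>>>   0 3 * * * rsync -az /var/lib/dhis2/dhis/backups/ backup@backuphost.example:/srv/dhis2-backups/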
>>> 
>>> This simple (and slightly lazy) setup has worked fine, and continues to
>>> work, in a number of places.  But there are a number of reasons you might
>>> want to do something different.
>>> 
>>> (i)  you might want to pull from the backup server rather than push to it.
>>> Particularly as the backup server might not be as reliably always online as
>>> the production server.  This would require a slightly different variation
>>> on the above, but using the same principle of creating an ssh keypair and
>>> letting rsync do the work.
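>>> 
>>> In that case the key pair lives on the backup server instead, and the cron entry there pulls rather than pushes, e.g. (again just a sketch with example names):
>>> 
>>>   # on the backup server: pull last night's backups from production at 04:00
>>>   0 4 * * * rsync -az dhis@production.example:/var/lib/dhis2/dhis/backups/ /srv/dhis2-backups/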
>>> 
>>> (ii) rsync is a really great and simple tool, but it is sadly quite slow.
>>> If you are bandwidth stressed and your database is growing it might not be
>>> the best solution.  Works fine when bandwidth is not a critical issue.  The
>>> trouble is it doesn't really take into account the incremental nature of
>>> the data, i.e. you back up everything every time (besides the ephemeral tables
>>> like analytics, aggregated etc).  In which case you need to start thinking
>>> smarter and maybe a little bit more complicated.  One approach I have been
>>> considering, (but not yet tried) is to make a copy of the metadata export
>>> every night and then just pull all the datavalues with a lastupdated
>>> greater than the last time you pulled.  That is going to reduce the size of
>>> the backup quite considerably.  In theory this is probably even possible to
>>> do through the api rather than directly through psql which might be fine if
>>> you choose the time of day/night carefully.  I'd probably do it with psql
>>> at the back end.
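>>> 
>>> With psql that incremental pull could be sketched roughly as below (the datavalue table and its lastupdated column are standard DHIS 2 schema; the host, database, paths and timestamp file are examples, and the nightly metadata export is not shown):
>>> 
>>>   # on the backup side: remember when we last pulled, then fetch only newer datavalues
>>>   SINCE=$(cat /srv/dhis2-backups/last_pull 2>/dev/null || echo '1970-01-01')
>>>   psql -h production.example -U dhis dhis2 -c \
>>>     "\copy (select * from datavalue where lastupdated > '$SINCE') to '/srv/dhis2-backups/datavalue-delta.csv' csv header"
>>>   date -u +'%Y-%m-%d %H:%M:%S' > /srv/dhis2-backups/last_pull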
>>> 
>>> So there are a few options.  The first being the simplest and also the
>>> crudest.  Any other thoughts?
>>> 
>>> Cheers
>>> Bob
>>> 
>>> On 18 December 2014 at 05:07, gerald thomas <gerald17006@xxxxxxxxx> wrote:
>>>> 
>>>> Dear All,
>>>> Sierra Leone wants to finally migrate to an online server (External
>>>> server hosted outside the Ministry) but we will like to create a daily
>>>> backup of that server locally in case anything goes wrong.
>>>> My questions:
>>>> 
>>>> 1.  We need help with a script that can create a sync between the
>>>> External Server and the Local Server (at least twice a day)
>>>> 
>>>> 2. Is there something we should know from past experiences about
>>>> hosting servers on the cloud?
>>>> 
>>>> Please feel free to share anything and I will be grateful to learn new
>>>> things about dhis2
>>>> 
>>>> --
>>>> Regards,
>>>> 
>>>> Gerald
>>>> 
>>>> 
>>> 
>> 
>> 
>> -- 
>> Regards,
>> 
>> Gerald
>> 
