dhis2-devs team mailing list archive

Re: [Dhis2-users] Creating Sync between Linode (External Server) and Local Server

 

Bob,
Sorry about the GUI application I recommended. I was only trying to make a
point to explain my idea, and was also thinking of those regional servers
(because they are running desktop Ubuntu).
Bob,
Your account is still there and you can ssh in.
Sorry all for my late responses, but I will be giving more input in an hour
or two from now.

Regards,
Gerald
On 18 Dec 2014 14:18, "Bob Jolliffe" <bobjolliffe@xxxxxxxxx> wrote:

>
> I think Steffen put his finger on it when he said that the backup should
> be restored (and hence tested) as part of the same scripted operation.  But
> you make a good point about not having a dhis2 instance running live
> against that database as it would disturb the integrity of the backup.
>
> It's also important to have a notion of generations of backups.  If you
> just have the production database and the backup, then when things go bad
> on the production server you don't want to overwrite your good backup with
> a bad one.
>
> You can't keep daily backups forever as you will rapidly run out of space
> or budget.  My preference is to keep:
> 6 days of daily backups
> 6 weeks of weekly backups
> some number of monthly backups
> etc
>
> This way as you roll into the future your disk usage doesn't grow too
> rapidly.
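>
> As a concrete (untested) sketch of that kind of rotation - the directory
> and filename patterns here are only illustrative:
>
> #!/bin/bash
> # rotate-backups.sh: prune old dumps under a generational scheme
> BACKUP_DIR=/var/backups/dhis2
>
> # keep the 6 most recent daily dumps, delete the rest
> ls -1t "$BACKUP_DIR"/dhis2-daily-*.sql.gz | tail -n +7 | xargs -r rm --
>
> # keep the 6 most recent weekly dumps
> ls -1t "$BACKUP_DIR"/dhis2-weekly-*.sql.gz | tail -n +7 | xargs -r rm --
>
> # keep the 12 most recent monthly dumps
> ls -1t "$BACKUP_DIR"/dhis2-monthly-*.sql.gz | tail -n +13 | xargs -r rm --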
>
> On 18 December 2014 at 13:27, Dan Cocos <dan@xxxxxxxxxxxx> wrote:
>>
>> Hi All,
>>
>> I think there are two concerns being discussed here.
>>
>> 1) Making sure there is a reliable backup in case something goes wrong.
>>
>> The first problem is pretty straightforward: one can create another
>> instance in another region, with another provider, or locally, and then
>> schedule a regular backup to that server. However, I don’t recommend that
>> the local server actively run DHIS 2, because any changes made to that
>> server will be lost on the next update from the cloud instance. Merging
>> DBs is a difficult problem and causes more headache than it is worth.
>>
>> Depending on how far back you’d like your backups to go, this will start
>> to consume a lot of disk space.
>>
>> If the cloud server goes down you can be assured that your data is safe
>> because you’ll have a copy of the database either on another cloud server
>> or locally.
>>
>> Incremental backups can be good for low bandwidth, but my concerns are
>> restore time, and the fact that if one of the increments is corrupted it
>> can cause a lot of problems.
>>
>> Some cloud providers also offer storage/backup solutions that can address
>> this concern.
>>
>> 2) Failover in the event the cloud server goes down.
>>
>> This is a more complex problem. It can be addressed by having standby
>> servers in different regions, which allows for failover in the event of
>> an outage, but it has to be carefully planned and starts to get expensive,
>> as you’ve essentially doubled or tripled the number of instances/servers
>> you’d need available. It also requires careful planning to make sure there
>> is a clear failover plan, in addition to a clear plan to restore to the
>> initial setup.
>>
>> —
>> Executive summary
>> 1) Reliable backups are pretty straightforward and can be cost-effective.
>> 2) Failover can be addressed, but it is a complex problem and starts to
>> get expensive.
>>
>> Lastly, and most importantly, test on a regular basis to make sure that
>> you are able to restore from backups in the event of a failure.
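>>
>> A minimal restore check could be as simple as the following (the database
>> name and dump path are placeholders for your own setup):
>>
>> # restore last night's dump into a scratch database and run a sanity query
>> createdb dhis2_restore_test
>> gunzip -c /var/backups/dhis2/dhis2-daily-latest.sql.gz | psql -q dhis2_restore_test
>> psql -d dhis2_restore_test -c "select count(*) from datavalue;"
>> dropdb dhis2_restore_test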
>>
>> Thanks,
>> Dan
>>
>>
>>
>> *Dan Cocos*
>> BAO Systems
>> www.baosystems.com
>> T: +1 202-352-2671 | skype: dancocos
>>
>> On Dec 18, 2014, at 7:53 AM, Steffen Tengesdal <steffen@xxxxxxxxxxxxx>
>> wrote:
>>
>> Hi Gerald,
>>
>> As Bob pointed out, FileZilla is a GUI tool and it does not support
>> scheduling of downloads.  Your local server should not have a GUI on it if
>> it is a production system.  Is your local host a Linux system? If so, you
>> can create a simple bash script on the local host that uses the sftp or
>> scp command line to connect and download a backup.  A script for that
>> would not be very complicated.
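>>
>> A rough sketch of such a script (the host name, user and paths are
>> placeholders for your own setup, and it assumes an ssh key is already in
>> place so scp does not prompt for a password):
>>
>> #!/bin/bash
>> # fetch-backup.sh: copy last night's dump down from the cloud server
>> REMOTE=dhis@cloud.example.org
>> REMOTE_DIR=/var/lib/dhis2/dhis/backups
>> LOCAL_DIR=/var/backups/dhis2
>>
>> mkdir -p "$LOCAL_DIR"
>> # adjust the filename pattern to your backup naming scheme
>> scp "$REMOTE:$REMOTE_DIR/*.sql.gz" "$LOCAL_DIR/"
>>
>> Run it from cron on the local box, e.g.:
>> 30 2 * * * /usr/local/bin/fetch-backup.sh >> /var/log/fetch-backup.log 2>&1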
>>
>> Steffen
>>
>> On Dec 18, 2014, at 7:47 AM, gerald thomas <gerald17006@xxxxxxxxx> wrote:
>>
>> Bob,
>> My suggestion:
>> All local servers must be on the 2.15 war file; then we create an SFTP
>> account on the cloud server, and we can use FileZilla from the local
>> server to download the backup from the cloud server.
>> I know it is crude, but that will help for now.
>> What is your take, Bob?
>>
>> On 12/18/14, Bob Jolliffe <bobjolliffe@xxxxxxxxx> wrote:
>>
>> Hi Gerald
>>
>> We tested this when I was in Sierra Leone and we were finding serious
>> problems with bandwidth getting the data back to Sierra Leone.
>>
>> So you are going to have to think carefully about when and how often to
>> synch.  Currently your database files are very small as you don't have
>> much data on your cloud server, but they will soon grow.  I suspect "at
>> least twice a day" is unrealistic.
>>
>> The way I typically do it is to first create an account on the backup
>> server.  Make sure that the account running your dhis instance can log in
>> to the backup server without a password, by creating an ssh key pair and
>> installing the public key on the backup server account.  Then you can
>> simply rsync the backups directory (eg /var/lib/dhis2/dhis/backups) to
>> a directory on the backup server using cron.  In fact if you look in
>> /usr/bin/dhis2-backup you will see that the commands are already there to
>> do this, just commented out.  This would synch with the backup server
>> after taking the nightly backup.
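>>
>> For the record, that push setup boils down to something like this (the
>> host name and user are placeholders; the backups path is the one
>> mentioned above):
>>
>> # one-off, as the user running dhis2 on the production server
>> ssh-keygen -t rsa          # accept the defaults, empty passphrase
>> ssh-copy-id backup@backupserver.example.org
>>
>> # then in that user's crontab, a little after the nightly backup runs
>> 30 3 * * * rsync -az /var/lib/dhis2/dhis/backups/ backup@backupserver.example.org:/srv/dhis2-backups/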
>>
>> This simple (and slightly lazy) setup has worked fine, and continues to
>> work, in a number of places.  But there are a number of reasons you might
>> want to do something different.
>>
>> (i)  You might want the backup server to pull, rather than have the
>> production server push to it, particularly as the backup server might not
>> be as reliably always online as the production server.  This would require
>> a slightly different variation on the above, but using the same principle
>> of creating an ssh keypair and letting rsync do the work.
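>>
>> In that variation the key pair lives on the backup server instead, and a
>> cron job there does the pulling (again, names and paths are only
>> placeholders):
>>
>> 30 3 * * * rsync -az dhis@prod.example.org:/var/lib/dhis2/dhis/backups/ /srv/dhis2-backups/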
>>
>> (ii) rsync is a really great and simple tool, but it is sadly quite slow.
>> If you are bandwidth stressed and your database is growing, it might not
>> be the best solution.  It works fine when bandwidth is not a critical
>> issue.  The trouble is it doesn't really take into account the incremental
>> nature of the data, i.e. you back up everything every time (besides the
>> ephemeral tables like analytics, aggregated etc).  In that case you need
>> to start thinking smarter and maybe a little bit more elaborately.  One
>> approach I have been considering (but not yet tried) is to make a copy of
>> the metadata export every night and then just pull all the datavalues with
>> a lastupdated greater than the last time you pulled.  That is going to
>> reduce the size of the backup quite considerably.  In theory this is
>> probably even possible to do through the api rather than directly through
>> psql, which might be fine if you choose the time of day/night carefully.
>> I'd probably do it with psql at the back end.
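>>
>> Untried, but the psql side of that might look roughly like the following
>> (the datavalue table does have a lastupdated column; the cutoff file,
>> paths and credentials are made up for the example, and the metadata
>> export URL may differ between versions):
>>
>> #!/bin/bash
>> # incremental-datavalues.sh: dump only datavalues changed since last run
>> SINCE=$(cat /var/backups/dhis2/last-run 2>/dev/null || echo '1970-01-01')
>> NOW=$(date +%F)
>>
>> psql -d dhis2 -c "\copy (select * from datavalue where lastupdated >= '$SINCE') to '/var/backups/dhis2/datavalue-$NOW.csv' csv header"
>>
>> # nightly copy of the metadata export as well
>> curl -u admin:password -o /var/backups/dhis2/metadata-$NOW.xml "http://localhost:8080/api/metaData.xml"
>>
>> echo "$NOW" > /var/backups/dhis2/last-run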
>>
>> So there are a few options.  The first being the simplest and also the
>> crudest.  Any other thoughts?
>>
>> Cheers
>> Bob
>>
>> On 18 December 2014 at 05:07, gerald thomas <gerald17006@xxxxxxxxx>
>> wrote:
>>
>>
>> Dear All,
>> Sierra Leone wants to finally migrate to an online server (an external
>> server hosted outside the Ministry), but we would like to create a daily
>> backup of that server locally in case anything goes wrong.
>> My questions:
>>
>> 1. We need help with a script that can create a sync between the
>> External Server and the Local Server (at least twice a day).
>>
>> 2. Is there something we should know from past experience about
>> hosting servers on the cloud?
>>
>> Please feel free to share anything; I will be grateful to learn new
>> things about DHIS2.
>>
>> --
>> Regards,
>>
>> Gerald
>>
>> --
>> Regards,
>>
>> Gerald