FileMaker Offsite Backups and large Remote Container storage

Discussion created by taylorsharpe on Jul 30, 2015
Latest reply on Sep 5, 2017 by mvangieson10

I have worked up a backup system I'm testing on a development server for off-site backups. It would be nice if the Standby Server feature worked this way, but I understand it is really designed only for another computer on the same subnet.


I used to use 360Works' SafetyNet, which was pretty slick. It basically just watches a file or folder and backs it up to the Amazon cloud. It's a great idea, except that it copies the entire database each time. That is not bad if you have a medium-sized database, but if you have a large database, especially one with a lot of remote container storage, then copying the whole file every time gets unwieldy, if not impossible, depending on your internet speeds. It sure would be nice if it did incremental syncing.


What is needed are incremental backups, especially for the remote container storage, where most of the data does not change. One thing we know from FileMaker is that you never want to scan, copy, or otherwise touch a live FileMaker database. But I'm pretty sure that only applies to the database itself and not the remote container storage (confirmation????).


I am using a Mac and like Carbon Copy Cloner (CCC), a program that can copy a whole disk or, in my case, watch a folder and its subfolders and copy only the changes to a remote storage location (similar to SafetyNet). What I have done is create a CCC task that watches the live remote container storage for this one file and does a backup each night, syncing only the changed files. This goes MUCH faster than copying the entire remote container storage.
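The "copy only the changes" idea boils down to comparing each file against its copy at the destination. Here is a minimal Python sketch of that logic, comparing modification time and size (CCC does this far more robustly; the source and destination paths are placeholders):

```python
import os
import shutil

def sync_changed(src_root, dst_root):
    """Copy only files that are new or changed (by mtime/size) from src_root
    into dst_root, preserving the folder structure."""
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        dst_dir = os.path.join(dst_root, rel)
        os.makedirs(dst_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_dir, name)
            s = os.stat(src)
            if (not os.path.exists(dst)
                    or os.path.getmtime(dst) < s.st_mtime
                    or os.path.getsize(dst) != s.st_size):
                shutil.copy2(src, dst)  # copy2 preserves the modification time
```

Because unchanged container files are skipped entirely, a nightly run only ever moves the day's new or edited files.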


Separately, I wrote a scheduled script using the MBS plugin on the server that looks for the latest backup of this file in the backups folder and copies that file to another folder on the server. CCC watches that folder and backs up that FileMaker file each night (the *.fmp12 file only, not any remote container storage).
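The scheduled script's logic amounts to "find the newest backup copy of the database and stage it where CCC is watching." This is a hedged Python sketch of that idea, not the actual MBS/FileMaker script; the folder paths and the database filename are assumptions:

```python
import glob
import os
import shutil

def stage_latest_backup(backups_root, staging_folder, db_name="MyDatabase.fmp12"):
    """Find the most recently modified backup of db_name anywhere under
    backups_root and copy it into staging_folder for the nightly sync."""
    candidates = glob.glob(os.path.join(backups_root, "**", db_name), recursive=True)
    if not candidates:
        raise FileNotFoundError(f"No backups of {db_name} under {backups_root}")
    latest = max(candidates, key=os.path.getmtime)  # newest backup wins
    os.makedirs(staging_folder, exist_ok=True)
    shutil.copy2(latest, os.path.join(staging_folder, db_name))
    return latest
```

The key point is that the script only ever touches a closed backup copy, never the live hosted file.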


This seems to address the problem of a FileMaker file with a lot of remote container storage by syncing (copying/deleting) only the changes from day to day rather than the entire remote container storage. Since there is a lot less to transfer, the process works well across the internet. I now get both the FileMaker file and the remote container storage backed up off site each night.


Obviously the easiest thing is to just copy all of your backups, but as your remote container storage grows, the copying can take a really long time. This is a workaround that minimizes how much has to be copied each time, making remote backups practical when it would otherwise take many hours to sync the entire remote container storage each night.


Probably the trickiest part was writing the script that watches the backups folder, finds the latest backup, and copies that file to another folder. While I use MBS, you can use any server plugin that lets you manipulate files and list folder contents. Just remember that the file you are copying, and the folder it is going to, must be accessible (read/write) by the fmserver user, especially if you are on a Mac. If anyone is interested, I can post that script.
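Before scheduling the script, it is worth sanity-checking that the fmserver user can actually reach both paths. A rough standard-library sketch of such a check (the username is a placeholder; a complete check would also inspect group membership and ACLs):

```python
import os
import pwd
import stat

def readable_writable_by(path, username="fmserver"):
    """Rough check: if `username` owns the path, require owner read/write bits;
    otherwise fall back to requiring the 'other' read/write bits."""
    st = os.stat(path)
    owner = pwd.getpwuid(st.st_uid).pw_name
    if owner == username:
        return bool(st.st_mode & stat.S_IRUSR) and bool(st.st_mode & stat.S_IWUSR)
    return bool(st.st_mode & stat.S_IROTH) and bool(st.st_mode & stat.S_IWOTH)
```

If this returns False for either the backup file or the staging folder, the scheduled script will fail silently on the server, which is an easy thing to miss.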


Note that CCC is good at connecting to other servers for syncing (e.g., SMB, AFP, and maybe FTP), but it does not support Amazon storage, which uses its own protocol.


I'm open to any better suggestions for off-site backups over the internet where bandwidth is limited.