Efficiently manage large backups

Discussion created by velistar on Apr 11, 2018
Latest reply on Apr 11, 2018 by wimdecorte

Hi all,


I have a few databases hosted on FMS with a pretty large number of files stored in container fields. These are all externally stored with secure storage. As a result there is a very large number of files and folders: around 160k files, about 26 GB in total (databases plus container data).


Everything runs just fine; it's just that the daily and weekly backups are equally massive, containing the same set of files and folders again and again. Now I want the ability to send these backups offsite, in particular to AWS S3. The real problem is that transferring 160k individual files is quite cumbersome compared to transferring one large 26 GB file.


I tried compressing a backup folder and it takes ages, close to an hour! Still, this could be done automatically late at night, after the end-of-day backup, and then the archive synced to S3, which is just another short command anyway.
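For what it's worth, the compress-then-upload step you describe can be scripted and run on a schedule (e.g. via cron, or an FMS system-level script schedule). The sketch below is only an illustration under assumptions: the backup path shown is the default FMS location on macOS, the bucket name is made up, and it assumes the AWS CLI is installed and configured.

```shell
#!/bin/sh
# Sketch: archive the newest FMS backup folder into a single file,
# then upload that one file to S3. All paths/bucket are placeholders.

offsite_backup() {
    # Newest backup folder (FMS names them with a timestamp suffix);
    # trailing "/" in the glob matches directories only.
    latest=$(ls -td "$BACKUP_ROOT"/*/ 2>/dev/null | head -n 1)
    [ -n "$latest" ] || return 0

    archive="$ARCHIVE_DIR/$(basename "$latest").tar.gz"
    # Single-threaded gzip may be what makes compression take an hour;
    # if pigz is installed, `tar -I pigz -cf ...` uses all cores instead.
    tar -czf "$archive" -C "$(dirname "$latest")" "$(basename "$latest")"

    # One large object transfers far faster than 160k small ones.
    if command -v aws >/dev/null 2>&1; then
        aws s3 cp "$archive" "$BUCKET/"
    fi
}

BACKUP_ROOT="${BACKUP_ROOT:-/Library/FileMaker Server/Data/Backups}"
ARCHIVE_DIR="${ARCHIVE_DIR:-/tmp}"
BUCKET="${BUCKET:-s3://my-fms-offsite}"   # hypothetical bucket name
offsite_backup
```

All three variables can be overridden from the environment, so the same script works for daily and weekly schedules by pointing BACKUP_ROOT at the relevant folder.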


I am wondering whether there is a more efficient way to handle this process, or whether you would suggest an alternative strategy.


Although I am not entirely certain about this: if I move the container field folders outside of the main Database folder, would FMS know that the folder is related to the DB and back it up? If not, it would only back up the DBs, which is fine, as they are only a few files that are easily compressed and uploaded to S3. The remaining container data could then be synced directly to S3 without keeping multiple copies locally. That would create another problem, though, in case I ever need to recover a stored file: there would be no way to know which file to retrieve from S3, which rather defeats the purpose!
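On the idea of copying container data straight to S3: `aws s3 sync` only transfers new or changed files, so after the initial upload a nightly run would touch only a small fraction of the 160k files rather than re-sending everything. A minimal sketch, assuming the AWS CLI is configured; the container-data path and bucket name are placeholders, and the `run` wrapper just allows a dry run:

```shell
#!/bin/sh
# Sketch: incremental offsite copy of externally stored container data.
# Path and bucket name are placeholders -- adjust for your server.
RC_DATA_DIR="${RC_DATA_DIR:-/Library/FileMaker Server/Data/Databases/RC_Data_FMS}"
BUCKET="${BUCKET:-s3://my-fms-offsite}"   # hypothetical bucket name

# With DRY_RUN=1 the command is printed instead of executed.
run() {
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$@"
    else
        "$@"
    fi
}

# `aws s3 sync` uploads only new/changed files; --delete mirrors
# deletions to S3, so drop it if removed files should stay offsite.
if command -v aws >/dev/null 2>&1 && [ -d "$RC_DATA_DIR" ]; then
    run aws s3 sync "$RC_DATA_DIR" "$BUCKET/RC_Data_FMS" --delete
fi
```

The recovery concern still stands, though: this keeps a current mirror, not point-in-time copies, so without S3 versioning enabled on the bucket there is no way back to an older state of an individual file.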


Any input would help!