We're looking for ways to move large sets of data from a dev server to QA, then staging, then production. We have about 1.5 TB of assets in externally stored secure containers and about 500 GB of data in a few files, using the separation model.
Our initial idea is to save the assets to the staging and production servers first, then import any deltas once we move to that environment. Anything to avoid re-copying 1.5 TB of assets each time. That covers the assets; the databases are another story.
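For what it's worth, the "seed once, then push deltas" idea we have in mind is roughly the following. This is a minimal Python sketch, not our actual tooling; the size/mtime comparison is an assumption, and in practice something like rsync or robocopy would do this more robustly:

```python
import shutil
from pathlib import Path

def sync_deltas(src_dir, dst_dir):
    """Copy only files that are new or changed (judged by size and mtime)
    from src_dir to dst_dir, skipping everything already up to date.

    Illustrative sketch only -- a real asset sync would want hashing,
    deletion handling, and logging.
    """
    src_dir, dst_dir = Path(src_dir), Path(dst_dir)
    copied = []
    for src in src_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(src_dir)
        dst = dst_dir / rel
        if dst.exists():
            s, d = src.stat(), dst.stat()
            # Same size and destination at least as new: assume unchanged.
            if s.st_size == d.st_size and int(s.st_mtime) <= int(d.st_mtime):
                continue
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves mtime for the next comparison
        copied.append(str(rel))
    return copied
```

The point is simply that after the initial seed, each subsequent push touches only the handful of changed containers rather than the full 1.5 TB.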
We need to automate the push of data from Dev to QA (we can simply copy the backup files for this step). For Staging and Production, though, we need a way to wipe out the data, import the production records back in, and verify it all works, at the push of a button, so someone in IT can do this as needed without our intervention. There are currently 34 tables in the main data file, with the largest table having about 400,000 records. A manual import of the data takes 8+ hours to complete.
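The Dev-to-QA step we'd like to automate amounts to copying the backup files and confirming the copies are intact. A minimal sketch (hypothetical helper, not any FileMaker tool; the checksum verification is our assumption about what "verify" should mean here):

```python
import hashlib
import shutil
from pathlib import Path

def push_backup(src_file, dst_dir):
    """Copy a backup file to dst_dir and verify the copy byte-for-byte
    with an SHA-256 checksum, raising if the copy is corrupt.

    Illustrative only; paths and naming are placeholders.
    """
    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in 1 MB chunks so large backup files don't load into RAM.
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    src = Path(src_file)
    dst = Path(dst_dir) / src.name
    shutil.copy2(src, dst)
    if sha256(src) != sha256(dst):
        raise IOError(f"checksum mismatch copying {src.name}")
    return dst
```

Wrapping something like this in a scheduled task would give IT the one-button push for this first hop; the wipe-and-reimport step on Staging/Production is the part we still need a real answer for.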
I've reviewed Jonn Howell's DevCon 2015 session on this subject (thanks Jonn!). That approach looks good for smaller apps with less data, but pulling all of the container data back into the FM files each time would take more than 20 hours, which is unacceptable here.
I'm also checking into RefreshFM from the great folks at Goya, but I'm having trouble getting the demo to run properly.
Is anyone using RefreshFM for their migrations? Any feedback, other options, or suggestions?
Thanks in advance for your feedback!