To do that, you need a solution built with the separation model, where one file holds the data and a separate GUI file holds the interface.
Then you can replace the GUI file that contains the layouts and logic very quickly and leave your data file exactly as it is.
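To illustrate what the upgrade then reduces to, here is a minimal sketch in Python (file names and server paths are hypothetical, and the hosted files would have to be closed first): the GUI file is swapped out and the data file is never touched.

import shutil
from pathlib import Path

# Hypothetical locations; adjust to where the hosted files actually live.
LIVE_DIR = Path("/opt/FileMaker/FileMaker Server/Data/Databases")
NEW_UI_FILE = Path("/tmp/releases/MyApp_UI_v2.fmp12")

def deploy_ui_only() -> None:
    """Swap in the new GUI file; the data file stays exactly as it is."""
    ui_target = LIVE_DIR / "MyApp_UI.fmp12"
    data_file = LIVE_DIR / "MyApp_Data.fmp12"

    # The data file must already be in place and is never written to.
    assert data_file.exists(), "data file is missing"

    # Keep a copy of the old GUI file before overwriting it.
    backup = ui_target.with_name(ui_target.name + ".bak")
    shutil.copy2(ui_target, backup)

    # Replace only the GUI file.
    shutil.copy2(NEW_UI_FILE, ui_target)

deploy_ui_only()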
Most of the time, I write my own data migration scripts. I am planning to try MirrorSync or RefreshFM, but so far I have not had a system that would justify using them.
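For a rough idea of what such a script can look like, here is a sketch that copies records table by table over ODBC. It assumes the FileMaker ODBC driver is enabled and that DSNs exist for the old file and an empty clone of the new one; the DSNs, credentials, and table names are all hypothetical, and a real migration also has to handle serial numbers, container fields, and import order.

import pyodbc

# Hypothetical DSNs for the old data source and the new, empty clone.
OLD = pyodbc.connect("DSN=MyApp_v1;UID=admin;PWD=secret")
NEW = pyodbc.connect("DSN=MyApp_v2;UID=admin;PWD=secret")

# Parents before children so that related records keep their keys.
TABLES = ["Customers", "Invoices", "InvoiceLines"]

def migrate(table: str) -> int:
    """Copy every record of one table from the old file into the new one."""
    src, dst = OLD.cursor(), NEW.cursor()
    rows = src.execute(f"SELECT * FROM {table}").fetchall()
    columns = [c[0] for c in src.description]
    placeholders = ", ".join("?" for _ in columns)
    insert = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
    for row in rows:
        dst.execute(insert, tuple(row))
    NEW.commit()
    return len(rows)

for t in TABLES:
    print(f"{t}: {migrate(t)} records copied")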
The downtime depends on how much data you need to transfer to the new version.
I tend to do updates during quiet periods, e.g. lunchtime. A weekend or overnight window is another good option.
Another option that works well for us for many ongoing maintenance updates:
We set up a section in the database itself where members of our team log "data modifications". In this table we list the file, the table, and the changes to be made, in a format that lets copy and paste do as much of the work as possible.
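For example, each entry in such a log might carry fields along these lines (a hypothetical sketch in Python, not the actual schema; field names are illustrative only):

from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataMod:
    """One row in a hypothetical "data modifications" log table."""
    file: str           # which hosted file the change applies to
    table: str          # which table within that file
    change: str         # the change itself, worded so it can be copy-pasted
    requested_by: str   # team member who logged the modification
    logged_on: date = field(default_factory=date.today)
    applied: bool = False  # set during the weekly downtime window

mod = DataMod(
    file="MyApp_Data.fmp12",
    table="Invoices",
    change="Add field DiscountRate (Number), default 0",
    requested_by="JD",
)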
We first develop and (most importantly) test our updates on offline copies taken from a recent backup of our system, and on a "development server" that we have. When we are ready to go live, we log what we want in our "data mods" system.
During a weekly scheduled downtime that rarely lasts more than an hour, and is often much shorter, one member of our team (usually me) closes the hosted files, opens them in FileMaker, and puts in the mods before bringing the system back up. This avoids the need to import data and keeps downtime to something our users can live with.