Thanks, PhilModjunk. Did not know that!
So, to ask this a different way, would data separation be the best way to go about ongoing development such as mine?
I would recommend that you do not do development on a live database. You are asking for trouble. Small tweaks to a layout are one thing, but a typo or a bug in a script can mess up your data file in a hurry.
I would suggest making backup copies, making your changes to one of the backups, and testing there. Then install a new blank copy as the new live copy and import your data. It's a few more steps, but if something bad happens, you could work for days trying to fix misplaced data.
Data separation enables you to deploy new interface files simply by swapping out the old file for the new. There can be technical issues that arise specifically from using data separation--such as how you manage accounts and passwords in two files rather than one--but generally speaking it simplifies some update tasks.
But changes to your data file still require deploying a clone of the new file with a script set up to import all records from your old data file into the new.
You may also want to include an encrypted file with a table of all users and passwords with scripted setup for creating/changing passwords and selecting privilege sets. That way, you can import account information into a new file or just have a script loop through the records in this table to recreate accounts in the newly deployed interface or data file.
Thanks again. I'll have to do some more research and reading on this topic, I guess.
I am not sure what you mean by this statement:
"But changes to your data file still require deploying a clone of the new file with a script set up to import all records from your old data file into the new."
What do you mean by changes to the data file? That seems to contradict the idea of separating data and structure and the ability to just swap the structure file. Are you saying that I would still need to export and import every time I deploy a new structure file? Or does this statement mean that if I did not do the separation, then I would need to do this?
If you make a change to the data file--i.e., add a field, change a calculation, etc.--then the OLD DATA has to be imported into the NEW FILE. In the data-separation model, changes to the interface file are a simple matter of replacing the old file with the new.
Much depends on your particular setup. Many of us are responsible for a single hosted database solution; others have to support multiple clients, all using a copy of a single solution design. I currently work on a team that supports such a single hosted solution.
Single hosted solution:
We maintain two hosted copies of our solution, each on a different server. We develop and test a design change on one copy that is not accessible to our user base. When we are satisfied that the design change works, we document the data-level changes needed (the stuff that requires opening Manage | Database). Then, late at night and with prior warning to the user base, we use the Server Admin Console to close the file, open it with FileMaker Advanced, and make the "data mods" right in the copy that we host, but while it is not actually open and hosted. We then quit FileMaker Advanced and re-open the file on the server, making script and layout design updates in the live database. I freely admit that this still has a bit of risk to it, but we make a lot of backups and this has been a workable compromise for us.
Multiple clients with different copies:
In this scenario, you will be sending out new copies of your file to your clients each time you complete work on an update that you need them to have. With data separation, you can send them just a UI file, and they only need to swap files to update their solution. But if you need to send them a new data file due to design changes to the data file, then you'll need to send them an empty copy of the database with a scripted update process that imports all of the data from their current copy into the new one. The same script will need to update next serial value settings if you use serial number fields for your primary keys. If you use UUIDs, you will not need to do that part.
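As a rough sketch of what that scripted update process might look like for one table (table and field names here are hypothetical, and you would repeat the pattern for each table in your solution):

```
# Hypothetical update script run in the new, empty clone
Import Records [ With dialog: Off ; "OldSolution.fmp12" ; Target: Customers ; Add ; Matching names ]
# Reset the serial counter so newly created records don't collide with imported IDs
Set Variable [ $nextID ; Value: ExecuteSQL ( "SELECT MAX(CustomerID) FROM Customers" ; "" ; "" ) + 1 ]
Set Next Serial Value [ Customers::CustomerID ; $nextID ]
```

The key point is the Set Next Serial Value step at the end; without it, the first record a user creates after the update can duplicate an existing serial-number primary key.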
The added complication here is that you can't import accounts from Manage | Security from the old file into the new. Thus, in this situation, you might want to set up a table that has a record for each account name, password and privilege set so that you can import this data into your new solution and then a script can recreate these accounts in the new copy of the file. (unfortunately, this doesn't work for externally authenticated accounts.)
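A minimal sketch of such an account-recreation script, assuming you have imported an Accounts table with Name, Password, and PrivSet fields (all hypothetical names), might look like this:

```
# Hypothetical script: recreate accounts from the imported Accounts table
Go to Layout [ "Accounts" ]
Go to Record/Request/Page [ First ]
Loop
  # Note: in many FileMaker versions the privilege set in Add Account must be
  # chosen from a fixed list rather than a calculation, so you may need an
  # If/Else If branch per privilege set instead of the single step shown here.
  Add Account [ Account Name: Accounts::Name ; Password: Accounts::Password ; Privilege Set: "Data Entry Only" ]
  Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
```

You would likely also want to delete or blank out the stored passwords once the accounts are created, since keeping them in a table is a security exposure even in an encrypted file.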
I consider a live database to be one that users are currently entering data into. In your example, no new data is being entered, so a backup would put you back where you started. For a company with 50 to 100 clients' data being entered, a mishap with a script could create a lot of extra work. I'm sure your FM expertise level is higher than most users on this forum too, so I'm sure you make fewer mistakes scripting. Sometimes I script too much at night without enough sleep and have made some stupid mistakes in scripts. Other times mistakes just happen.
Backups can save the day.
All valid concerns, and it depends on the script, your users' workflows, and how you edit the script. I usually disable the old code but leave it in place, with the new script steps just above it and comments that identify each. In case of unexpected problems, I can then rapidly revert the script by disabling the new and enabling the old. And keep in mind that the script change has been previously tested on the development copy, so we have a pretty high confidence level that we aren't going to "break" something with the change.
That still leaves the possibility of catching a user "between stools" with an unexpected script or layout change, so staying in good communication with your users and modifying the script at the correct point in time are also important.