1) GetValue ( YourTable::Containerfield ; 2 )
Under certain circumstances, "by reference" container fields can be treated as if they were text fields. If you want to see the full contents of the text in such a container field, define a calculation field with Text as the return type that uses a reference to the container field as its sole term, then place this field on a layout somewhere.
2) This step hinges on several key details. First, use the calculation from 1) in a Set Variable script step:
Set variable [$Path ; value: GetValue ( YourTable::Containerfield ; 2 ) ]
Then, when you click the "Specify data source" checkbox, type in the name of your variable.
3) The final key detail is to use Add File to add a reference to an actual copy of your file as the second file reference, producing something like this in the dialog box:
The second reference to an actual file on your development machine makes it possible to set up any needed field mapping and other import options even though at that time your $Path variable is empty.
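Putting the pieces above together, the import script might look like this sketch (table, field, and import-option names are placeholders; adjust to your own solution):

```
# Sketch: pull the file path out of the container field, then import from it
Set Variable [ $Path ; Value: GetValue ( YourTable::Containerfield ; 2 ) ]
Import Records [ With dialog: Off ; Source: "$Path" ; Add ; Windows ANSI ]
```

Because the second file reference points at a real copy on the development machine, the field mapping you set up there is what gets used at runtime when only $Path resolves.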
Thanks for the reply. This worked really well for me. I've got it set up now so that I can email the user a new version of the file with a few instructions such as:
A) Take the old file offline (if you have it hosted on FileMaker Server)
B) Open the old file and click "Prep for Update" (a script that does Show All Records in each Table)
C) Open the new file and click "Import Old Data" (this script prompts them to select the old file, then does all the importing, updating of Next Serial Values, and recreating of user accounts).
1) Is there any way to automate step B? For example, can I somehow use the $Path variable to open the old file and then run the "Prep for Update" script automatically on the old file from the new file as part of my "Import Old Data" script? This way the user would only need to select the old file and the system would do the rest.
2) I had to make sure that there was an account in both files with the same account name and password, or else the user was prompted to log in to the old file at each Import Records script step. Since this is a vertical market solution and each client will need a different admin account password, I decided to maintain a separate list of client admin passwords so that I could include each client's specific admin password in the new versions of the file that I send them. This is not ideal, because it means the client can't change their admin password without it no longer matching my list, but that seemed a better trade-off than them needing to log in 25 times when importing their old data. Any tips on this challenge?
3) I'm struggling a bit with trusting this import process to the users. I'm not sure how best to trap and prevent errors in the script such as if the user selects the wrong file for their old data, etc.
1) If the old file name is known, then you can use Perform Script to perform this script from the new file. The tricky part is to set things up so that the relative file path from the new file to the old file is predictable. Putting both files in the same folder is one option that can work for this. Using the $Path variable would be awesome, but it doesn't work for performing the script.
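Assuming the old file has been added as an external data source with a known name (here called "OldFile", a placeholder), the automation from inside the new file's "Import Old Data" script might be sketched as:

```
# Sketch, assuming "OldFile" is defined as an external data source
Open File [ "OldFile" ]
Perform Script [ "Prep for Update" from file: "OldFile" ]
# ...the Import Records steps from the new file follow here...
```

Opening the old file first also keeps it open through the imports, which avoids repeated password prompts (see point 2 below).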
2) The original file should be kept open during the import process; then the user should only be asked once for a password. Even with matching passwords, if the file is closed when this script runs, every Import Records step prompts for a password. This is truly unfortunate, because if we could leave the original file closed, we would not need the "Prep for Update" step. (Import Records from a closed file imports all the records from that table instead of just the found set.)
3) There are several things you can experiment with. One is to have your script check whether certain fields have any data after import. This can tell you if a completely wrong file was selected. You can also set up a utility table, hidden from the user, that stores a version number. You can import a record from this table first and then check this version number to confirm that they have imported the correct file. You can also set up a script in the original file, performed as part of "Prep for Update", that saves a copy of the original file under a different name. This can serve as a "disaster recovery" copy if the import goes completely awry. Then your script can even use Save a Copy as to overwrite the original file with a copy of the updated file once the import is complete. (This makes it possible to give the updated copy a different name so that both can be placed in the same folder.)
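The version-number check above might be sketched like this (table, field, and version values are hypothetical placeholders):

```
# Sketch: import the hidden utility record first, then verify the version
Import Records [ With dialog: Off ; Source: "$Path" ; Target: Utility ]
If [ Utility::VersionNumber ≠ "2.0" ]
    Show Custom Dialog [ "Wrong file selected. Please choose your old data file and try again." ]
    Exit Script [ ]
End If
# ...proceed with the remaining Import Records steps...
```

Exiting before any data tables are touched means a wrong file selection costs the user nothing but a retry.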
It's also an option to use an installer utility to move and rename files as part of the deployed update.
And using a data separation model can avoid the need for data imports entirely, except where you have to modify the design of a table or a data-level relationship.
Great tips. Thanks again.