I suggest taking a closer look at what is going on. The found sets produced by another user are not visible to a second user opening the database at the same time so what you describe does not seem possible.
But a found set on a table produced by User A will be the records imported by User A...
The files are separate databases.
The files are stored on a network drive, but are opened locally by each user.
All files are accessible to all users on a shared network drive, but people are in charge of their own database file.
Each file can only be opened by one user at a time (we aren't using any form of file sharing and aren't running FMPro Server).
Am I only seeing the error because I am the one accessing both databases at the same time, then?
i.e. If I open User A's file and do a search, keep it open, then open User B's file and run the import-all-records script (the one above), it pulls in all records from every database that is currently closed, but only my current found set from User A's database. Is that because I am the one with both files open?
OK, I've sussed it out. If User A has file A open, and User B has file B open and tries to run the import script, the script errors because it is unable to access (and fails to import the records from) User A's database. I wasn't seeing this error because I was the one opening both files locally; instead, the script ran fully but only imported a subset of records.
So this has solved one problem, but created another.
Perhaps one solution is for the script to create a temporary duplicate of each database (assuming this can be done even while they are open) and pull in records from that temporary location.
Q: Is there a FM script command I can use to duplicate (and rename) files?
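Looking through the script steps list myself, the closest I can find is "Save a Copy as", but it appears to save a copy of the file the script is currently running in, not an arbitrary closed file. So each user's file would need its own small script, something like (file names here are just placeholders):

```
# In each user's file - a script the central import script would trigger:
Save a Copy as [ "Temp/FileA_copy.fmp12" ; copy of current file ]
```

That only helps if the file can actually be opened to run the script, though, which seems to bring back the same locking problem.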
These files should not be accessible directly from a shared directory. According to FileMaker Tech Support, this can result in damaged files if two users try to open the file at the same time.
You should enable sharing for your files and then open them on the computer where they are located with either FileMaker Pro or FileMaker Server. Others should then access the DB via Open Remote so that more than one user can access the same file at the same time.
Having your data distributed amongst multiple copies of the same file is a recipe for major migraine headaches when it comes to working with the data. Sometimes it's unavoidable, but in those cases the data should be regularly synced back to a central file where you can perform your searches etc. without having to use your current method. Syncing the data requires either building your own import system or using a third-party sync tool such as those provided by 360Works and SeedCode. They were intended for use with iOS devices, but as far as I know they can be used with FileMaker Pro as well.
Thanks for your feedback, Phil. The files are all located on a central University filestore that all users have access to, and we are unable to run any files directly on that system. The filestore itself prevents two users from opening a file at the same time, so corruption should be prevented. I am aware of the potential issues, but since we cannot host the files on a central computer that all users can reach, we don't have another option (i.e. we don't have the ability to file-share between local computers).
Note: each file contains unique data - nothing is duplicated - the files simply share the same format. So to search all databases, we just need to consolidate the records temporarily into one place.
I'm sure there is a better solution, but I am not an FM expert at all!
Sorry, but according to TS people that I've previously discussed this with, you still have an increased chance of ending up with a corrupted file.
If your infrastructure does not permit hosting the file, you may want to contract with a service that will remotely host the files for you in exchange for a monthly fee.
For better data integrity, you really need to host the file and even with unique data in each copy, life will be much better if it is all entered into the same copy of the database.
With your current structure, I would enter the search criteria into global fields and then use a script to perform the same find in each file copy. You can then import only the records that match the criteria instead of importing all records and doing the find afterwards.
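As a rough sketch (table, field, and file names are placeholders, and exact step names vary a little between FileMaker versions):

```
# In each source file - a "Find Matches" script that reads the criteria
# the central file placed in a global field and isolates matching records:
Enter Find Mode [ Pause: Off ]
Set Field [ Records::Status ; Records::gSearchStatus ]
Perform Find []

# In the central file, repeated for each source file:
Perform Script [ "Find Matches" from file: "FileA" ]
Import Records [ "FileA.fmp12" ; Add ]
# An open FileMaker source contributes only its current found set,
# so only the matching records come across.
```

This relies on the behavior discussed earlier in the thread: importing from an open FileMaker file imports that file's found set rather than all of its records.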
But you'll still find that other users are blocking access to files when they have it open...