Why not save a copy of the database once in a while and delete records in the working file? Even better, don't do this at all. There's really no reason to — FileMaker can handle huge data sets.
FileMaker can handle huge data sets, but certain functions, especially searches and sorts, can "bog down" with larger tables. While this is much less of an issue with current hardware and software, there can still be advantages to moving certain types of data into an archive file. A script can perform a find to pull up a found set of records to send to another file, then use Import Records to import the data from a table in one FileMaker file into an identically structured table in the other. If you set up an external data source reference to that archive table in your main file, this script can do the find and the import all from your main file.
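The archive pattern described above — find the old records, copy them into an identically structured archive table, then remove them from the working table — can be sketched generically. This is a minimal SQLite illustration in Python, not FileMaker code: the `orders`/`archive` tables and the cutoff date are invented for the example. In FileMaker itself the equivalent script steps would be Perform Find, Import Records (via the external data source), and Delete Found Records.

```python
import sqlite3

# In-memory database standing in for the working file and the archive file.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders  (id INTEGER PRIMARY KEY, placed_on TEXT, total REAL);
    CREATE TABLE archive (id INTEGER PRIMARY KEY, placed_on TEXT, total REAL);
    INSERT INTO orders VALUES
        (1, '2010-03-14', 19.99),
        (2, '2023-06-01', 45.00),
        (3, '2009-12-25', 12.50);
""")

CUTOFF = "2015-01-01"  # hypothetical "old record" threshold

# One transaction: copy the found set into the archive, then delete it
# from the working table, so a failure can't lose records halfway.
with con:
    con.execute(
        "INSERT INTO archive SELECT * FROM orders WHERE placed_on < ?",
        (CUTOFF,),
    )
    con.execute("DELETE FROM orders WHERE placed_on < ?", (CUTOFF,))

archived = con.execute("SELECT COUNT(*) FROM archive").fetchone()[0]
remaining = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(archived, remaining)  # → 2 1
```

The key design point carries over to the FileMaker script: do the copy first and only delete the records you have confirmed were imported, so the archive step is safe to re-run.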
But I agree with Rick that this is an option you may not actually need for your database, given the other options available today.
Dear Rick, dear PhilModJunk,
I thank you both for your helpful comments.
It appears that there is no "easy just click-it" solution to what I wanted to do.
The scripting doesn't sound all that simple to me.
Also, you may be correct that it isn't truly necessary to weed out old data sets.
I'll think about it some more and may ask you again sometime in the future.
The scripted solution can be exactly that simple for the user and isn't really all that complex to set up as a script either. In a solution of mine that does this, no one even has to click anything. A task scheduler controlled robot file performs this script for us every evening after close of business.
So you are implying that it may be easy enough for me to try to script that myself?
You have been very right in the past, so I will give it a try on one of the upcoming weekends :-)