The "Delete All Records" script step is misnamed. It really deletes all records in your current found set. Your script looks fine for what you want to do.
Some further thoughts:
Some developers never delete records but just mark them as deleted, as you have. That leaves the records present in the table, and they can be "undeleted" by changing the value in this field. The records will appear deleted to users because scripts, relationships, portal filters, etc., are set up to automatically hide the deleted records, keeping them out of found sets and portals.
Back up your files. Back up your files. Back up your files. Back up your files. Always have a system of automatic backups in place, keep many of these backups on file, and store them in more than one physical location. That way, you can use them to recover from your design mistakes--such as a script that deletes the wrong records or that modifies large numbers of records incorrectly--and also from user mistakes--such as editing a record incorrectly and not discovering the error until quite a bit of time has elapsed.
Right, I made sure it only deleted the found set before implementing it, but I wanted to make sure I wasn't missing something that would cause other records to be found and then deleted.
Not deleting records at all is a great idea. I guess there would just be so many useless records that I thought simply deleting them would be the cleanest approach.
As for backups ... the only thing I haven't implemented is off-site backups. I'm trying to come up with the best way to go about that. What I do have in place is a server that backs up to a RAID 1 unit that mirrors the data to two drives, so I'm fairly confident with that setup. Oh, and these backups are Time Machine backups, so I can go back and view files as they were before!
One solution I thought of would be to buy two more of these RAID 1 rigs: plug in offsitebackup1 on a Monday while offsitebackup2 sits at home, take offsitebackup1 home with me on Friday, then plug in offsitebackup2 when I get in on Monday morning. Haha, maybe a little overkill to have four drives showing the same thing at any given time?
What I do have in place is a server that backs up to a RAID 1 unit that mirrors the data to two drives, so I'm fairly confident with that setup.
That's excellent for most possible disaster scenarios but if your building burns down or a burglar steals the entire server....
Swapping out backup drives certainly meets the criteria for keeping data off site, but it would seem far simpler to set up some kind of "cloud" based backup where the data is regularly backed up via the Internet to another location--and there are companies that offer that kind of service.
these backups are Time Machine backups, so I can go back and view files as they were before!
Great! But make sure that your Time Machine backs up the backup copies from the backup folder. Backups made from FileMaker files that are open at the time the copy is made by third-party backup software can be damaged by the backup process. And file damage is not always immediately detectable.
We work out in the boonies, and the best upload speed I can get is between 0.3 Mbps and 0.7 Mbps. I always figured it wasn't worth looking into because of that, but I suppose after the initial big upload it wouldn't be so bad unless we made major changes.
As for copies of the backup folders, I've looked into this, and from what I gather Time Machine covers everything, including system files, etc., so I assumed everything else was safe on that front as well. I have heard of some products not backing up live databases, but again, if I understand correctly, it's like a snapshot of your computer at that one moment. Maybe have a look at this?
Yes, but when you use Time Machine to recover, the copies in the folder from which the server hosts the files will be copies backed up from open files and thus may be damaged. I'd discard the copies of the FileMaker files in the folder from which the server is hosting the files and replace them with copies of the most recent backup from the backup folder.
I have my active FileMaker files on one drive. Each time a file is closed, using a closing script, the file saves a copy of itself onto another drive. I set Time Machine to exclude the active files on the first drive and back up the files on the second drive. That way Time Machine never backs up a file when it's in use and I still get hourly backups of the copies on the other drive. Database files write to disk a lot. The danger is in Time Machine attempting to back up a file when a disk write is happening.
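In script-step form, such a closing script might look something like this minimal sketch (the drive and folder names are hypothetical placeholders; adjust the path and the file extension to your own setup and FileMaker version):

```
# Script set to run when the file closes (File Options > Script Triggers)
# "BackupDrive/FMBackups" is a hypothetical location on the second drive
Set Variable [ $path ; Value: "filemac:/BackupDrive/FMBackups/" & Get ( FileName ) & " " &
    Year ( Get ( CurrentDate ) ) & "-" & Right ( "0" & Month ( Get ( CurrentDate ) ) ; 2 ) & "-" &
    Right ( "0" & Day ( Get ( CurrentDate ) ) ; 2 ) & ".fmp12" ]
Save a Copy as [ "$path" ]
```

The date is built from the Year, Month, and Day functions rather than Get ( CurrentDate ) directly so the resulting file name doesn't contain slashes.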
So the damage can be done to a backup file or to the active file? We don't ever close our database, because it's shared and there's always someone working in it.
Last year, I specifically asked a tech support person from FileMaker about this, due to some confusing comments about this issue that had appeared here in the forum. His answer was:
Only the copies made from the open file might be damaged.
What I am pointing out is that you need to make sure that Time Machine does not use backup copies made from open files to "restore" your system--or, if it does, to then replace those copies with backup copies that were not made from open files.
OK, so I don't run the risk of damaging my active files when Time Machine kicks in. It's simply that I can't rely on a copy of an active file in a Time Machine backup should I need to pull up an old copy, correct?
If by "active file" you are referring to an open FileMaker file, you are correct. Since you can't rely on it, there's no point in backing up from it in the first place.
OK, sounds good. Thank you!
In light of this conversation, I'm wondering which scenario you would suggest. I'm now torn. One option is simply to get a copy of FileMaker Server and be done with it; that way I can back up all our live databases and not have to worry about workarounds. If I understand correctly, all I would need to do is exclude the folder that contains all my open databases from Time Machine, set FileMaker Server to back up on the hour to a backup location that is included in the Time Machine list, and I would be set?
The second idea that came to me, which could potentially bypass the 300/yr volume licensing fee we currently have in place, would be to create a second version of my live files with "_backup" added to the name. I would then have a scheduled task that opens the database, logs in, and then closes it, along with a closing script (set in the file's open/close script triggers) that performs a Save a Copy as. Could this work? I've never created a scheduled task on my Mac before, so I would be navigating uncharted territory; if someone knows of any tips, or where I could find tips online, that would be a big plus.
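For the scheduled-task part on a Mac, one common approach is a launchd agent. Here's a rough sketch of a property-list file (the label and file path are hypothetical placeholders) that would open the backup copy every night at 2 AM; FileMaker's own open/close script triggers would then take care of the Save a Copy as:

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Label and file path are hypothetical; adjust to your setup -->
    <key>Label</key>
    <string>com.example.fm-backup-robot</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/open</string>
        <string>/Users/Shared/Backups/MyDatabase_backup.fmp12</string>
    </array>
    <!-- Run daily at 2:00 AM -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>2</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>
```

You'd save it as something like ~/Library/LaunchAgents/com.example.fm-backup-robot.plist and load it with `launchctl load`.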
If that doesn't work, then I suppose a daily backup of the open file could be done in the middle of the night when no one is at work.
Thoughts and opinions much appreciated!
300/yr volume licensing fee we currently have in place
Small clarification: we do not have FileMaker Server as part of our volume licensing agreement. It kind of came off that way. We simply have a bunch of FileMaker Pro copies currently on volume licensing, which would make us eligible for the Server licensing should we need to go down that route.
Using FileMaker Server will be much simpler to set up, but more expensive, as you will need to purchase and install Server. You'll also want to review the system requirements for Server to make sure that you have a server computer set up that can handle it.
It will also provide other features that make it easier to work with and manage your hosted databases.
But a "robot file" run via an OS task scheduler and located on the host computer can use Save a Copy as to generate backup copies of your open files. And this does not require a second copy of your database files to implement.
To see an example of a script that uses Save A Copy As to generate back up copies see this thread: Saving Sequential Back Ups During Development
(You wouldn't need the Install OnTimer Script part of this, just the part that uses a variable and Save a Copy as to save a copy with a specified name.)
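As a rough sketch of how such a robot file's script might be structured (the file names, script name, and path below are all hypothetical), the robot performs a subscript in each hosted file, and that subscript saves a copy of its own file:

```
# In the robot file, set to run OnFirstWindowOpen (File Options > Script Triggers)
Perform Script [ "Save Backup Copy" from file: "Invoices" ]
Perform Script [ "Save Backup Copy" from file: "Contacts" ]
Exit Application

# "Save Backup Copy", a script defined in each data file:
Set Variable [ $path ; Value: "filemac:/Macintosh HD/FMBackups/" & Get ( FileName ) & " " &
    Year ( Get ( CurrentDate ) ) & "-" & Right ( "0" & Month ( Get ( CurrentDate ) ) ; 2 ) & "-" &
    Right ( "0" & Day ( Get ( CurrentDate ) ) ; 2 ) & ".fmp12" ]
Save a Copy as [ "$path" ]
```

Note that Save a Copy as only works on a file opened locally, which is why the robot has to run on the host computer rather than on a guest machine.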