There's another method also worth considering as it can be simpler to implement:
Save a clone of your file (Save a Copy As…, choosing the "clone (no records)" option).
If you have not already specified these validation options, open the clone and set "Unique values" and "Validate always" on the key field that identifies duplicate records. Since you have already taken steps to prevent future duplicates, I assume you already know how to do this.
Then import all your records from the table containing duplicates into the clone. During the import, the duplicate records will be filtered out by the validation. You can then either import all data from the other tables into this clone, or delete all records from a copy of your original file and import this cleaned-up copy of your data back into the file from which it came.
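To see why the import filters duplicates, here is a small Python sketch of the same idea (an analogy only, not FileMaker script steps): a table whose key field enforces "Unique values" rejects any later record carrying a key value that already exists, so only the first record per key survives the import. The field name `customer_id` is hypothetical.

```python
def import_unique(records, key):
    """Keep only the first record for each key value, mimicking how
    FileMaker's 'Unique values, Validate always' setting rejects
    duplicate records during an import."""
    seen = set()
    cleaned = []
    for rec in records:
        if rec[key] in seen:
            continue  # validation rejects the duplicate; it is not imported
        seen.add(rec[key])
        cleaned.append(rec)
    return cleaned

# Example: the third record repeats customer_id 1 and is filtered out.
records = [
    {"customer_id": 1, "name": "Ann"},
    {"customer_id": 2, "name": "Bob"},
    {"customer_id": 1, "name": "Ann (duplicate)"},
]
cleaned = import_unique(records, "customer_id")
print(cleaned)  # only customer_id 1 and 2 remain
```

The same logic is why the order of records in the source table matters: for each duplicated key, the record that arrives first is the one that is kept.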
Both of these were very helpful!
Try DuplicateFilesDeleter (search for it on the web); it also helps me delete duplicates effortlessly.
Mark, the OP is not deleting duplicate files, but duplicate RECORDS in a database.
I tried the above script, but in FileMaker Pro 12 it doesn't look exactly like the one above.
I get this error message when I run it.
"The previous script step, “Replace Field Contents”, could not be completed because of an error. Do you wish to continue with this script?"
I'm attaching a screenshot of how the script I made looks.
Also, how do I change the settings mentioned to prevent loading future duplicates?