If you specify the field options "unique values" and "validate always" on your email address field, data with duplicate email addresses will be automatically omitted during the import.
The same method can be used to remove duplicates from a FileMaker table if you import the data into a table with this option specified. You can then purge the records from the original table and import the cleaned-up data back.
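To make the mechanics concrete, here's a minimal sketch in Python (not FileMaker) of what that "unique values" validation effectively does during an import: each incoming row is checked against the emails already seen, and duplicates are silently skipped. The field name "email" and the sample rows are hypothetical.

```python
def import_unique(rows, existing=None):
    """Keep only the first row per email address, skipping duplicates."""
    seen = set(existing or [])
    imported = []
    for row in rows:
        key = row["email"].strip().lower()  # normalize before comparing
        if key in seen:
            continue  # duplicate: omitted, like a failed unique-value validation
        seen.add(key)
        imported.append(row)
    return imported

rows = [
    {"email": "ann@example.com"},
    {"email": "Ann@example.com"},   # duplicate (case-insensitive)
    {"email": "bob@example.com"},
]
print(import_unique(rows))  # keeps only the ann@ and bob@ rows
```

Passing the emails already in your table as `existing` mirrors the second case above: importing into a table that already enforces uniqueness.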
There is also a script example in the Knowledge Base (See link at top of this screen) that marks duplicate records so that you can find and delete the marked records.
Your last question looks like a case where you should import that list into a different table and then define a relationship to match the records--by email address, I would guess. You can set up a recurring import that re-imports this data every time the spreadsheet is updated.
Another option might be to set up an ODBC link to that spreadsheet.
Thanks so much Philmodjunk...
I did the first part of establishing unique values and validating always. That should help omit the dupes going forward.
I'm a little confused with the second Q tho...I would like to match a spreadsheet of unsubscribes to the main DB (match records by email address) and then auto-mark my field "Unsubscribe" with an "x"...is it possible to do that thru relationships? I can set up the new table and import into it, just confused on doing the recurring import part and how to auto-mark the field...thx again!
That automark could be an unstored calculation field that simply checks whether a related record exists in the Unsubscribe table:

If ( Not IsEmpty ( Unsubscribe::EmailAddress ) ; "X" )

(Note the Not: you want the "X" when a matching unsubscribe record *does* exist, so the related field is not empty.)
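The relationship-based automark boils down to a set-membership test: a record gets marked when its email address has a match in the unsubscribe table. Here's a minimal sketch in Python (not FileMaker) of that logic; the field names "email" and "unsubscribe" are hypothetical.

```python
def mark_unsubscribes(records, unsubscribe_emails):
    """Mark each record whose email appears in the unsubscribe list."""
    unsub = {e.strip().lower() for e in unsubscribe_emails}  # normalize
    for rec in records:
        if rec["email"].strip().lower() in unsub:
            rec["unsubscribe"] = "x"   # a matching unsubscribe record exists
        else:
            rec["unsubscribe"] = ""    # no match: leave the mark blank
    return records
```

In FileMaker the "membership test" is the relationship itself, and the unstored calculation re-evaluates automatically whenever the Unsubscribe table changes, so nothing needs to loop over records.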
When you select Import Records | File from the File menu, note the check box "set up as recurring import". This option generates a script for doing the same import over and over again. It's not always the best option and the script it produces can be modified to fine tune things, but it's worth a closer look. There's a Help file entry labeled "Setting up recurring imports" that you might want to read through.
awesome thx again...this is a big help!