You are replicating data across potentially three tables: the original file and the two other tables. That sounds like a warning bell to re-think the design. It is unusual for the same (or even slightly different) data to live in several places without related tables being a better solution.
But you can import the temporary records directly into the second table. Set the key field to validate Always and be Unique, and when you import all the records FileMaker will seamlessly reject any that match an existing key field value.
For the first table, I think you should be importing by 'Matching records'. That would probably save the find-copy-paste steps.
Thanks for the reply.
I think I can skip the first temporary table, but this is not the part that takes a lot of time.
I have two tables because the description of the records I am importing gets edited and saved in a different table, so I have to replicate each new record's key and default description in the secondary table.
The main table can have many customers with the same articles; the second table has one description per article and provides and preserves the correct description of each article.
So I have to set all the fields in the main table and add any missing articles to the support table when needed, which means checking the existence of each record.
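To make the slow part concrete: the record-by-record existence check amounts to a loop like this sketch (Python as an illustration, not FileMaker script steps; `article_id` and `description` are hypothetical field names standing in for my real ones):

```python
# Sketch (not FileMaker code) of the record-by-record existence check
# described above; "article_id" and "description" are hypothetical fields.

def sync_support_table(main_records, support_table, key="article_id"):
    """Add an entry to the support table for every article in the
    main table that the support table does not know about yet."""
    known = {rec[key] for rec in support_table}
    for rec in main_records:
        if rec[key] not in known:  # article missing from support table
            support_table.append({key: rec[key],
                                  "description": rec["description"]})
            known.add(rec[key])
    return support_table
```

Doing this one record at a time with Perform Find in a script loop is what eats the time.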
Take a look at setting up an Import Records step with the "Update matching records in found set | Add remaining data as new records" options specified. This may allow you to do this without a looping script.
I have FM11 Pro Adv, can I do that?
The Import Records step is available in my scripts, but I don't know exactly where to find or set the other options you have listed.
Could you give me some more details?
This can be done in FileMaker 11. These are options you can select when you go to that step in your script and open the field mapping dialog.
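In case it helps to see the behavior spelled out, the "Update matching records in found set | Add remaining data as new records" import behaves roughly like the upsert sketched below (a Python illustration with a hypothetical `article_id` match field, not actual FileMaker code):

```python
# Illustration (not FileMaker code) of the "Update matching records in
# found set | Add remaining data as new records" import semantics.
# "article_id" is a hypothetical match field.

def upsert_import(found_set, incoming, match_field="article_id"):
    by_key = {rec[match_field]: rec for rec in found_set}
    for row in incoming:
        key = row[match_field]
        if key in by_key:
            by_key[key].update(row)   # update the matching record
        else:
            new_rec = dict(row)       # add remaining data as a new record
            found_set.append(new_rec)
            by_key[key] = new_rec
    return found_set
```

One FileMaker-specific caveat: only records in the current found set are candidates for matching, so make sure the found set shows all the records you want updated before the import runs.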
Thank you! I will check these options too, then!
Another trick worth evaluating: set up a table with a field that will contain duplicate values in the data to be imported, but for which you only want one record per unique value, and give that field the "Unique value | Validate Always" validation options.
Then, when you import into this field, records that match existing values in the table are automatically filtered out, and you get one record for each unique value.
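The net effect of that validation trick can be sketched like this (a Python illustration with a made-up `article_id` key field; FileMaker performs this filtering for you during the import):

```python
# Illustration (not FileMaker code) of importing into a field that
# validates "Unique value | Validate Always": duplicate keys are
# rejected, so one record survives per unique value.

def import_with_unique_validation(table, incoming, key="article_id"):
    existing = {rec[key] for rec in table}
    skipped = 0
    for row in incoming:
        if row[key] in existing:
            skipped += 1              # duplicate rejected by validation
        else:
            table.append(dict(row))
            existing.add(row[key])
    return table, skipped
```

The import summary dialog reports the skipped rows as records that were not imported due to errors, which is expected here.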