4 Replies Latest reply on Jan 30, 2012 6:41 AM by Padster

    Stop Duplicates from Importing






      Please forgive me, as I'm a newbie to FileMaker and am trying to figure things out as I go.

      I'm having a bit of a problem with an import process where I'm importing one flat file into multiple tables. For a quick breakdown, the file contains Customer Records, Customer Addresses and Customer Contacts. I would like to run this as a single import, but first I'm trying to eliminate the problems I'm having with importing directly into the Address table (this is after I have done the standalone import into the Customer table, which creates the first UID for the customer). The Address table must relate back to the Customer table via the Customer FK.

      On the Address table I have built in validation that stops you from manually entering an address (based on first line and postcode) that already exists. This is achieved through a self-join relationship on the Customer ID FK, Address Line 1 and Postcode, plus a validation in a script on Submit that checks Count ( Address 2::Address Line 1 ) > 1.
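
      A minimal sketch of that Submit check as script steps (the table occurrence and field names are taken from the post; the dialog text and exact steps are my assumption):

          # Self-join "Address 2" matches on the Customer ID FK, Address Line 1
          # and Postcode, so a count above 1 means this address already exists
          # for this customer (the count includes the record being entered).
          If [ Count ( Address 2::Address Line 1 ) > 1 ]
              Show Custom Dialog [ "This address already exists for this customer." ]
              Revert Record/Request [ No dialog ]
          Else
              Commit Records/Requests [ No dialog ]
          End If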

      In my naivety, I expected the import process to be able to use the same validation as manual entry if I added the above Count() to the 'Validate by Calculation' option on either the Address Line 1 or Postcode field in Manage Database.

      On both occasions the import bottomed out with zero records imported.

      Is there something that I might be missing or doing wrong? Any help is greatly appreciated.



        • 1. Re: Stop Duplicates from Importing

          Try this method:

          Define a new field in your Addresses table and give it this auto-enter calculation:

          FirstLine & " " & Postcode

          Then set a unique-values validation rule on this field.

          Now you should be able to import addresses into this field but have the records with duplicate first line plus postcode values omitted from the import.

          • 2. Re: Stop Duplicates from Importing

            Hey PhilModjunk,

            Thank you for the response.

            I have had a look at what you suggested, and it works well for the import. I did have to add the CustomerID to the concatenation calculation, as my validation needs to stop one customer from having the same address twice while still allowing different customers to share the same address.
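
            For reference, the adjusted auto-enter calculation would look something like this (field names assumed from the thread):

                CustomerID & " " & FirstLine & " " & Postcode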

            Though this solves the import problem, it creates a new problem for the manual input process I had built. I had put in place a whole set of variable custom dialogs for potential duplicate addresses, e.g. 'This address already exists for this customer..', 'Do you wish to change or delete..', etc. All of this is now skipped by the default 'Unique Record' / 'Do you want to revert' dialog.

            Is there a way to script an import process that turns the Unique validation on for the field and then turns it off when completed, or to set up custom dialogs to replace the system-generated unique-record windows?

            Again, thank you for your help.


            • 3. Re: Stop Duplicates from Importing

              It'd be nice if a script could change validation rules, but I can also see major technical challenges to carrying out such changes, since they modify the schema, and in a shared database that can be dangerous.

              What you can do is modify your interface on the data entry side. If you are using FileMaker 11, you can use the OnObjectValidate trigger to catch the duplication and deal with it before the Unique values validation can trip and pop up a validation error message.
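
              Roughly, a script attached to those fields via the OnObjectValidate trigger could look like this (a sketch only; table and field names are from the thread, and the dialog text is assumed):

                  # Runs before field validation; exiting with a False result
                  # cancels the event, so the Unique validation never trips.
                  If [ Count ( Address 2::Address Line 1 ) > 1 ]
                      Show Custom Dialog [ "Duplicate address" ; "This address already exists for this customer. Do you wish to change or delete it?" ]
                      Revert Record/Request [ No dialog ]
                      Exit Script [ Result: False ]
                  End If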

              You could also do a two-stage import to filter out the duplicates: import into a temporary table that has the unique-values setting to filter out duplicates, then import again to move the data into your current table, leaving all field validations on that table as you originally set them.
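
              As script steps, the two-stage import might be sketched like this (the table and file names here are hypothetical):

                  # Stage 1: import into a temp table whose combined
                  # CustomerID + address field is set to Unique / Validate Always,
                  # so duplicate rows are dropped at this stage.
                  Import Records [ "customers.csv" ; Target: Temp Addresses ]
                  # Stage 2: re-import from the temp table into the real
                  # Addresses table, which keeps its original validation rules.
                  Import Records [ Table: Temp Addresses ; Target: Addresses ]
                  # Clean up the staging table for the next run.
                  Delete All Records [ Temp Addresses ; No dialog ]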

              • 4. Re: Stop Duplicates from Importing


                I have now managed to get this working; the OnObjectValidate approach worked in this instance. I also incorporated the mass import so that the information is entered into the correct tables.

                Now just to wait for the next hiccup..

                Cheers, Pad