
    Duplicate Values

    brianj77

      Hi

      This question seems to have been asked a lot in some fashion, but I can't seem to find an answer to my specific variation on it.

      There are many posts and discussions on finding and identifying duplicate values: that part is pretty straightforward, with lots of options and solutions.

      Finding the records with duplicate values is fine, but managing them seems to be something different. I can filter to the duplicate values, but I want to delete the 'spare' copies while retaining one copy of each record. All the solutions I've found seem to assume that I want to delete every duplicated record.

      If I have 3 copies of a record, I want to delete 2 but still keep one. Am I being dumb and missing something? (Very possible and, indeed, likely.)

      I cannot find a solution to this. Has anyone got ideas about how to solve it?

      Thanks in advance for taking the time to read and respond.

      Brian

        • 1. Re: Duplicate Values
          dtcgnet

          A little of this depends on how many duplicates you're dealing with, how they were created, and things along those lines. If it's thousands, you'll want a programmatic solution. You could, for example, use a self-join relationship to number or flag the duplicates, as sketched below.
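          For the self-join approach, here's a rough sketch (the table and field names are just placeholders for illustration; swap in your own). Add a second occurrence of your table, say Contacts_byEmail, and relate it to Contacts with emailaddress = emailaddress. An unsorted relationship always points at the first matching record in creation order, so a calculation field can flag every record after that first one:

          DupeFlag = If ( ContactID = Contacts_byEmail::ContactID ; "Original" ; "Duplicate" )

          where ContactID is a unique serial number field. Perform a find for "Duplicate" in DupeFlag and the found set is exactly the spare copies: delete the found set and each email address keeps its first record.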

          If it's a fairly small number of duplicates, it's often easy enough to find them, sort the found set so that the duplicated values sit together, and then just go through and delete the extras manually.

          So...how many are you dealing with?

          • 2. Re: Duplicate Values
            brianj77

            Hi

            Thanks for replying. I have a small example and a medium-sized example.

            1. A record set of about 400, with 25% of the records having two or more copies of the same data (one field: email address)

            2. A record set of 40,000, with 25% of the records having anywhere between 2 and 15 copies of the same data (one field: email address)

            Origin of the data:

            Example 1 came from merging older lists, but the merge was badly executed in a spreadsheet by someone who didn't really understand even the basics of managing data.

            Example 2 came from a business where several people in one department were using a Sales/CRM system without a strict enough data entry policy. As usual, we inherited an exported spreadsheet on the assumption that we could just fix it with a magic 'fix' button.

            I should also say that my FileMaker skills are not that advanced. I'd call myself an 'advanced beginner', and the time I spend developing the little I do in FileMaker is very sporadic, so the 'programmatic' answers (i.e. anything beyond the simplest of scripts) tend to be a challenge for me.

            Thanks

            • 3. Re: Duplicate Values
              dtcgnet

              In cases like this, "keep it simple" probably applies best. There are all sorts of ways to do this, but given what you've said, I'd probably use a scripted method like the one below. Again...there are other, faster ways, for sure.

              I'd use the "!" operator to find the duplicates in the email address field: just go into Find mode, type an exclamation point into the field, and perform the find. Then sort the found set by email address and run a script that does the deletions.

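              If you'd rather script that part too, the find and sort are only a few steps. Something like this should do it (assuming the field is Contacts::emailaddress; use your real table and field names, and a sort order saved on the email field):

              Enter Find Mode [ Pause: Off ]
              # "!" is the find-duplicates operator
              Set Field [ Contacts::emailaddress ; "!" ]
              Perform Find []
              # Restore a sort order saved on emailaddress, ascending
              Sort Records [ Restore ; With dialog: Off ]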


              TEST IT ON A BACKUP COPY OF YOUR FILE FIRST.


              Your script could look something like this (it assumes the found set is already sorted by email address, so duplicates sit next to each other):

              Go to Record/Request/Page [ First ]
              Loop
                 # Compare this record's email address to the next record's
                 If [ emailaddress = GetNthRecord ( emailaddress ; Get ( RecordNumber ) + 1 ) ]
                    # Deleting pulls the next record up into this position, so don't move the pointer
                    Delete Record/Request [ With dialog: Off ]
                 Else
                    Go to Record/Request/Page [ Next ; Exit after last ]
                 End If
              End Loop
              Exit Script []

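              One note on why the loop ends: on the last record, GetNthRecord is asked for a record past the end of the found set, so it returns an error result rather than an email address. The If then evaluates to false, and the Go to Record [ Next ; Exit after last ] step drops you out of the loop.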

              • 4. Re: Duplicate Values
                brianj77

                Hi

                 My apologies for the late reply; I got caught up in another mission.

                This worked a treat for me.

                Thanks for your help

                Brian