9 Replies Latest reply on Oct 28, 2016 10:03 AM by BruceHerbach

    Transfer data between tables efficiency

    ChrisCourtney-Stones

      Hi all,

      When moving data between tables, which is the most efficient method: Set Variable, Set Field, or Lookup? For record creation via script it is a heck of a lot quicker to just look up the data, but what are the performance issues on mobile devices or when remotely connected?

        • 1. Re: Transfer data between tables efficiency
          Johan Hedman

          Lookups are a technique that comes from older versions of FM. I use an auto-enter calculated value instead now, because then I can use a formula to set my field. Set Variable is good if you need to create a lot of records in a loop.
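          For example, a bare-bones Set Variable loop for copying records between tables might look like the sketch below. The table occurrences, layouts, and field names ("Source", "Target", Name, Price) are hypothetical placeholders, not from the original post:

          ```
          # Sketch: copy fields from each record in the Source found set
          # into a new Target record. Assumes layouts named "Source" and
          # "Target" based on those table occurrences.
          Go to Layout [ "Source" ]
          Go to Record/Request/Page [ First ]
          Loop
             Set Variable [ $name ; Value: Source::Name ]
             Set Variable [ $price ; Value: Source::Price ]
             Go to Layout [ "Target" ]
             New Record/Request
             Set Field [ Target::Name ; $name ]
             Set Field [ Target::Price ; $price ]
             Go to Layout [ "Source" ]
             Go to Record/Request/Page [ Next ; Exit after last: On ]
          End Loop
          ```

          The variables survive the layout switch, which is what makes this pattern work without a relationship between the two tables.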

          • 2. Re: Transfer data between tables efficiency
            coherentkris

            The most common reasons for moving data around are refactoring, creating reporting datasets, and denormalization.

            If you are refactoring, it is a one-time thing, and export to file / import from file can be very effective.

            Creating reporting data sets is best done with scripting and can be low-impact when PSoS (Perform Script on Server) or server-scheduled scripts are used.

            Be very judicious with denormalizing data as a standard practice as every calculation adds a cumulative smidgen of processing overhead to the solution.

            1 of 1 people found this helpful
            • 3. Re: Transfer data between tables efficiency
              Mike_Mitchell

              coherentkris wrote:

               

               

              Be very judicious with denormalizing data as a standard practice as every calculation adds a cumulative smidgen of processing overhead to the solution.

               

              Agreed. Unless you're having performance issues related to sorting or finding on an unstored calculation based on a related field, copying data between tables isn't usually necessary or beneficial.

               

              What is the workflow here that needs the data to be copied?

              • 4. Re: Transfer data between tables efficiency
                philmodjunk

                +1 on Mike and coherentkris

                 

                The best way to move data from table to table or record to record is to modify your design so that you don't have to. If you can reference the data rather than copy it, you avoid any delay caused by copying the data, and you don't have the same data redundantly stored in more than one location, which would force you to perform multiple identical updates when the data later must be changed.

                 

                That's not to say that there aren't very good reasons for having to make such data transfers such as:

                • "Syncing" data from a mobile device (including laptops) back to a host file after new and updated data has been collected while the device was not connected to the host.
                • Copying data to preserve a "snapshot" of the values at a specific point in time.
                • Copying data as a "starting point" to save time during data entry, where the copy is then modified and represents a new and distinct record.
                1 of 1 people found this helpful
                • 5. Re: Transfer data between tables efficiency
                  ChrisCourtney-Stones

                  Hi, the performance issues are what I am trying to determine.

                  Workflow:

                  Monthly contract service: a new quotation is required every month to allocate a new Purchase Order and show any changes to the installation. Up to 200 products across 15-20 departments.

                  From Quote Table - Produce Department and Department Items

                  From Dept Table - Produce Job Cards by Department and Department Items to be serviced

                  When performing a service, the service team may or may not be working in separate departments. The item being serviced is scanned and any parts used in the service are recorded. If an item is damaged or missing, a replacement action is recorded; if an item requires a follow-up, a follow-up action is recorded. When the service team is re-assembled, the data is synced and a check is performed to ensure all items have been serviced. A report is then produced for the client to sign, showing any missing or broken items and any action reports recorded. Then comes the good bit.

                  Once the signature is captured, the report with the signature is emailed to the client, and a check is made for any replacement items required. If any are found, the data for those items is captured and a new quotation is given to the client for the cost of those replacements. As we are required to replace those items during the service, they are actually installed by that point, so we have to get the client to accept the quote while we are still on site. The system then checks for any action follow-ups; those items are captured, a zero-value quote is produced and accepted under the old Order No, and new job cards are produced to tell the client that we will be returning in 2, 5, 7 or 14 days to complete the required action, depending on the action report.

                  So the data being copied is:

                  Quotation to Department and Department Items

                  Department and Department Items to Job Card and Job Card Items

                  Job Card and Job Card Items to Quotation and Quote Items

                  Rinse and repeat

                   

                  Once all that is finished, the system duplicates the last contract quote. The service officer checks it for any changes required for the following month and gives it to the client so the client can produce a new Order No.

                  • 6. Re: Transfer data between tables efficiency
                    ChrisCourtney-Stones

                    Hi,

                    As the materials being used are classified as dangerous, every change to an item has to be recorded for legislative reasons. We have to have a new record created for each item on every service, against which we can allocate the materials used and who did what and when, so it is the "snapshot" element that is vital.

                    • 7. Re: Transfer data between tables efficiency
                      coherentkris

                      If you need to capture all data changes for tracking/auditing purposes, you should look at UltraLog by NightWing: NightWing Enterprises - UltraLog v2.x Audit Logging Demo for FileMaker Pro

                      1 of 1 people found this helpful
                      • 8. Re: Transfer data between tables efficiency
                        Mike_Mitchell

                        If a history is what you need, then lookups are fine. (I tend to prefer auto-enter calculations because they're more flexible, but the principle is the same.)

                         

                        There is no real performance issue with using lookups that isn't present with any other method. Whatever method you use will require the source and destination records to be present in the local cache, which is the issue when remotely connected. So long as both records are already present, then lookup vs. auto-enter calc vs. import is basically a wash.
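                         To illustrate the difference (the table and field names here are hypothetical, not from this thread), a lookup and its auto-enter calculation equivalent might be defined like this:

                         ```
                         // Lookup: LineItems::Price copies Products::Price whenever
                         // LineItems::ProductID (the trigger field) changes.

                         // Auto-enter calculation on LineItems::Price doing the same copy:
                         Products::Price

                         // The auto-enter version can also carry extra logic at copy
                         // time, e.g. applying a customer discount:
                         Round ( Products::Price * ( 1 - Customers::Discount ) ; 2 )
                         ```

                         Either way the value is stored in the line item record, preserving the history even if the product price later changes.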

                         

                        HTH

                        • 9. Re: Transfer data between tables efficiency
                          BruceHerbach

                          At one point I tested methods to duplicate a data set with a hierarchy. In this case it was for a painting company, and the database was used to develop an estimate. So the hierarchy was Estimate, Area, Assembly, Task, Resource.

                           

                          The Estimate was the top level, an Area is a room, an Assembly is a collection of tasks required to paint the area, and a Resource is a material being used, i.e. paint, paint brushes, etc.

                           

                          The fastest method seemed to be export/import. Next was NightWing's Duplicate Hierarchy demo: www.nightwingenterprises.com/demosX/demoX06.html

                           

                          As part of the experimentation/testing for this, I found that setting fields from variables seemed to be a very quick method of creating a new record set.

                           

                          These days, you can use ExecuteSQL to pull all of the data into a variable. Then, in a loop, go through the variable, using Substitute to turn each row into a return-separated list and creating a variable for each field. Then use New Record/Request to create the record and populate all of the fields.
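                          A minimal sketch of that pattern follows. The table and field names ("SourceItems", "TargetItems", Name, Qty, Price) and the "|" column separator are hypothetical choices for illustration:

                          ```
                          # Pull all rows into $data, columns separated by "|",
                          # rows separated by returns (¶).
                          Set Variable [ $data ; Value: ExecuteSQL (
                             "SELECT Name, Qty, Price FROM SourceItems" ; "|" ; "¶" ) ]
                          Go to Layout [ "TargetItems" ]
                          Set Variable [ $i ; Value: 1 ]
                          Loop
                             Exit Loop If [ $i > ValueCount ( $data ) ]
                             # Turn one row ("Name|Qty|Price") into a return-separated list
                             Set Variable [ $row ; Value:
                                Substitute ( GetValue ( $data ; $i ) ; "|" ; "¶" ) ]
                             New Record/Request
                             Set Field [ TargetItems::Name ; GetValue ( $row ; 1 ) ]
                             Set Field [ TargetItems::Qty ; GetValue ( $row ; 2 ) ]
                             Set Field [ TargetItems::Price ; GetValue ( $row ; 3 ) ]
                             Set Variable [ $i ; Value: $i + 1 ]
                          End Loop
                          ```

                          One caveat with this approach: if any field data can itself contain the column separator or a return, a more robust separator (or Quote/Evaluate handling) is needed.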

                           

                          If you look at NightWing's web site,  I believe there is an example of doing this.

                           

                          HTH