10 Replies Latest reply on Sep 28, 2016 11:48 AM by monkeybreadsoftware

    Copying data between tables using PSoS

    ndveitch


      I need to find out what would be the best way to copy data between 2 tables using the PSoS. I have a lines table that I want to copy the last 12 months wroth of data to a temp table for reporting. The reason for this is because I have a ton of calculation fields to get the values for the individual months, such as the date headings,  amount values per month and so on.

       

      I originally had it so the script would go to the lines table, find the last 12 months' worth of records up to whatever date the user requested, then go to the temp table, remove any records the user might have previously imported, and import a fresh copy. This works, but importing 1000 records takes anywhere from 25-45 minutes. As you can imagine, this is not really ideal. So I decided to try PSoS, only to find out later that you can't import from FM to FM when running on the server. I then tried CSV, but had trouble with the import because it wasn't importing to the correct fields. Next I tried building variable lists of the key fields, going to the temp table, creating new records from the lists, and having the database update the required fields via relationships, but that also took some time. Now I am using Export Records to .xlsx, then Import Records using the same path. It seems to be working but is still slow.

       

      This is the script step that calls the PSoS script. zg_Date is a global date field where the user enters the end date they require.

       

      Perform Script on Server [ Wait for completion ; “Catches by month - Server” ; Parameter: Get ( AccountName )  & ¶ &  Catches::zg_Date ]

       

      This is my PSoS script that is called when the report is run.

       

      Set Variable [ $Name ; Value: GetValue ( Get ( ScriptParameter ) ; 1 ) ]

      Set Variable [ $Date ; Value: GetValue ( Get ( ScriptParameter ) ; 2 ) ]

       

      # Using the $Date variable, go to the Lines table and search for the required date range for the last 12 months up to the global date

       

      # Create variables to be used for the date search in the Lines table

      Set Variable [ $Var1 ; Value: Date ( Month($Date) ; 1 ; Year($Date) - 1 ) ]

       

      Enter Find Mode []

      Go to Layout [ “All fields Received Lines” (Received_Received Lines) ]

      Set Field [ Received::Date ; $Var1 & "..." & $Date ]

      Set Field [ Received::Stock ; "==Equipment" ]

      Set Field [ Received::zc_sort ; "==Boxed" ]

      Perform Find []

       

      # Should there be no records, exit the script and pass the result so the calling script knows there was an error

      If [ Get ( FoundCount ) = 0 ]

      Exit Script [ Result: "No Records" ]

      End If

       

      # Setup Path for Export File

      Set Variable [ $Path ; Value: Case (   Get ( SystemPlatform ) = -2 ; "filewin:" ;  ""  )  &  Get ( DocumentsPath ) & "List.xlsx" ]

      If [ Get ( LastError ) <> 0 ]

      Exit Script [ Result: "Var" ]

      End If

       

       

      # Export to excel

      Export Records [ No dialog ; “$Path” ; Unicode (UTF-16) ]

      If [ Get ( LastError ) <> 0 ]

      Exit Script [ Result: "Export" ]

      End If

       

      Refresh Window []

       

      # Import from excel export

       

      Enter Find Mode []

      Go to Layout [ “All Fields Catches” (Catches) ]

      Set Field [ Catches::UserName ; $Name ]

      Perform Find []

      If [ Get ( FoundCount ) > 0 ]

      Delete All Records [ No dialog ]

      End If

      Import Records [ No dialog ; Source: "$Path"; Worksheet: "" ; Add; Unicode (big endian) ]

      Replace Field Contents [ No dialog ; Catches::UserName ; $Name ]

       

      If [ Get ( FoundCount ) = 0 ]

      Exit Script [ Result: "No Records" ]

      End If

       

       

      Exit Script []

       

      From here it goes back to the calling script. This is the first time I am playing around with PSoS, so I am not sure if I have done it right, but it still takes a long time. Any ideas on what I should look for to see why the import is taking so long?
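      The date-window calculation in the script above (first day of the same month a year earlier, through the user's end date) can be sketched in Python; the m/d/y formatting is an assumption about the file's date format:

```python
from datetime import date

def range_start(end_date: date) -> date:
    """First day of the same month one year earlier, mirroring
    FileMaker's Date ( Month($Date) ; 1 ; Year($Date) - 1 )."""
    return date(end_date.year - 1, end_date.month, 1)

def find_criteria(end_date: date) -> str:
    """Build the '...' range string used in the Set Field step."""
    start = range_start(end_date)
    return (f"{start.month}/{start.day}/{start.year}"
            f"...{end_date.month}/{end_date.day}/{end_date.year}")

print(find_criteria(date(2016, 9, 28)))  # 9/1/2015...9/28/2016
```

      Note that this window spans the whole start month plus the partial end month, so it actually covers slightly more than 12 calendar months.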

        • 1. Re: Copying data between tables using PSoS
          philmodjunk

          Try exporting to a Merge file. This includes the field names as column headers and then you can use matching names as part of the import.

           

          But the CSV option should also have worked here, as you can manually align the columns of CSV data to FileMaker fields as part of the Import Records script step.

           

          Note that server-side exports and imports are also restricted to a very short list of allowed locations on the server (the server's Documents and Temporary folders).
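          The Merge format suggested above is essentially comma-separated text whose first row holds the field names, which is what lets the import match columns by name. A minimal Python sketch of building such a file's contents (the field names are illustrative, not from the thread):

```python
import csv
import io

# A Merge (.mer) file is essentially CSV whose first row holds the
# field names, letting FileMaker's import match columns by name.
records = [
    {"Date": "9/1/2015", "Stock": "Equipment", "Qty": 3},
    {"Date": "9/28/2016", "Stock": "Equipment", "Qty": 1},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Date", "Stock", "Qty"],
                        lineterminator="\n")
writer.writeheader()        # field names become the header row
writer.writerows(records)
merge_text = buf.getvalue()
print(merge_text.splitlines()[0])  # Date,Stock,Qty
```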

          • 2. Re: Copying data between tables using PSoS
            David Moyer

            Hi,

            how many fields are you importing?  Are the fields in your temp reporting table indexed?  Are there calculations in there?

            Turning indexing off might speed up the import a bit.

            • 3. Re: Copying data between tables using PSoS
              ndveitch

              Maybe I should try CSV again; only after playing around with the Excel export did I get the fields to line up. When I tried CSV the first time, only one column was getting imported.

               

              I will try the Merge file and see if that works.

              • 4. Re: Copying data between tables using PSoS
                ndveitch

                The main reason I want to use a temp table is that I have a bunch of calculation fields and I didn't want to add more calculations to my receiving lines table. This report is really only run when the user wants to see the last 12 months, and I thought having a bunch of calculation fields in the table that are only needed now and again would put unnecessary load on the DB.

                 

                I am importing 14 fields, but the 2 key fields I import are linked to the respective parent tables to update about another 10 fields. Then, based on the date field, I have 36 fields that display the data by date: one set of 12 for the column headings per month, one set of 12 for the value amount sold per month, and one set of 12 for the quantity per month.
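                Those 3 x 12 sets of month fields amount to pivoting date-ordered detail rows into per-month columns. A minimal Python sketch of that pivot, with hypothetical line records:

```python
from collections import defaultdict
from datetime import date

# Hypothetical line records: (date, value, qty).  Pivot the detail
# rows into one bucket per calendar month, like the 12-column sets
# of value and quantity fields described above.
lines = [
    (date(2016, 8, 3), 100.0, 2),
    (date(2016, 8, 20), 50.0, 1),
    (date(2016, 9, 5), 75.0, 3),
]

value_by_month = defaultdict(float)
qty_by_month = defaultdict(int)
for d, value, qty in lines:
    key = (d.year, d.month)          # one bucket per calendar month
    value_by_month[key] += value
    qty_by_month[key] += qty

print(value_by_month[(2016, 8)], qty_by_month[(2016, 9)])  # 150.0 3
```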

                • 5. Re: Copying data between tables using PSoS
                  philmodjunk

                  If these are unstored calculation and/or summary fields, they will not put a load on your DB except when you pull up a layout that contains them. You can further reduce the "load", if, on layouts where you put these fields, you take steps to keep the found set limited--such as removing "show all" and "show omitted only" from the menu plus limiting the user to your own scripted find options.

                   

                  If they are stored calculations, there is some added load due to the increased width of your table. Anytime your client fetches a record from the table, it fetches all the fields and thus these stored values represent more data being fetched with each record. You can reduce that load if you put those calculations into a related table so that they are only fetched on layouts that reference at least one of those fields--of course this can turn your calculations into unstored calcs so you may gain some benefit on some layouts and lose it on others where you'd now have to wait for the calcs to update on each record.

                  • 6. Re: Copying data between tables using PSoS
                    David Moyer

                    Not that this will speed up your reporting time, but here's a technique I've used for reporting (simplified):

                    - the temp table contains a key_Number field which is related to the main table's primary key

                    - only the primary key is imported into the temp table

                    - the temp table contains as many text fields for export as columns required for the report

                    - the temp table contains a number field called SortOrder (used to arrange the rows for output)

                    - a script populates the text output fields, iterating through each record in the import

                    - headers, trailers, summaries and blank lines are all created and populated via the script

                    - the fields are exported to .xlsx, because my customers loved Excel
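                    The steps above can be sketched in Python; the row layout and names are illustrative, assuming a simple item/quantity report:

```python
# Sketch of the temp-table reporting technique: one text row per
# output line, a SortOrder number to arrange them, and header and
# trailer rows created by the same script.
def build_report(records):
    rows = []                                # (sort_order, columns...)
    rows.append((0, "Item", "Qty"))          # header row
    total = 0
    for i, (item, qty) in enumerate(records, start=1):
        rows.append((i, item, str(qty)))     # one detail row per record
        total += qty
    rows.append((len(records) + 1, "Total", str(total)))  # trailer row
    return [r[1:] for r in sorted(rows)]     # SortOrder arranges output

print(build_report([("Boxed", 3), ("Loose", 2)]))
```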

                    • 7. Re: Copying data between tables using PSoS
                      ndveitch

                      Thank you so much for the advice. Looks like I am going to need to go and have a re-think about my table and script.

                      • 8. Re: Copying data between tables using PSoS
                        philmodjunk

                        If your report uses summary fields to compute subtotals based on different groups of records, you can sometimes produce dramatically faster reports by setting up a summary table with pre-calculated subtotals. The summary table is created by a script and has one record for each sub-total with the subtotals stored in simple number fields. This works best for data that is rarely if ever updated after it reaches a certain "status" such as sales data after the transaction has been finalized (paid for).
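                        A minimal Python sketch of this summary-table idea, assuming hypothetical (month, amount, status) sales tuples:

```python
from collections import defaultdict

# Sketch of the pre-calculated summary-table idea: one record per
# subtotal group, stored as plain numbers, built only from finalized
# ("paid") transactions.  Field and status names are illustrative.
def build_summary(sales):
    """sales: iterable of (month, amount, status) tuples."""
    summary = defaultdict(float)
    for month, amount, status in sales:
        if status == "paid":        # only finalized data is summarized
            summary[month] += amount
    return dict(summary)

sales = [("2016-08", 100.0, "paid"),
         ("2016-08", 50.0, "paid"),
         ("2016-09", 75.0, "pending")]
print(build_summary(sales))  # {'2016-08': 150.0}
```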

                        • 9. Re: Copying data between tables using PSoS
                          coherentkris

                          In addition to the excellent advice already given, you should consider virtual list reporting if you are going to rethink things.

                           

                          https://filemakerhacks.com/2016/04/27/virtual-list-reporting-part-1/
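                          The core of the virtual-list technique, for readers unfamiliar with it: report data lives in a script-built list and each display row reads its own entry by record number. A minimal Python sketch (in FileMaker the list would be a $$variable and `cell` an unstored calculation):

```python
# Report data lives in an in-memory list (in FileMaker, a global
# $$variable of return-delimited values); each display row simply
# reads "its" entry by record number.  Contents are illustrative.
report_rows = ["Item\tQty", "Boxed\t3", "Loose\t2"]   # built by a script

def cell(record_number: int, column: int) -> str:
    """What an unstored calc would show for a given row and column."""
    row = report_rows[record_number - 1]    # records are 1-based
    return row.split("\t")[column]

print(cell(2, 0), cell(2, 1))  # Boxed 3
```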

                          • 10. Re: Copying data between tables using PSoS
                            monkeybreadsoftware

                            You can use FM.SQL.InsertRecordsToSQL to do a SQL select in one table and insert it into another table.

                             

                            See examples here:

                            Monkeybread Software - MBS FileMaker Plugin: FM.SQL.InsertRecordsToSQL

                             

                            It can copy from one table to another. The tables can be in different files, and the field names don't need to match.
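                            The underlying idea, one SQL statement that selects from a source table and inserts into a target, can be sketched with Python's built-in sqlite3. Table and field names here are illustrative, and this is a stand-in for the general technique, not the MBS plugin's actual API:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE lines   (item TEXT, amount REAL, sold DATE);
    CREATE TABLE catches (product TEXT, value REAL);  -- names differ
    INSERT INTO lines VALUES ('Boxed', 100.0, '2016-08-03'),
                             ('Boxed', 50.0, '2015-01-01');
""")
# One statement selects from the source table and inserts into the
# target; column names need not match, only their positions.
conn.execute("""
    INSERT INTO catches (product, value)
    SELECT item, amount FROM lines WHERE sold >= '2015-09-01'
""")
print(conn.execute("SELECT COUNT(*) FROM catches").fetchone()[0])  # 1
```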
