7 Replies Latest reply on May 20, 2013 8:24 AM by taylorsharpe

    Slow performance

    k.hoornweg

      Trying to find some help on this.

      We are a commercial organization and have been using FM for more than 10 years.

      Currently we are using FM11 on an FM9 server with more than 20 users.

      Our solution consists of more than 60 files, from simple to complex (over 1 million records), with many, many relationships.

      We are still developing on a daily basis, and all company processes rely on FM.

      More and more we experience slow performance, even when opening a simple table.

      I did a lot of research and know all the tips: avoid unstored calculation fields, index as much as possible, use blank layouts, use the maximum cache, etc., but even so, FM slows down too often. BTW, it is not a slowdown with a coffee-cup cursor; the whole Windows system hangs.

      So I'm asking around and looking for more tips and possible courses.

       

       

      An example of slow performance:

      In one file I have more than 50 tables. In this file there are more than 250 table occurrences. Many of them are related to other files.

      I recently created a new table with NO unstored calculation fields and all fields indexed. The table (only 100 records) consists of the following fields:

      Customer_id (number)

      Product1_id (global)

      Product2_id (global) up to Product40_id

      Sales (number)

      Month (global/number)

      Year (global/number)

       

      I created 40 different product fields because I would like to use the same records for multiple users at the same time, so each user can choose his own products to check the sales quantities for these customers.

       

      This table is related to our SALES table, which has over 1 million records and is located in another file.

       

      I created an update script to update the field SALES by using the global fields, as I would like to be flexible in choosing the product_ids.

       

      The update script itself works properly within a reasonable time (approx. 10 seconds).

       

      However, to open the table I often experience a delay of 30 seconds or more.

       

      I open the table with the following script:

      New Window
      Go to Layout [blank layout]
      Show All Records
      Sort Records [Customer_id]
      Go to Layout [layout showing all the data]
      Go to Record [First]

       

      So it looks like a simple process.

       

      Could it be that the other TOs (table occurrences) and tables in this same file have any influence on this process?

       

      I realize that there are many things that could slow the performance, but I hope that some of you can take me further.

       

      Thanks in advance

       

      Kees

        • 1. Re: Slow performance
          gdurniak

           What table is the "blank" layout based on?

           

           Show All and Sort might take some time

           

          also, "40 different product fields ... to use the same records for more users" sounds odd. There may be a more elegant way to do this thru a related table

           

          otherwise, your solution is not of unusual size

           

          greg

           

           > However, to open the table I often experience a delay of 30 seconds or more

          • 2. Re: Slow performance
            FileKraft

            Hi Kees,

             

             Yes, a complex TOG can slow down your solution very much. Try to prune the graph: remove TOs you don't need for the test you described and you might see a huge difference in performance.

             

             If you upgrade to FMP 12 you get more options: using ExecuteSQL lets you reduce your graph in some cases, as in the sketch below.
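
             For example, a related sum that currently needs its own TO could be fetched with one ExecuteSQL calculation. This is only a sketch; the table and field names are placeholders for your own:

             ExecuteSQL (
                "SELECT SUM ( sales_quantity ) FROM Sales
                 WHERE customer_id = ? AND month = ? AND year = ? AND product_id = ?" ;
                "" ; "" ;
                Customer_id ; Month ; Year ; Product1_id )

             Because ExecuteSQL reads the base table directly, none of the per-product relationships are needed for such a lookup.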

             

             Also, Server 12 is much more performant than Server 9 (but before upgrading, test your converted multi-file solution if that is an option; the transition is not always as smooth as advertised).

             

             There are also webinars about how to improve performance for FM Go, which of course are applicable to FM Pro.

             

            regards and good luck

             

            Uli

            • 3. Re: Slow performance
              manjit

              Hi Kees,

               

              Your script:

               

               New Window
               Go to Layout [blank layout]
               Show All Records
               Sort Records [Customer_id]
               Go to Layout [layout showing all the data]
               Go to Record [First]

               

               The steps you are using here are simple script steps, but they are also the most time-consuming ones. As there are over 1M records, the "Show All Records" and "Sort" steps consume a lot of time. Apart from that, opening the new window to do your operation reloads the current UI screen, which has many fields and takes time to load.

               

              Solutions:

              1. Script Optimization


               To fight the performance issue, I would suggest optimizing your scripts. As your database has grown large, you need to design your scripts so that they do not open/load more records than the ones you actually want.

               

               For example, in the above script, instead of opening a new window and then navigating to the blank layout, you can use a related TO with a cross-join/constant (1 = 1) relationship, so one step does the work of three:

               GTRR (Go to Related Record), opened in a new window (see the sketch below).
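
               A fuller sketch of that one step (the TO and layout names here are made up): with a cross-join TO such as Matrix_AllRecords, related by a constant field on both sides (Tab1::const_1 = Matrix_AllRecords::const_1, or the cartesian × operator), the script becomes:

               Go to Related Record [ Show only related records ; From table: "Matrix_AllRecords" ; Using layout: "Matrix" ; New window ]
               Sort Records [ Restore ; No dialog ]   # restored sort on Customer_id

               This opens the window, lands on the right layout, and establishes the found set in one step.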

               


              2. File/Datasource Management


               This is another area to check: look for unnecessary TOs pointing to other files and remove them. When an FM file is opened, the very first thing the database engine does after validating the login credentials is to check the data sources associated with the file and try to open them. So if you have TOs that are not in current use, the files they point to will still be opened along with the current file. And when an FM file opens, its first window opens too, which loads the records of the TO its layout is based on. One more thing: make sure the very first window of every file is a layout based on a single-record table, without any fields, to reduce file-opening time (a sketch of that pattern follows).
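
               A minimal sketch of that pattern (the layout names are placeholders): point the file's opening window at a blank "Startup" layout based on a one-record utility table, and let a startup script (set in File Options) move on from there:

               # Startup script, run when the file opens
               Go to Layout [ "Main Menu" ]

               That way the expensive data layouts are only drawn once a user actually navigates to them.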

               

              3. Cleaning of Database

               

               This is an important part of keeping a database healthy. Sometimes we do database operations using a few temporary external databases, and once we are done we disconnect or remove those files but forget to remove the TOs and data source references created for the operation. Then, whenever a file or layout tries to access those TOs, FileMaker cannot find the file and keeps searching for it in the local directory, which slows the database down and often hangs the system too. So please take a look at your files and remove all those bad TO references and data source references via the Manage External Data Sources option.

               

               Secondly, sometimes we have calculation fields that are no longer used, and many indexed fields that no longer need indexing. Removing those fields and indexes also gives the database somewhat faster execution.

               

              4. Smart Backup Schedule

               

               We usually schedule backups, but what is a smart backup? A smart backup is one where you move out all the old records you no longer use but might need in the future. As the database has grown to 1M records, I assume several years of data exist in it. Many of those records are transactional, like sales/invoices and line items; they can be removed from the live database and stored in an archive file marked with a To|From timestamp. Whenever you need records from that period, you can look them up in that archive database. This saves time when finding old data and also makes your live database faster. A rough sketch of such an archive run follows.
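
               Only a sketch; the layout, field, and file names are placeholders, and you would run this on a copy first:

               Go to Layout [ "Sales" ]
               Enter Find Mode []
               Set Field [ Sales::invoice_date ; "<" & Date ( 1 ; 1 ; 2011 ) ]   # everything before 2011
               Perform Find []
               Export Records [ "SalesArchive_upto_2010.fp7" ]   # keep the archive with a To|From label
               Delete All Records []   # deletes only the found set from the live file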

               

               These are the FM database performance issues I have experienced. I have done this kind of optimization for my clients with good results. One of my clients had a huge application of 64 files with 3-4 operations that were taking 10-20 minutes to finish; after I optimized the database and scripts, the same processes executed within 10-12 seconds.

               

               Hope this reply gives you some help.

               

              with regards,

               

              Manjit Behera

              Software Developer, Mindfire Solutions, India

              http://www.mindfiresolutions.com

              email:manjitb@mindfiresolutions.com

              FileMaker11 Certified Developer

              • 4. Re: Slow performance
                k.hoornweg

                Dear Manjit,

                 Great that you took the time to send me your detailed reply. I see that you are familiar with this performance problem. Let me go deeper and ask you some more questions on this issue.

                 

                 Concerning my script, what I do is this: I open table 1 (in file A), which consists of about 150 records, by opening a new window. I do a search on these 150 records (e.g. only show records of account manager John); the result is 50 records.

                Then I fill in the Month (global) and Year (global).

                 In this file I have created 30 global fields (product_id 1, product_id 2, etc.).

                 In the next step I go to table 2 (in file B) to find the product_ids I need. This table consists of around 4,000 records. I select these records with a find script. Question: which is better/faster? 1. Using Enter Find Mode / Perform Find plus a sort, or 2. creating a TO with a preset sort?

                 After the records are found, I use a loop with Go to Record [Next] etc. to fill in the global product_id fields in table 1.

                After finishing the loop I return to file A.

                 So after that I have the 50 selected records shown vertically and the 30 global fields placed horizontally at the top of the layout, above the first record. By doing that I have created a matrix that I used to build in Excel before, by exporting from FileMaker and so on.

                 Now comes the tricky part, which might cause the performance problems. In some cases it works quite fast (say within 10 seconds), but in other cases it takes more than 1 minute (which is too long).

                 I have created 30 TOs to show the sum of the sales. The sales data is stored in table 3 (in another file, C). This file consists of over 1 million records. (BTW, thanks for your tip to back up old data in another file. I know that tip, but I have not implemented it yet; I need to do some extra development for that.)

                 To find the result of the sales data I do not need a sort (so there is no delay from that).

                 I find this sales data by using the 30 TOs.

                 The TOs are like this:

                 Customer_id (file A) = Customer_id (file C)

                 Month (global in file A) = Month (indexed in file C)

                 Year (global in file A) = Year (indexed in file C)

                 Product_ID 1 (global in file A) = Product_ID (indexed in file C)

                 

                 So I created 30 different relationships, one for each Product_ID.

                 

                 To display the sales data I have created 30 unstored numeric calculation fields (SALES 1, SALES 2, and so on).

                 Each field shows the sales data of each customer for each product for the selected period.

                 So SALES 1 = Sum ( TO 1::sales quantity ), SALES 2 = Sum ( TO 2::sales quantity ), and so on.

                 I use unstored calculation fields instead of stored fields because this function is used by different users at the same time. Each user should be able to select different customers, different product_ids, and a different period.

                 

                 This is a more detailed explanation of my process, and I hope you can give me some more advice on how to improve it.

                 

                 So I understand that when I go to table 1 in file A, FileMaker checks all data sources associated with this file. In my case there are over 200 TOs associated with tables within file A, but also many associated with other files, even with files from other databases (SQL).

                 Does FileMaker check that every time I run my script?

                 If so, I really have to do some extra development.

                 Can I solve that by placing a blank layout as the first layout of each file?

                 

                 A last question, about indexes: do indexes on fields that do not need one make the solution slower? I always create an index in case I need to sort, find, or calculate on a field, so maybe I have an overkill of indexes. Please advise me on when to use indexes and when not.

                 

                Thanks again for your time.

                 

                Kees

                 

                 


                • 5. Re: Slow performance
                  manjit

                  Hi Kees,

                   

                  Here are some responses to your comments below.

                   

                   In this file I have created 30 global fields (product_id 1, product_id 2, etc.).

                   In the next step I go to table 2 (in file B) to find the product_ids I need. This table consists of around 4,000 records. I select these records with a find script. Question: which is better/faster? 1. Using Enter Find Mode / Perform Find plus a sort, or 2. creating a TO with a preset sort?

                   After the records are found, I use a loop with Go to Record [Next] etc. to fill in the global product_id fields in table 1.

                   After finishing the loop I return to file A.

                   So after that I have the 50 selected records shown vertically and the 30 global fields placed horizontally at the top of the layout, above the first record. By doing that I have created a matrix that I used to build in Excel before, by exporting from FileMaker and so on.

                   

                   >> If your find criteria/values are available in file A, i.e. in Tab1, then there is no need to go to file B to search and sort the records; you can do the same thing while staying in file A. That can be done with the following steps.

                   

                  Solution:

                   

                   1. Create a new TO from Tab1 to Tab2 with the find criteria, let's say Tab1_Tab2__FindingAbc, then create a related value list, "Product List", based on the new relationship. This will give you the sorted list of values from Tab2.

                   

                   2. Then, in your FM script, you can read the value list and set it into a local variable, like this:

                   Set Variable [ $ProductList ; Value: ValueListItems ( Get ( FileName ) ; "Product List" ) ]


                   3. Then you can loop through the list of values and set them one by one into the globals of Tab1. That can be done dynamically as follows:

                   Set Variable [ $NoOfProducts ; Value: Count ( Tab1_Tab2__FindingAbc::ID ) ]   # number of records found in Tab2
                   Set Variable [ $Counter ; Value: 1 ]
                   Enter Find Mode []
                   Loop
                      Set Field By Name [ "Tab1::product_id " & $Counter ; GetValue ( $ProductList ; $Counter ) ]
                      Set Variable [ $Counter ; Value: $Counter + 1 ]
                      Exit Loop If [ $Counter > $NoOfProducts ]
                   End Loop
                   Enter Browse Mode []

                   

                   This script will be faster, as all the work is done through variables, within the same screen and the same file. The mode change is necessary because you have many unstored calculation fields that depend on the globals; if you modify the globals' values in Browse mode, the unstored field values reload one by one, which takes time.

                   

                   Hope this reply helps you.

                   

                  with regards,

                   

                  Manjit Behera

                  Software Developer, Mindfire Solutions, India

                  http://www.mindfiresolutions.com

                  email:manjitb@mindfiresolutions.com

                  Skype: mfsi_manjitb

                  FileMaker11 Certified Developer

                  • 6. Re: Slow performance
                    LSNOVER

                     Not sure if this is relevant to your issue, but I saw you're using a bunch of global fields.  I have found that if you can substitute $$ or $ variables for globals, it can make a big difference in performance.  If you have to use the globals for data entry, try to get the values into variables ASAP, and then update the variables in your script instead of the globals.  In a loop with many repetitions the difference can be quite dramatic (a quick sketch below).  Cheers! Lee Snover
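
                     Just a sketch of that swap, with illustrative names:

                     Set Variable [ $productList ; Value: ValueListItems ( Get ( FileName ) ; "Product List" ) ]
                     Set Variable [ $i ; Value: 1 ]
                     Loop
                        Exit Loop If [ $i > ValueCount ( $productList ) ]
                        # work with GetValue ( $productList ; $i ) here instead of reading/writing a global field on every pass
                        Set Variable [ $i ; Value: $i + 1 ]
                     End Loop

                     Only at the end, if the values must persist for relationships, write them back to the globals in one go.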

                    • 7. Re: Slow performance
                      taylorsharpe

                       I agree, Lee... plus I sure wish FileMaker had an option to accept input directly into a variable instead of having to put it into a field.  In other words, when we have a Show Custom Dialog, the input fields could alternatively be input variables, say $$Variable or whatever.  Keeping things in RAM instead of saving back to disk sure makes things go faster.