
    Slow performance

    miw

      I have a relational database that prices inventory.  There are multiple tables, as inventory pulls information from several tables to determine costing.  For example, inventory components can be priced off of precious metal values (one table), MSRP (another table), fair value (another table), etc.  The user can select which value they would like to see in the inventory window and related inventory tables.  The calcs are mostly either unstored numeric calcs or summary fields in the inventory table.  Once I get to a few hundred inventory items, processing slows down dramatically.  I have eliminated unused calcs and summary fields.  Any suggestions on speeding things up?  I would even be willing to sacrifice immediate updates on inventory item valuation and total inventory valuation and instead calc on demand.  Anyone have some good ideas to help?

        • 1. Re: Slow performance
          philmodjunk

          I'd need to have a more detailed understanding of your design.

          • 2. Re: Slow performance
            miw

            Please see the attached relationship table. The Coin table is where I store specific inventory based on year, mint mark, and the coin's TypeDenom.  The TypeDenom table includes the coin's precious metal content, which is used to derive a value from the Bullion type and value in the Bullion table.  Denominations are linked to TypeDenom and provide the face value of the coin; the Country table is also linked to TypeDenom, and between these two tables the face value in the TypeDenom table is valued at an exchange rate noted in the Country table.  In the Coin table, a collector value for the specific coin can be input.  The different values roll up into the specific Coin in the Coin table via unstored calc fields and/or summary fields.  I have a global selection button that allows me to determine values for each coin, and the total for inventory coins, by selecting metal value, collector value, face value at an exchange rate, or the highest value among these.  There are numerous other tables linked to the Coin table, but no calcs are done because of them.
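
            (For illustration, a selector like that is often implemented as an unstored calc that branches on a global choice field. This is only a sketch; the field names gValueChoice, cMetalValue, CollectorValue, and cFaceValueAtFX are made-up placeholders, not the actual fields in this file:

            // cSelectedValue (unstored calculation, result: Number) -- sketch only
            Case (
                Coin::gValueChoice = "metal" ; Coin::cMetalValue ;
                Coin::gValueChoice = "collector" ; Coin::CollectorValue ;
                Coin::gValueChoice = "face" ; Coin::cFaceValueAtFX ;
                // default choice "highest": largest of the three values
                Max ( Coin::cMetalValue ; Coin::CollectorValue ; Coin::cFaceValueAtFX )
            )   )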

            • 3. Re: Slow performance
              philmodjunk

              How is the layout designed? Is this a summary report with a column for each valuation calc, plus summary fields to compute totals and subtotals?

              One analysis trick would be to make copies of this layout and remove different fields from the copies to see which ones take the most time to update.

              • 4. Re: Slow performance
                miw

                Essentially what you described in the first sentence.  The layout is designed with multiple tabs.  For instance, the Country, Denomination, and TypeDenom tabs each include a tab that provides detail for the active record in that table.  Then there is a tab that summarizes, for each record in that table, the Coins belonging to it (for example, a portal that summarizes for each Country the total number of coins and their selected value).  There may be other ancillary tabs as well.

                • 5. Re: Slow performance
                  philmodjunk

                  I know of a way to speed up such aggregate calculations, but I have my doubts that it can be used with your data. It works best with data that is rarely, if ever, updated after it is initially entered, and it would seem that the rates used in your valuation calculations change on a daily basis. The trick is to compute needed totals or subtotals in advance so that you do not need to compute aggregate values from a massive number of records. It might be possible to set up a script that runs after business hours to recompute such totals, tying up your system at a time when you are not using it. But that assumes such a system can either get the values it needs on its own, or run after you input new rate data just before you quit for the day.

                  • 6. Re: Slow performance
                    miw

                    PhilMod:  I would definitely like to know what your method is for speeding up aggregate calculations.  After initial data is input, the rates used in valuation calculations do not automatically change during the day, but an update could be set to run at night. Please let me know what you think may be a good fix...

                    • 7. Re: Slow performance
                      philmodjunk

                      For each unstored calculation field, define a corresponding number field. Then use Replace Field Contents, or a looping script with Set Field, to copy the value of each unstored calculation field into its number field. Set up your summary fields to summarize the number fields in place of the unstored calcs, and replace any unstored calculation fields on your report layouts with their corresponding number fields.
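
                      As a rough sketch, the overnight version of that script might look like the following. The field names (the stored Coin::MetalValueStored and the unstored calc Coin::cMetalValue) are placeholders for your own fields:

                      # Scheduled to run after hours (e.g., via a FileMaker Server script schedule)
                      Go to Layout [ "Coins" (Coin) ]
                      Show All Records
                      # Copy each unstored calc into its stored companion field
                      Replace Field Contents [ Coin::MetalValueStored ; Replace with calculation: Coin::cMetalValue ] [ No dialog ]
                      # ...repeat for each valuation field, or use the looping form instead:
                      #   Loop / Set Field [ Coin::MetalValueStored ; Coin::cMetalValue ] /
                      #   Go to Record/Request/Page [ Next ; Exit after last ] / End Loop
                      Commit Records/Requests [ No dialog ]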

                      • 8. Re: Slow performance
                        miw

                        I was hoping to avoid creating stored fields alongside all the unstored calculations, but your method significantly speeds up processing.  Rather than a looping script as you suggest, though, I created a global field that can be toggled between three modes:

                        - "on": Replace Field Contents runs dynamically, immediately, whenever record changes are made (slowest, since the calcs are performed for everything even if variables haven't changed).

                        - "off": Replace Field Contents runs only for the active record and the related grand totals (fast and accurate, so long as no calculation methods or global variables change).

                        - "on demand": the update of all records (the time-consuming part) is performed only when requested, which is really only needed if the calculation method or a global assumption changes (fastest, but shows stale calculations until executed).

                        So for each number field whose contents get replaced with the unstored calculation value, the replace logic first checks the global toggle to determine which of these modes applies.
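
                        A sketch of how that toggle could drive the updates, using an OnRecordCommit script trigger; the global Globals::gUpdateMode and the other field names here are made-up placeholders:

                        # OnRecordCommit trigger script; gUpdateMode holds "on", "off", or "on demand"
                        If [ Globals::gUpdateMode = "on" ]
                            # Recompute every record immediately (slowest path)
                            Show All Records
                            Replace Field Contents [ Coin::SelectedValueStored ; Replace with calculation: Coin::cSelectedValue ] [ No dialog ]
                        Else If [ Globals::gUpdateMode = "off" ]
                            # Update only the record being committed; grand totals summarize the stored fields
                            Set Field [ Coin::SelectedValueStored ; Coin::cSelectedValue ]
                        Else
                            # "on demand": leave values stale; a separate button script runs the full update
                        End If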

                        • 9. Re: Slow performance
                          philmodjunk

                          The original concept is for a scheduled script to do the processing during "down time" so that your searches and sorts no longer need to reference unstored/unindexed fields, and summary field/aggregate calcs can likewise reference the stored values.

                          It's only "faster" in two respects:

                          It does the "slow part" when no one is watching.

                          It does a lot of the needed "number crunching" just once rather than every time a report is pulled up.