52 Replies   Latest reply on Mar 20, 2014 6:50 AM by beverly

    Weight of comments and disabled script lines

    siplus

      Set iterations to at least 50'000.

       

      Run.

       

      Discuss.

        • 1. Re: Weight of comments and disabled script lines
          siplus

          Try this alternative, too.

          • 2. Re: Weight of comments and disabled script lines
            siplus

            What I'm trying to demonstrate is that

             

            for the - I hope - few of us who absolutely must run time-dependent / time-critical scripts on a large set of records,

             

            • it's better to avoid comments in the script (use a twin commented script with "doc only" in the name)

             

            • if you want comments, it's better to use a disabled Let() than a real comment...

            • 3. Re: Weight of comments and disabled script lines
              mikebeargie

              As you set it up I got 2742, 3498, 3071

               

              HOWEVER, you are using the Random function, which I think might also have subtle nuances in timing.

               

              I removed the random part in a copy (just set $whoCares to 10000 instead of Random * 10000) and got 2669, 2533, 2405.

               

              Also, I added a third option that leaves the random factor in place but adds a refresh that clears external data before the execution of each script. With that refresh, I now get 3050, 3334, 2795.
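
              Not FileMaker, obviously, but here's a minimal Python sketch of the same measurement idea (the loop body of $whoCares = Random * 10000 is assumed from the description above, and the absolute numbers won't match FileMaker's): time the loop body with and without the random call, so you can see how much of the total, and how much of the run-to-run spread, comes from the body itself rather than from any comment or disabled-step overhead you're trying to measure.

                import random
                import time

                ITERATIONS = 50_000

                def time_loop(body):
                    # Run body() ITERATIONS times and return elapsed milliseconds.
                    start = time.perf_counter()
                    for _ in range(ITERATIONS):
                        body()
                    return (time.perf_counter() - start) * 1000

                def with_random():
                    # analogous to Set Variable [$whoCares; Value: Random * 10000]
                    who_cares = random.random() * 10000

                def without_random():
                    # same step with the random call replaced by a constant,
                    # so its own cost and noise drop out of the measurement
                    who_cares = 10000

                if __name__ == "__main__":
                    for label, body in (("with Random", with_random), ("constant", without_random)):
                        # repeat each variant a few times; the spread shows machine noise
                        print(label, [round(time_loop(body)) for _ in range(3)])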

               

              There could be a lot that depends on your system itself, and I'd imagine the RAM and processor allocation between FM/OS/other apps varies wildly during any given runtime. Just for kicks, here's your original script run 5 more times:

               

              3270,3749,2796

              3130,3281,2748

              3001,3238,2742

              3025,3252,2784

              3070,3306,2799

               

              If I quit all apps except FM and Chrome, I get:

              3030,3215,2752

              3069,3301,2733

              etc... so it appears that Mac OS X and FM are pretty decent at handling processor/RAM affinity.

               

              Attaching my modifications.

              • 4. Re: Weight of comments and disabled script lines
                mikebeargie

                Personally, running 5 comment lines * 50,000 iterations with only a variation of <1 second over no comments doesn't seem like a big deal to me.

                 

                That's 250,000 comment line reads, and only <1 second of variance. Meaning that in an extra minute, you could get 15,000,000 comment line reads (assuming no degradation over time, which I HAVE experienced in FM for long-running scripts).
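
                The arithmetic, spelled out (assuming 5 comment lines per iteration, which is what the "5 * 50,000" above implies):

                  comment_lines = 5                     # comment lines read per iteration
                  iterations = 50_000
                  reads = comment_lines * iterations    # 250,000 comment-line reads per run
                  overhead_seconds = 1                  # observed variance was under a second
                  print(reads, reads * 60 / overhead_seconds)   # 250000 reads, 15,000,000 per extra minute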

                 

                My final thought is that if your script is long enough, or run with enough frequency that it makes a difference, by all means separate it into a production only copy and optimize the hell out of it.

                 

                Otherwise, for 99.9% of my practical scripting purposes, comments assist more than they detract.

                 

                Here's a good read as well:

                http://xkcd.com/1205/

                 

                For instance, if you have a script you run once daily and optimize it to be one second faster, you save about 30 minutes over the course of five years. Maybe it's not that big of a deal?
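
                The math behind that figure, roughly (ignoring leap days):

                  seconds_saved_per_run = 1
                  runs = 365 * 5                  # once daily for five years
                  print(seconds_saved_per_run * runs / 60)   # about 30.4 minutes saved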

                • 5. Re: Weight of comments and disabled script lines
                  mikebeargie

                  PS - if it IS a big deal, man, what the heck are you doing with FileMaker? Most people going off into that territory, or with that need, are working with much faster DB platforms.

                  • 6. Re: Weight of comments and disabled script lines
                    siplus

                    Dear Mike,

                     

                     

                    There is a silent minority out there which stretches FileMaker's functionality and competence into new territories on a daily basis. I think I'm part of it - we placed our bets on FM many, many years ago, and we like to think that anything and everything can be done with this wonderful piece of software. As much as we love it, we do explore its boundaries, limits and possible improvements - this discussion is an example. Personally, I base my living on FileMaker, on delivering and covering what our clients are asking for. In a world that wants everything immediately, we do have to fit and optimize our solutions; if some ideas and valid points crystallize and gain momentum, catching FileMaker's own architects' interest, everybody has something to gain - that's the whole point.

                    • 7. Re: Weight of comments and disabled script lines
                      siplus

                      And by the way, Mike, what's your take on this?

                       

                      (Hint: hit RUN more than once)

                      • 8. Re: Weight of comments and disabled script lines
                        wimdecorte

                        No special take on this...  A better test would be to run these 3 things separately, quitting out of FM after each run to get rid of any cached data.

                         

                        What it demonstrates is what has been discussed many times, including at DevCon: ExecuteSQL() is not always faster than other native FM mechanisms.  What was your expectation?

                        • 9. Re: Weight of comments and disabled script lines
                          keywords

                          Interesting test. I'd never really thought about the possible impact of comments or leaving disabled lines in place. As Mike points out, not a lot in the grand scheme of things, but nonetheless worth exploring a little. To speed things up a bit I found 10,000 was plenty of loops to show a discernible difference. My first observation is that the scripts never execute in exactly the same time. I tried shuffling the order, but that was not a significant factor. Then I tried making all three scripts identical, and there was still an execution time difference. How come?

                           

                          Anyway, for what it's worth, here's a series of tests I ran, with the time result for the altered script only:

                           

                          Test 1 ($whoCares variable disabled, #comments in place):  time—486

                          Test 2 ($whoCares variable disabled, #comments in place, add SetField () step after each comment):  time—811

                          Test 3 ($whoCares variable enabled, #comments in place):  time—941

                          Test 4 ($whoCares variable disabled, #comments disabled):  time—518

                          Test 5 ($whoCares variable disabled, #comments removed):  time—453

                          Test 6 ($whoCares variable removed, #comments removed):  time—435

                          Test 7 ($whoCares variable removed, #comments increased from 5 lines to 20):  time—752
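
                          For what these single-run numbers are worth, dividing the difference between Tests 6 and 7 by the number of comment-line reads gives a rough per-line cost (the units are whatever the timer in the test file reports, and this assumes the only change between the two tests is the 20 comment lines):

                            loops = 10_000
                            comment_lines = 20
                            t_no_comments = 435      # Test 6
                            t_with_comments = 752    # Test 7
                            per_read = (t_with_comments - t_no_comments) / (comment_lines * loops)
                            print(per_read)          # roughly 0.0016 timer units per comment-line read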

                           

                          Conclusions:

                          (1) a disabled line of script has an impact on execution time, albeit small, so while we all disable code during script development, it's best not to leave disabled code in place once you are sure it's not needed.

                          (2) comments also have an impact on execution time (interesting that they have an even bigger impact if disabled!), so use them to the extent that they are needed, but be aware that they slow script execution, albeit infinitesimally per line; collectively it can become significant.

                          • 10. Re: Weight of comments and disabled script lines
                            jbante

                            Optimizing applications and pushing FileMaker's limits are both worthy causes, but there's value in keeping the bigger picture of any optimization in mind before making recommendations.

                             

                            A sophomoric approach might demand that we start by identifying the bottlenecks in our current solutions so we can focus our efforts on improving the most problematic spots, and only turn our attention to optimizing non-bottlenecks when we've exhausted our imaginations for alternative solutions to the bigger problems — optimize our optimization efforts. You must have done a lot of optimization work if comments and disabled script steps are the most fruitful place you have left to search for speed improvements! I imagine your results from some of that other work might have broader appeal, and I'd love to see it on the forums.

                             

                            A not-much-wiser approach would suggest that we start by understanding and balancing the priorities of a solution — figure out what needs to be optimized. You appear to have found an example of the infrequent scenario where improving execution speed and improving clarity for maintenance developers might be at odds with each other. We can't make general recommendations based on your findings without understanding the priorities of the solution that recommendation will be applied in.

                             

                            I don't mean to discourage the effort here. This is fascinating and useful stuff. Being strict about either approach I mention here can easily lead to pathologies similar to statistical over-fitting or over-financialization of business management. Cost/benefit analyses are most helpful when we have good reason to believe that the costs and benefits of our actions and strategies are predictable, which is often not the case in software development. (However, analyzing performance bottlenecks does show us the ceilings on how much benefit might be possible from which optimization.) So, let a thousand experiments bloom!

                            • 11. Re: Weight of comments and disabled script lines
                              wimdecorte

                              keywords wrote:

                               

                               

                              (2) comments also have an impact on execution time (interesting that they have an even bigger impact if disabled!), so use them to the extent that they are needed, but be aware that they slow script execution, albeit infinitesimally per line; collectively it can become significant.

                               

                              Ok, time to fight this because it seems to be building up to a recommendation to avoid commenting code.

                              Collectively they DO NOT become significant, because you cannot measure the impact of code commenting purely by its execution time.

                              For 99.9% of the solutions out there, avoiding comments or disabled code is going to fall in the "Premature Optimization" category (http://c2.com/cgi/wiki?PrematureOptimization)

                              As jbante mentioned, you would have to have exhausted all other optimizations before you would turn to avoiding comments and disabled code.

                              Without that proper perspective, this thread may push developers to avoid comments, resulting in obscure code, wasted development time, and a potential for errors due to obfuscation in code maintenance.

                              • 12. Re: Weight of comments and disabled script lines
                                keywords

                                I am not against commenting code AT ALL! In fact, quite the opposite. And I firmly believe that there is far more to it than execution time. Commenting scripts, calculations and field definitions adds far more value than it costs. It's just interesting to note that it does actually have a cost, albeit minuscule, that's all.

                                • 13. Re: Weight of comments and disabled script lines
                                  mikebeargie

                                  I base my living on FM as well, as do some of the heavy hitters who have commented below. But I have also coded previously, and still dabble, in other platforms, enough not to be naive about what FM's limits are. There are many great uses for FileMaker, but performance-driven applications on massive data sets with copious amounts of calculations, while expecting great speed, are not necessarily one of them. I am very much a "right tool for the right job" developer. I also know that client relationships will eventually deteriorate if you sell someone FM when it's not the right solution for them.

                                   

                                  Take for instance another technology I've played around with a bit: Hadoop. Hadoop is a crazy awesome platform, and hive computing delivers insanely fast results (like Google as-you-type search results). The ability to rapidly scale from one to one hundred, or one thousand, machines to distribute and speed up the workload is nothing short of a miracle of modern computing. I can take terabytes of data and run calculations and queries against it pretty much in realtime. But I don't see that sort of application as EVER being done in FileMaker. Yes, FM has a record limit of 64 quadrillion records, but that doesn't mean you should go index the text of every book in the Library of Congress to make a new catalog. It's just not the right tool.

                                   

                                  Everyone explores the boundaries of FileMaker in their own way. I don't want to sound like a jerk, but you're not in a minority of developers out there. There isn't a dev I've met who says "my solutions are fast enough" or "I think I've learned everything I need to know". Part of the appeal of FM is that even novice developers can build powerful solutions in no time. A junior-level developer I trained at my last job is now using techniques and strategies that some 20-year veterans I've met wouldn't be able to wrap their heads around, and he's only two years into ANY programming platform.

                                   

                                  As Jeremy suggests, most developers here may not even consider comments an issue. As Wim suggests, trying to introduce a coding standard that says you should either forego commenting or keep double copies of scripts for performance might also be dangerous (I'd actually bump Wim's number up to 99.999%). As I demonstrated on your original test, slight modifications can wildly change the performance (i.e., removing the random part made the second test WITH the comments FASTER than the first one without).

                                   

                                  If you still want to continue, you really need to think thoroughly about how you can truly isolate and test for performance to see where the net gains would be. As Wim stated, execution time may not even be a valid measure of true performance. I also want to echo Jeremy's thoughts about not trying to discourage this, even though my long comment here may touch on ranting a bit. If you're trying to introduce such a drastic measure to eke out the tiniest bit of performance, while trying to gain community support for a coding practice AND the attention of the FM architects, then you need a much more thorough test, with thousands of iterations and hundreds of testers, to prove your point. Not just a casual file that was thrown together and introduced on TechNet.

                                  • 14. Re: Weight of comments and disabled script lines
                                    gdurniak

                                    Thanks for the Tip

                                     

                                    If I ever need to shave a few seconds off a huge loop, I will try it

                                     

                                    However, I would be more interested to know WHY this might happen, e.g. whether scripts are interpreted or compiled just in time
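
                                    One guess at a mechanism, purely as a toy model and not a claim about how FileMaker's engine actually works: if the script engine fetches and dispatches every step at run time, including comment and disabled steps, before discovering they are no-ops, then each such line still costs one trip through the dispatcher. A small Python sketch of that idea:

                                      import time

                                      def run(steps, iterations):
                                          # Toy step-at-a-time interpreter: every step is fetched and
                                          # dispatched, even the ones that turn out to be no-ops.
                                          start = time.perf_counter()
                                          for _ in range(iterations):
                                              for kind, payload in steps:
                                                  if kind == "set":
                                                      value = payload            # stand-in for real work
                                                  elif kind in ("comment", "disabled"):
                                                      pass                       # still paid for fetch + dispatch
                                          return round((time.perf_counter() - start) * 1000)

                                      plain = [("set", 10000)]
                                      commented = [("comment", "why")] * 5 + [("set", 10000)]

                                      print(run(plain, 50_000), run(commented, 50_000))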

                                     

                                    This would make a good Under the Hood Topic at DevCon

                                     

                                    greg

                                     

                                     

                                    > What I'm trying to demonstrate is that

                                     

                                    for the - I hope - few of us that must absolutely run time-dependent / time-critical scripts on a large set of records,

                                     

                                    • it's better to avoid comments in the script (use a twin commented script with "doc only" in the name)

                                     

                                    • if you want comments, it's better to use a disabled Let() than a real comment...
