I appreciate your response, but unfortunately those instructions show how to time a script, not a calculated field. (Timing a script is the more common case, but it isn't what I was asking about.)
The good news is that I found a way to solve my own problem. It requires no scripts and no custom functions, so it should translate well to older versions of FileMaker Pro. However, I am using one of the newer Get functions, Get ( CurrentTimeUTCMilliseconds ), which returns milliseconds, so these instructions may need to be modified for versions earlier than FileMaker Pro 13.
Anyway, for those who are interested, here is the solution I came up with (tested successfully):
I have a calculated field "FirstNames" that contains the results of the calculation that I want to test. (Note that in this particular case, I wanted to compare two different ExecuteSQL statements to see which one was faster, but the same principle could apply to any calculation that you want to test.)
Instead of creating scripts and attaching triggers to them, I simply set a global variable $$StartTime at the beginning of the "FirstNames" calculated field (using a Let statement). Then I created a second calculated field, which I called "Statistics", and that calculation sets another global variable, $$EndTime (again, using a Let statement). I placed the $$EndTime assignment as the second line item in that field, immediately after a step that copies the entire contents of the "FirstNames" field into a Let variable. (Doing this forces the timer to wait until the field is fully populated.)
Following is the calculated field called "FirstNames":
Let ( [
    VAR1 = "SELECT statements go here..." ;
    $$StartTime = Get ( CurrentTimeUTCMilliseconds )
] ;
    ExecuteSQL ( VAR1 ; "" ; "" )
)
Following is the calculated field called "Statistics":
Let ( [
    VAR1 = FirstNames ;
    $$EndTime = Get ( CurrentTimeUTCMilliseconds ) ;
    $DUR = ( $$EndTime - $$StartTime ) / 1000
] ;
    "Time: " & $DUR & " secs --- Returned: " & ValueCount ( VAR1 ) & " items"
)
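For readers outside FileMaker, the same pattern can be sketched in a few lines of Python: capture a start timestamp, force the result to be fully materialized (the equivalent of copying the whole "FirstNames" field into a Let variable), then capture an end timestamp. This is a generic illustration only; the run_query function is a placeholder for whatever calculation you want to time, not part of the original solution:

```python
import time

def run_query():
    # Placeholder for the calculation being timed (e.g. an SQL query).
    return ["name" + str(i) for i in range(100_000)]

start = time.monotonic()   # analogous to setting $$StartTime
result = run_query()
count = len(result)        # touching the full result mirrors copying the field
end = time.monotonic()     # analogous to setting $$EndTime

duration = end - start     # seconds, like $DUR
print("Time: {:.3f} secs --- Returned: {} items".format(duration, count))
```

As in the FileMaker version, the key point is that the end timestamp is taken only after the result has actually been consumed, so lazy or deferred evaluation cannot make the measurement look faster than it really is.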
Keep in mind that the results you get can be greatly affected by caching: if you run the same test twice, the second run will take far less time. In my case, the initial query took 33 seconds, and repeating the same query took less than half a second.
Hopefully other people can use this technique, and if anyone has ideas on making it easier, better or more accurate, please let me know!
Just a quick word of caution:
The solution I described above is fine for millisecond-level timing of complex calculations in an isolated test table in your database. It does not give accurate results if more than one calculated field is displayed on the layout, and it is only accurate for the current record.
If I can find a better way to do this in the future, I will come back here and update this post.