Your question could get into deeper issues, such as how many variables you have and how big they are (they could hold an incredibly long array, for example). In my experience I have never been aware of any performance issues with global variables, and I believe that is because they exist in RAM on the client machine for fast access and are not stored on the server. Performance problems are much more likely to be caused by unindexed fields and unstored calculation fields. Anything read from the database will be slower than reading from RAM. In fact, if you have data you need to use over and over, you would probably get better performance by storing it in a global variable for recall, assuming the data is not changing. This can be especially beneficial when the data comes from an unstored calculation.
Brief summary: global variables usually have a negligible impact on performance and can even improve it. There is probably a limit to how many you can have and how big they can be, but if you are only talking about dozens of variables, that is not an issue.
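As a minimal sketch of that caching idea (the table, field, and variable names here are hypothetical), a startup script could resolve a slow unstored calculation once per session:

```
# Run once at startup: resolve the unstored calc a single time
Set Variable [ $$TaxRate ; Value: Preferences::cUnstoredTaxRate ]

# Later, any calculation can read the cached copy from RAM, e.g.
#   Round ( LineItems::Amount * $$TaxRate ; 2 )
# instead of re-resolving the unstored calculation every time.
```

This only pays off for values that stay stable for the session; if the underlying data changes, the cached $$variable goes stale until you refresh it.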
I personally would use more global variables as arrays if FileMaker had more and better functions for manipulating arrays.
Your explanation fits my thinking/assumptions on the topic. I usually advise my clients to ramp up the FileMaker RAM allocation preference for this reason. I've just never come across any technical reference about it.
As to my applications, both tables mentioned above, Parameters and Tooltips, are handled by CFs that reference 2 global variables: (A) something like $$tt_reference, which is a long Case statement that renders the possible CF parameters as array variations, and (B) something like $$tt which delivers the tooltip (or parameter) to the interface.
You really increase your functionality when you learn how to use global variables and arrays effectively. Reading $$tt directly is the most efficient way to do it, even though I always find myself defaulting to GetValue ( $$tt ; 7 ). When there are two ways to do the same thing, it makes me wonder whether one is faster than the other. I'll have to write a loop that does each one a hundred thousand times and compare - haha. Is that not OCD of me or what? <grin>.
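For anyone following along, the two approaches look like this (the list contents are made up for illustration): $$tt holds a return-delimited list, and GetValue pulls one element out of it.

```
# Populate once, e.g. at startup
Set Variable [ $$tt ; Value: List ( "Go to contact" ; "Edit contact" ; "Delete contact" ) ]

# Read the whole variable directly (cheapest):
$$tt

# Or pull a single element from the return-delimited list:
GetValue ( $$tt ; 2 )   // "Edit contact"
```

Either way the data is already in RAM, so the difference between the two reads should be tiny compared to fetching the same value from the server.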
As Taylor has indicated, global variables should give you a performance boost, although the boost may be small. Once the information is in the variable, it is in memory, so there is no disk I/O to retrieve it when needed.
Another advantage to the developer is that you can always see the current value in the Data Viewer by looking at the Current Tab.
The only issue that I am aware of is:
- It only works in one file, not across files in a multi-file solution. $ is local to the script, $$ is local to the file ... the nonexistent $$$ would be session-wide. But a $$$ variable could easily open a Pandora's box of problems (example: you start two solutions and both have a $$$ variable named $$$thesamename).
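To illustrate the two scopes that do exist (variable names here are arbitrary):

```
Set Variable [ $counter ; Value: 1 ]          # $  : visible only inside the running script
Set Variable [ $$appState ; Value: "ready" ]  # $$ : visible anywhere in this file, for this session

# There is no $$$ scope shared across files; each file has its own $$variables.
```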
If you need to share a global variable across files ... put it in a global field. :-)
Alternatively, I've pushed the value as a parameter to the file that needs the value. You make a good point though about being aware that global variables are specific to the file.
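A sketch of that parameter approach (the script and file names are made up):

```
# In the calling file:
Perform Script [ "Receive Setting" from file: "Invoices" ; Parameter: $$tt_reference ]

# In the "Receive Setting" script inside Invoices:
Set Variable [ $$tt_reference ; Value: Get ( ScriptParameter ) ]
```

The receiving script copies the parameter into its own file's $$variable, so both files end up with the same value without needing a shared global field.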
Remember also that global (or session) variables exist only on the local computer that has the session open, so the impact is limited to that computer and its RAM capacity. It's hard to imagine a large number of variables eating much RAM.
However, the calcs used to establish your $$variables may require extensive caching of records from one or more tables from the server to the local machine before the calc can be resolved. Such record caching may take some time, which will slow the script that sets the $$variables, and that can make even the setting of variables seem to slow down the initial file setup routines.
However, once set, $$variable performance impact should be no greater than a global field, and probably faster to read since it's in RAM when needed.
I think having the variables set when the file opens covers most of this. The catch is when the value is needed while running a script in another related file. There the solution is to pass the value as a parameter to the Perform Script step.
There's also a security consideration when comparing global variables to global fields. You can use FileMaker's accounts and privileges to control a user's ability to modify the contents of a global field; you don't have that control over global variables. For example, anyone with FMPA can open the Data Viewer and use a Let() statement to change the value of a global variable for the current session.
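For example, evaluating a one-line expression in the Data Viewer's Watch tab is enough to overwrite a global variable (the variable name here is hypothetical):

```
Let ( $$sessionPrivilege = "Admin" ; "" )
```

Since any check based on $$sessionPrivilege could be defeated this way, sensitive gating belongs in privilege sets and fields, not in variables.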
on 2012-03-27 7:25 FCallanan wrote
I serve client databases over the network from my FMS 11. In a couple of solutions, I've experimented with declaring several dozen global variables at startup.
For example, I've used a Parameters table for long parameter strings so I can name and recall them easily (e.g., "pram=contact_edit"). Similarly, a Tooltips table allows easy recall of context-sensitive tooltips (e.g., "tt=conatct_go"). My scheme for this results in many global $$variables.
Can anyone speak to the technical load that dozens of global variables place on performance?
I think global script variables are a good tool for maintaining the "state" of a solution.
Not a performance issue, but note that using global script variables for parameters (if I understand what you mean by that) means you need to avoid collisions, where two scripts use the same variable name or where a recursive script needs its parameters to be local. If needed, you could solve this by maintaining the global variable as a stack, though it would probably be easier to pass the values to scripts as actual parameters.
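A rough sketch of that stack idea, keeping a return-delimited list in one global variable (names are hypothetical, and this assumes the values themselves contain no return characters):

```
# Push: prepend the new value onto the front of the list
Set Variable [ $$paramStack ; Value: List ( $newParam ; $$paramStack ) ]

# Peek: read the top of the stack
Set Variable [ $current ; Value: GetValue ( $$paramStack ; 1 ) ]

# Pop: drop the first value and keep the rest
Set Variable [ $$paramStack ;
  Value: RightValues ( $$paramStack ; ValueCount ( $$paramStack ) - 1 ) ]
```

Each recursive call pushes its own parameter and pops it on the way out, so nested calls stop clobbering one shared $$variable.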
Right on, Rob. That global variables are outside the FM security model is a key consideration.
One issue that I have seen has to do with storing sensitive or privilege controlled data in a global variable ( or a local variable that is set via the Let () function ).
I have helped test a couple of solutions that were populating global variables with information needed by various features, information that was supposed to be locked down so low-level users couldn't access it. However, the developers failed to keep in mind the security risks of keeping confidential and sensitive data in variables, which are accessible via the Data Viewer.
If the data/values are not privileged info, then you don't need to concern yourself with this. It's not really a performance thing, but it can have a big impact on how you use them.