Scripts that run reasonably fast on a local computer with locally stored files can be slow when used via a server across a network. The culprits are, as you mention, unstored values - typically calculations involving globals or related fields, which cannot be indexed.
This can be solved by using stored values - e.g. by using script triggers or auto-enter values instead of ordinary calculations.
Very often this is possible without changing very much.
Are you used to using FileMaker Server?
If not, having somebody with server experience helping you might be the way to go if you need this solved first.
Your performance problem could also be related to hardware and network infrastructure. Your client and server should be on a reliable network with a low round-trip time (no slow firewalls or bad network components in between), and the bandwidth should be at least 100 Mbit up/down. A gigabit network would do fine as well and is becoming the standard for local networks.
Hope this helps.
Thank you for your reply. I will look into opportunities to use auto-enter values, but it is sort of an inventory solution, and the looping script depends on updating, even within the script. I wonder if the commit step would actually slow it down again.
I am comfortable with Server, but have no control over it really, since it is at a client's place. I will check with the administrator again after the holidays to see if we can do some speed testing.
Since it was stopped after 15 minutes, you don't really know whether it was just slow or whether it was hanging, right? Besides the network issues mentioned by Carsten, I would also mention:
1) Open the script and, below left in Show Compatibility, select Server. Then look for any grayed out script steps in your script.
2) Ask for a copy of the server logs. They can provide a lot of information about what happened.
3) Are you using a separate data file, and if so, how is the file referenced? The reference should be the file name only, not a full path.
4) Does the script perform any imports or exports, and if so, where are the external sources located?
Running a served file is simply an eye-opener. Even with top-of-the-line networks and equipment, the difference in speed is surprising to most.
The fact that you experience a significant delay when you go from local to hosted is the red flag. Under most circumstances, this is network related. Not necessarily because your network is slow (although that can be the case), but because of some "under the hood" behavior FileMaker has that can hurt you if you're not aware of it.
FileMaker's client / server model is very "record-centric". When a client requests a record from the server, it gets it. ALL of it - meaning all fields on that record, whether they're currently needed or not (with a few exceptions like container fields, unstored calculations, and ESS fields that haven't been displayed). But it's not just one record at a time. When the client navigates to a layout, it will request a set of records from the server. In Form View, it'll ask for 25 records. In List or Table view, it'll ask for as many records as can be displayed (dependent on the configuration of the layout, screen size, etc.). Additional records will be requested as needed.
In your case, the first thing that jumps out at me is your reference to bulk replaces. There are some operations that require FileMaker to load the entire found set of records from the server to the client over the network. That set includes sorts, imports, exports, and - you guessed it - Replace. This is most likely at least one of your bottlenecks.
Perhaps you can take a look at your table structure. One thing to look for is how many fields are in a given table. If you have more than, say, 3 dozen or so, your structure might need some work. If you have this situation, you can dramatically improve performance by splitting your table into a "maintenance" table and a "normal" table. Using a one-to-one relationship, you can put just the fields you need for the maintenance operation into one table, and all the other stuff into a second table. This will allow FileMaker to load just the stuff for the maintenance run over the network, which will dramatically improve its speed.
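FileMaker doesn't expose this split as SQL, but the one-to-one idea can be sketched in generic relational terms. The sketch below uses Python and SQLite purely as an illustration; all table and field names are hypothetical:

```python
# Sketch (not FileMaker): the "maintenance table" / "normal table" split.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Narrow "maintenance" table: only the fields the batch run needs.
cur.execute("CREATE TABLE item_maint (item_id INTEGER PRIMARY KEY, qty INTEGER, rank INTEGER)")
# Wide "normal" table: everything else, joined one-to-one on item_id.
cur.execute("CREATE TABLE item_detail (item_id INTEGER PRIMARY KEY, name TEXT, notes TEXT, photo_path TEXT)")

cur.execute("INSERT INTO item_maint VALUES (1, 10, 0), (2, 5, 0)")
cur.execute("INSERT INTO item_detail VALUES (1, 'Widget', 'long notes...', 'a.png'), (2, 'Gadget', '', 'b.png')")

# The maintenance pass touches only the narrow table, so each record
# pulled across the wire carries far less data.
cur.execute("UPDATE item_maint SET rank = qty * 2")
print(cur.execute("SELECT item_id, rank FROM item_maint ORDER BY item_id").fetchall())
# -> [(1, 20), (2, 10)]
```

The wide `item_detail` fields never travel over the network during the maintenance run; that is the whole point of the split.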
Other things to watch out for:
1) Graphics on layouts: Make them lean (few colors, relatively low resolution - PNG or JPG).
2) Avoid distinct graphic objects, which each have to be loaded separately. Reuse common graphic objects across multiple layouts so the client only has to load them once.
3) If you're doing an operation that requires looping, use a layout that doesn't have any fields on it. Nothing to refresh, faster operation.
Those are just some things that pop up off the top of my head.
Thank you for your reply to my post.
It is true that I do not know for certain whether it was hanging or just running slowly. However, I examined the data, and it did look like the script was working at least through its beginning, and the affected data responded to the "undo" script with complete success - no errors, and the undo data-massage script returned the data to its original pre-script arrangement. The undo script is very important, because each record follows the trail of the "do" script and is manipulated at least once, if not several times.
1) I thought about the Server Compatibility script filter also, but with a little research in the help, I came to the conclusion that the filter is for server-side scripting. I think there is a distinction between scripting for databases hosted on Server and scripts that are run by a schedule on Server; the latter is what the filter is for. In any event, all the script steps are fine except at the end, when the dialog box pops up to indicate the process is complete. That would be an OK time to stop, but it didn't get that far anyway.
2) Good idea about the server logs! I'll check that out when they get back from vacation.
3) Not using the data separation model on this one, but it is interesting that the file name only would be preferable to the full path. Hmm.
4) No imports or exports.
eye opener - ok!
Well, the client did report the routine activities they were doing were going well - and they have a lot of them, so that is encouraging.
Thank you again for your suggestions LaRetta!
My apology, Laura; I assumed that such a process-intensive script would be run server-side, and probably at night. Anyway, good luck on working it out.
Thank you for your response to my post.
I do have some indication that the client's network is not the super fastest, and I will be checking with IT to notify them about the need for speed on these connections.
It is an excellent suggestion about limiting the fields per table. Also important about the graphics, and using a blank layout for looping scripts.
I am using a join table for this, so there are very few fields in it. However, I wonder whether the related fields on the layout - which are searched on and also get Set Field script steps - pull the whole related record when accessed!! Can this be known? That would certainly be a drag on the system. If that were the case, it would really be a pain to make the data local to the join table; I would have to collect the info from the related table, make the record-level change, write it back to the related table, then call it again later and do the same thing over and over. That would take overhead too. Unless I suddenly became super smart and saw a way to reconfigure everything....
The way this script works is to alter a number in the related parent table each time a related join record is altered. The number in the parent record is used to qualify the related join records, by ranking, for more or less altering. So each time a join record is altered, the rank on the parent record changes, and the next qualifying round of the loop depends on that ranking number having been updated in the parent record immediately after the join record is altered.
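If it helps to see the shape of that loop outside FileMaker, here is a rough Python sketch of the logic just described. The field names, the rank threshold, and the update rule are all made up for illustration - only the structure (alter a join record, immediately update the parent's rank, re-qualify on that rank next pass) mirrors the description above:

```python
# Hypothetical sketch: each change to a join record immediately bumps a
# "rank" number on its parent; the next pass qualifies join records by
# that freshly updated rank.
parents = {"P1": {"rank": 0}, "P2": {"rank": 0}}
joins = [
    {"id": 1, "parent": "P1", "qty": 3, "done": False},
    {"id": 2, "parent": "P2", "qty": 1, "done": False},
    {"id": 3, "parent": "P1", "qty": 2, "done": False},
]

def qualifies(j):
    # The qualifying test reads the parent rank, so it depends on the
    # previous iteration's update having been committed already.
    return not j["done"] and parents[j["parent"]]["rank"] < 5

changed = True
while changed:
    changed = False
    for j in joins:
        if qualifies(j):
            j["done"] = True                           # alter the join record
            parents[j["parent"]]["rank"] += j["qty"]   # update parent rank at once
            changed = True

print(parents)  # -> {'P1': {'rank': 5}, 'P2': {'rank': 1}}
```

In a hosted file, every one of those parent-rank writes is a round trip to the server, which is why this style of loop is so sensitive to network latency.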
I will definitely see about the blank layout for the looping. That should be a powerful improvement right there!
Thank you again, Mike.
Right LaRetta. That would likely be the case, and would likely hang it if it weren't addressed, so I understand why you suggested that now.
In this case, the user needs to step in in the middle of it all, make some human tweaks, and then run the rest of it, so it cannot be done that way this time.
Sorry I didn't think of that.
Thank you again!
What is with these Helpful and Correct buttons? Sometimes they show up and sometimes they don't! All the answers are correct and helpful, but I am limited in marking them by which buttons show up!! Sorry if I am not marking your suggestions the way they deserve. I am going with what is available.
To answer your first question, searches on related fields are not a drain on performance, provided the related field can be indexed. They are executed as fast as searches on fields in the same table.
To answer your second question, yes, the entire related record is fetched when needed. That includes when it's displayed, and not just the fields that are present on the layout, but ALL the fields in that record. If you're using a portal, then ALL the fields in ALL the related records that are currently displayed in the portal will be fetched.
(This is an argument for minimizing the number of fields in every table - using tools like Conditional Formatting to move business logic out of the data layer and into the interface.)
However, another thing to look at is dependencies. How many layers of calculation are you asking FileMaker to process? For example, say I have a number field in a table. I then define a calculation field in that table that's based on that number field. Then I define a lookup based on that calculation field. Then I create a Conditional Formatting setup based on that lookup.
Now, say my script updates the number. Watch the fun ...
The number is updated ... which causes the calculation to update ... which causes the lookup to trigger ... which causes the Conditional Formatting calc to trigger (assuming it's being displayed). Each of these steps has to wait on the previous step to fire, so they can't be processed concurrently. This will hurt performance.
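As a toy model of that cascade (plain Python rather than FileMaker, with made-up values), each layer can only recompute after the one before it finishes:

```python
# Toy model: one field update forces three dependent recomputations,
# strictly in order - the total cost is the sum of the layers.
def calc(number):
    # Layer 1: calculation field based on the number field.
    return number * 2

def lookup(calc_value):
    # Layer 2: lookup triggered by the calculation's new value.
    return {2: "low", 20: "high"}.get(calc_value, "mid")

def cond_format(lookup_value):
    # Layer 3: Conditional Formatting calc based on the lookup.
    return "red" if lookup_value == "high" else "black"

number = 10  # the script updates the number ... watch the fun
result = cond_format(lookup(calc(number)))
print(result)  # -> 'red'
```

None of the three calls can run concurrently, because each one's input is the previous one's output; that is exactly the serial dependency chain described above.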
I can't tell from your description whether this is what's going on, but it looks like you have A depending on B, which may or may not depend on C. At least it's something to look at.
Use all of the advice that Mike Mitchell gave. But also: things run faster as a server-side script than as a script a client runs against the server. Most end users are not going to know how to schedule a server-side script in the Admin Console to run in the background. What you can do is set up a table that a server script checks every minute, looking for any new entries; when it finds one, that entry tells it which script to run. In this way, you let a FileMaker client schedule batch runs on the server without tying up the client machine. You can have a layout show when the job is done, or have it send an email, text message, or whatever alert to let them know it is finished. Things will run much faster on the server than on a client connected to the server, and you get the added benefit of not locking up your client machine while the script runs.
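The queue-table pattern described there can be sketched in generic terms. The sketch below uses Python and SQLite in place of FileMaker, and all table, field, and script names are hypothetical:

```python
# Sketch of the queue-table pattern: clients enqueue job records; a
# scheduled server-side job polls for them and runs the named script.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE job_queue (id INTEGER PRIMARY KEY, script TEXT, status TEXT DEFAULT 'new')")

def enqueue(script_name):
    # The FileMaker client just creates a record - nothing runs locally.
    con.execute("INSERT INTO job_queue (script) VALUES (?)", (script_name,))

def poll_once():
    # The server schedule runs this every minute.
    rows = con.execute("SELECT id, script FROM job_queue WHERE status = 'new'").fetchall()
    ran = []
    for job_id, script in rows:
        # ... here the server would perform the named script ...
        ran.append(script)
        con.execute("UPDATE job_queue SET status = 'done' WHERE id = ?", (job_id,))
    return ran

enqueue("Nightly Rerank")
print(poll_once())  # -> ['Nightly Rerank']
print(poll_once())  # -> []
```

Marking each job 'done' as it runs is what keeps the next poll from picking it up again; in FileMaker you would do the same with a status field on the queue record.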
Thank you for that suggestion. This script is run in three parts, with human intervention in between one of the parts (client requirement) 6 times a year, after a bunch of data entry. While scheduling the script has to rely on users deciding it is time to run it, it would be great to get it off the client and onto the Server side! I will see if a Server schedule of this script could be triggered by the designated db admin onsite upon request, or maybe I could do it if they allowed access with FMAdmin. That way, all that back and forth through the network could be eliminated. It would look much better to the client, too.
I definitely prefer the suggestion from user7471, Taylor Sharpe: letting the client change some data and having that change trigger the server script. Trying to see how I can make that work for this.
This has been a wonderfully enlightening discussion. Thank you!
Reply to first question - Great! So you are saying that indexed related fields will not pull all the data from the related record the way unindexed fields will. Therefore, my join-table location for the script is good; it probably just should be blank, due to the looping refresh issue, and I should look for any unindexed or calc fields and try to minimize them.
I really do try to keep things simple, so I appreciate your multi-layer drag suggestion.
I am intrigued by your comment about having Conditional Formatting carry business logic. Could you give me a brief example of what you mean by that? It can't really display conditional data, can it?