The locally hosted file has an intensive script routine that runs about six times a year and acts on a good-sized and growing set of records in a join table. Hosted locally, the client was blown away that it took only about 5 minutes to complete. Recently they moved the file to Server on their network and started the intensive script as usual. After about 15 minutes they got worried and cancelled it, saying it didn't seem to be progressing as fast as they expected. So as not to interrupt their process, I had them pull the file down and run the script locally. I ran the 'undo' script, the local run went as usual, and then they put the file back on Server - all is well. Now I am looking at the code of the 'intensive script' and want to do some testing in light of the new hosting configuration, in preparation for the next time the script has to run. I do not believe it got hung up - the data looked in good shape and responded to the undo quickly and perfectly - but I am going to test it with Script Debugger tomorrow, when I have a better connection to my server. I looked at my local variables and did not see any role for default values.
The script uses global fields, local variables, Set Field steps, finds with multiple requests, multi-step loops, and bulk Replace Field Contents operations. Does anyone have a list of items they tend to watch for when moving from local to Server hosting - especially things that slow down scripts?
The only thing I can see is a few unstored (cannot-be-stored) calculations that the finds hit, which might take a while over the network. I am wondering whether it would be faster to copy the values from those calcs into indexable stored fields before starting the 'big' script than to run the finds against them unstored.
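For the pre-staging idea, a minimal sketch of what I have in mind (table, field, and criteria names below are placeholders, not the actual schema):

```
# Run once before the 'big' script:
Freeze Window
Go to Layout [ "JoinTable" ]
Show All Records
# Copy each unstored calc into a plain stored (indexable) field of the same type
Replace Field Contents [ JoinTable::StatusStored ; JoinTable::cStatusUnstored ]
# ...repeat for each unstored calc the finds depend on...

# Then the finds in the 'big' script target the stored, indexed copies:
Enter Find Mode [ Pause: Off ]
Set Field [ JoinTable::StatusStored ; "Active" ]
Perform Find []
```

The trade-off is that the Replace Field Contents pass still has to evaluate every unstored calc once, but the subsequent finds can use the index on the stored copies instead of evaluating the calcs record by record over the network.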
Is this enough info for you to answer the question?
Just looking for ideas.