7 Replies Latest reply on Mar 4, 2013 9:13 AM by philmodjunk

    Where's the bottleneck?





           I'm wondering if anyone can help me track down what might be making my database so slow. For example, I just exported about 1000 records and it took 16 minutes.

      The database is remotely hosted. I can download the entire database, open the local copy, and run the export in well under 10 minutes, so although I may not have the fastest internet connection in the world, I don't see how that can be the only culprit.

           Also, if I want to run something like "Replace Field Contents" on an entire table (only about 50,000 records) it's something I'd set going and leave overnight whereas on a local copy it's the work of a few minutes. 

           In the past, for very large changes, I've downloaded the database, run the changes and uploaded it again. The problem with this is that I have to make sure nobody changes anything in the remote version while I'm doing it, since uploading would overwrite their changes.

           What I'm wondering is why it is so slow on the remote version. Is FileMaker copying vast chunks of data to the local machine and making changes before copying it all back?

           I'd assume that for, say, the Replace Field Contents, the step would simply run on the remote machine in the same way that it runs on my local one - that is, all that would need to be sent over the connection would be the name of the field and the update to make.

           Could it be that we've just got a very slow hosting service? Perhaps they're just hosting the database on an underpowered machine. Is there any way to check these things? 

           Anyway, I'd be grateful for any pointers folk can offer on where I might look.


        • 1. Re: Where's the bottleneck?

                    Is filemaker copying vast chunks of data to the local machine and making changes before copying it all back?

               My guess is that this is indeed the issue.

               You might try setting up a server schedule to run your operations as a script, then use the Admin Console to trigger that schedule when you need to perform the operation. A script performed by a server schedule runs entirely on the server.
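
               As a rough sketch of that idea (the table and field names here are hypothetical, not from the actual file), a script saved in the hosted file for a server schedule to run might look like:

               ```
               # Hypothetical script saved in the hosted file, run by a server schedule.
               # All processing happens on the host, so no data crosses the network.
               Go to Layout [ "Orders" ]                          # layout/table name assumed
               Show All Records
               Replace Field Contents [ Orders::StatusFlag ;
                   Replace with calculation: "Processed" ] [ No dialog ]
               ```

               The key point is where the script runs, not what it contains: the same steps triggered from a client would pull data across the network.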

          • 2. Re: Where's the bottleneck?

                 First of all, thanks for that Phil - as always.

                 However, I'm not sure I understand how to do what you're suggesting. To take the most common example, once or twice a week we want to run an export of all records that have changed since the last export. To this end, there is a script which does a search for all records where the modified date/time is after the date/time stored for the last export. The found set is then exported to the local machine. This sometimes takes hours to complete if we've updated a lot of records.
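
             For reference, the export script described above might look roughly like this as FileMaker script steps (the table, field, and file names are my own guesses, not from the actual file):

             ```
             # Hypothetical incremental-export script
             Set Variable [ $last ; Value: Settings::LastExportTimestamp ]   # stored last-export time (assumed field)
             Go to Layout [ "Contacts" ]                                     # target table assumed
             Enter Find Mode [ Pause: Off ]
             Set Field [ Contacts::ModifiedTimestamp ; ">" & $last ]
             Perform Find []
             Export Records [ With dialog: Off ; "changed_records.csv" ]     # path is relative to whichever machine runs the script
             Set Field [ Settings::LastExportTimestamp ; Get ( CurrentTimestamp ) ]
             ```

             Run as a client script, Export Records writes to the client machine; run from a server schedule, it writes on the server.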

            If I created a server script, would the file be exported to somewhere on the server? I believe we have a certain amount of storage space along with the hosting so that's probably not a problem.

            I'm not clear how to run the script on the server though. I can't find the Admin console - could you point me in the right direction, please?

            • 3. Re: Where's the bottleneck?

                   The Admin Console is the Java application that controls FileMaker Server. Your hosting company should grant you access to the Admin Console to administer your database, I would think. You'll need to contact them for technical assistance on that.

                   Yes, you would need to set up a system where you generate the export to a location local to the server and then download that file to your local machine if you are keeping this data as a backup. It may be simpler, though, to just back up your entire database file on the server and then download the backup copy to your local machine.

              • 4. Re: Where's the bottleneck?

                     OK, I'll ask the hosting company about that.

                     The weekly export isn't for a backup. Rather, we send the changes on to several places, for example, our website.

                     I gather that in our UK office, the guy who does it already logs in to the hosting company's server and downloads a copy to run the export from, as it's so much quicker. The problem is when he forgets to update the export date on the live version; the next time he does it, he gets all those exported records again. It would be nice to automate it as far as possible.

                     I think I'm struggling with the client/server split in FileMaker - there doesn't seem to be any reason for a Replace Field Contents to have anything to do with the local machine. Or am I missing something? I suppose it depends on which machine maintains the found set that I'm updating. If that's local, perhaps the replace has to be local too.

                     Once again, many thanks.

                • 5. Re: Where's the bottleneck?

                            The weekly export isn't for a backup.

                       OK, but copying a backup file to your local computer in order to export this data may still be a valid way to accomplish this, and much faster than using Export Records to pull the data down to your local computer, as you have already found out.


                            I think I'm struggling with the client/server split in Filemaker

                       That's not something I can discuss with any real authority, as I am just a fellow user of this product. From what I do understand, FileMaker is not a "thin client" by any stretch of the imagination, and a great deal of the processing is done on the local machine. I do know that scripts performed by schedule via FileMaker Server's admin application run from a "host" context, so all of that processing takes place on the server.

                       I don't know if your hosting service can support this, but a server schedule can be set up to automatically export the data, so long as you export it to one of the few locations permitted for scheduled script exports. An OS-based task scheduler can then run a system script, such as a VBScript, to move that data to a remote location, or you can perhaps access that folder directly to download the exported data onto your local machine.
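
                       A server-scheduled version of the export might look like the sketch below. Note that server-side scripts can only write to a few permitted locations, such as the server's Documents and temporary folders; the table, field, and file names here are assumptions:

                       ```
                       # Hypothetical server-scheduled export script
                       Set Variable [ $path ; Value: Get ( DocumentsPath ) & "changed_records.csv" ]
                       Go to Layout [ "Contacts" ]                     # target table assumed
                       Enter Find Mode [ Pause: Off ]
                       Set Field [ Contacts::ModifiedTimestamp ; ">" & Settings::LastExportTimestamp ]
                       Perform Find []
                       Export Records [ With dialog: Off ; "$path" ]   # lands in the server's Documents folder
                       ```

                       The exported file then sits on the server, ready to be moved or downloaded by whatever mechanism the host allows.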

                       See this Knowledge Base article on server-scheduled scripts that import or export data: http://help.filemaker.com/app/answers/detail/a_id/7035/kw/server%20import%20records

                       Please note that Replace Field Contents does more than update the field: if the field is indexed, it also triggers an update to the index, and an index's size grows greatly with both the number of records in the table and the number of different values in that field.

                  • 6. Re: Where's the bottleneck?

                         Sounds like I've got a few questions for the host company. They've been pretty good in the past and are generally friendly and helpful.

                         Thanks again.

                    • 7. Re: Where's the bottleneck?

                           Please also note that if a user is editing a record that Replace Field Contents is attempting to modify, the user's "edit lock" on that record will prevent the record from being modified by your script. This will return an error code that you can test for, but it will not tell you which record or records were locked in this fashion.
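
                           One way to detect that situation in a script is to check Get ( LastError ) immediately after the replace. Error 201 ("field cannot be modified") is the code commonly associated with locked records being skipped, though it's worth confirming against FileMaker's error-code reference; the table and field names below are hypothetical:

                           ```
                           # Hypothetical lock-aware replace
                           Replace Field Contents [ Orders::StatusFlag ;
                               Replace with calculation: "Processed" ] [ No dialog ]
                           If [ Get ( LastError ) = 201 ]   # 201: field cannot be modified (a locked record was skipped)
                               Show Custom Dialog [ "Warning" ; "Some records were locked and not updated." ]
                           End If
                           ```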