First, let's make sure you are looking at the FileMaker process's CPU usage, not the overall machine CPU usage.
Second, are you running a standalone solution? (Only your FMP client opening the file, not served from a host.)
If it is standalone, it shouldn't be possible to get only 13% on the FileMaker process, because you have an SSD.
So in that case, the 13% must be the overall CPU usage of the machine.
If you run the solution from a host, then it's a networking issue (use a wired Ethernet LAN, not Wi-Fi).
You could speed things up by a factor of two by running the client on the same machine as the server and accessing it via the 127.0.0.1 loopback address.
Of course, if you run a hosted solution, there may be other things happening on the server that hinder it, like other clients locking records.
FileMaker is hosted on my local computer. Nothing else is even using 1% of my CPU.
Below are the stats for FileMaker Pro.exe from Task Manager.
FileMaker Pro.exe: CPU 08%, Memory 247,420 K
I would look at optimizing the process first instead of blaming FileMaker.
It must be a complex process operating on a very large dataset to take 17+ hours.
I'm a Mac guy, so I don't know what the Task Manager number means: is it relative to overall machine capacity, or the absolute CPU usage of the process?
I once saw very low CPU usage while rapidly importing many small files, but that's a particular case.
Low CPU usage comes from I/O constraints, or from waiting on another external process (which is a kind of I/O).
What does your script do? Imports? From files, from SQL?
The script takes a list of 5,000 keywords (all wildcards) and scans them against several fields within my 4.7 million records. If there is a match, the matched word is added to a text field within the record. For example:
*auto* => mikesautobody.com
*cpa* => findacpa.com
*dock* => terrysboatdock.com
The script looks in multiple fields - URL is just an example.
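To make the workload concrete, here is a minimal Python sketch of that kind of matching, assuming a "*kw*" wildcard just means the keyword appears anywhere in the field; the field names and sample data are made up:

```python
# Minimal sketch of the matching task. Assumes "*auto*"-style wildcards
# mean "the keyword appears anywhere in the field". Field names and
# sample data are hypothetical.

keywords = ["auto", "cpa", "dock"]  # the real list has ~5,000 entries

record = {
    "URL": "mikesautobody.com",
    "Company": "Mike's Auto Body",
    "Notes": "collision repair shop",
}

matched = []
for kw in keywords:
    if any(kw in value.lower() for value in record.values()):
        matched.append(kw)  # this is what gets written to the text field

print(matched)  # ['auto']
```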
Let's say that 'several fields' is at least 3, so that's 3 × 5,000 × 4,700,000 = 70,500,000,000 find queries, before any write operations. Seventy billion, five hundred million.
At 0.1 second per request that's about 224 years. At 1 second per request it would be 2,200+ years.
You need a different computing platform that can do things in parallel.
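For anyone who wants to verify the arithmetic, a quick back-of-the-envelope check in Python:

```python
# Back-of-the-envelope check of the estimate above.
fields = 3
keywords = 5_000
records = 4_700_000

queries = fields * keywords * records
print(f"{queries:,} queries")            # 70,500,000,000 queries

seconds_per_year = 365 * 24 * 3600
print(queries * 0.1 / seconds_per_year)  # ~224 years at 0.1 s each
print(queries * 1.0 / seconds_per_year)  # ~2,236 years at 1 s each
```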
Yeah, that's going to be a problem... I'm pretty sure my data will not be valid in 2200 years.
I'm not a developer so I would need to ask, but is it somewhat easy to create a script that does things in parallel, or is that a big overhaul?
From under my rock, I do understand the OP's original pain: launching a complex script on an SSD machine and seeing very low CPU activity coupled with low or moderate I/O to the data container.
I would expect everything to light up like a Christmas tree, the fans to go mad, things like that.
Instead, I have quite often seen the same situation the OP reported in the initial post, and I wonder if FileMaker's handling of cached data is the missing piece of the picture.
In a nutshell, the feedback we get from monitoring hardware resources is not consistent with the workload we threw at the machine via FileMaker.
Wim has a point, but what interests me, and what is not normal, is the low CPU usage. That should be explained: a computing process runs at 100% unless it is constrained by I/O.
@osensnolf: when you say hosted on your computer, do you mean you've installed FileMaker Server on it, or no server at all, just the FMPA client opening its own file?
On optimizing your process:
Rather than accessing several fields millions of times, I'd export all your records, with the RecordID plus all the fields you need to search, into a text file, then run a command-line tool such as sed or grep 5,000 times (once per keyword) and capture the line numbers of the matching rows.
After that I'd have a table like:
keyword 1 matches lines 12, 456, 890, 7890, 3423
keyword 2 matches lines 12, 656, 1890, 990
Then I'd get the RecordIDs of the matching rows,
and, if I needed to know which field matched which keyword, I'd go straight to those RecordIDs in FileMaker and re-search only those fields for the keywords.
So you'd be doing FileMaker searches only against the matching records.
But you could also do that part on the command line if you separate each record's fields with a separator such as a tab.
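A rough Python sketch of that export-then-scan idea, assuming a tab-separated export with the RecordID in the first column (the file names and column layout are assumptions, not anything FileMaker produces by default):

```python
# Sketch of the export-then-scan pipeline. Assumes "export.tab" is a
# tab-separated export with RecordID in the first column, followed by
# the fields to search; "keywords.txt" holds one keyword per line.
# All file names and the column layout are assumptions.

from collections import defaultdict

with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip().lower() for line in f if line.strip()]

matches = defaultdict(list)  # keyword -> matching RecordIDs

with open("export.tab", encoding="utf-8") as f:
    for line in f:
        record_id, _, rest = line.partition("\t")
        haystack = rest.lower()
        for kw in keywords:
            if kw in haystack:  # "*kw*" wildcard == substring test
                matches[kw].append(record_id)

for kw, ids in matches.items():
    print(kw, "matches RecordIDs", ", ".join(ids))
```

Note this still performs 5,000 substring tests per record; what it avoids is FileMaker's per-find overhead.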
That's only one option.
You could also use ExecuteSQL to pull all your record content into a variable and then iterate over that variable, though that will be much slower, because FileMaker is not good at this.
The file is on my computer, which runs FileMaker Pro (latest version).
Wow… I just stopped it and it made it through 600 records. That’s not going to work.
Would it be possible to get your file?
Agreed with Vincent, a different approach is needed. Things like VBScript or PowerShell on Windows are much faster at processing text, though even they may not be fast enough, depending on how fast you need this to be.
The trick is going to be in limiting the number of operations.
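For instance, one way to limit the number of operations is to fold all 5,000 keywords into a single compiled pattern, so each record is scanned once rather than 5,000 times. A Python sketch, reusing the assumed files from the sketch above:

```python
# Sketch of "limit the number of operations": combine every keyword
# into one alternation regex so each exported line is scanned in a
# single pass. File names and layout are the same assumptions as above.

import re

with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

# Longest keywords first, so they win inside the alternation; escape
# each one in case it contains regex metacharacters.
keywords.sort(key=len, reverse=True)
pattern = re.compile("|".join(map(re.escape, keywords)), re.IGNORECASE)

with open("export.tab", encoding="utf-8") as f:
    for line in f:
        record_id, _, rest = line.partition("\t")
        hits = {m.lower() for m in pattern.findall(rest)}
        if hits:
            print(record_id, sorted(hits))
```

A rough command-line equivalent is `grep -n -F -f keywords.txt export.tab`, which matches all the fixed strings in one pass over the file (though it reports matching lines, not which keyword hit).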
Of course it won't solve your case, but maybe it would explain the low CPU usage: are you using Freeze Window and a blank form layout, which every FileMaker developer has to use in any script to get less than glacial speed?
I wish I could answer your question, but I'm not sure I know the answer. All of the pages and forms I use were customized by my developer.