Do you have unstored calculations or summary fields on your layout? Those could be crippling your database, since all the underlying data has to be transferred to the client for evaluation.
And please tell me you're not trying to search 18 million records on an unstored calculated field. Transferring 500k records' worth of data across the network is quite a bit, too.
Probably not a bug, more likely your data and search methodology.
I'm not a developer but I'll try to answer the best I can...
The developer built a feature with a large text box that lets me drop in a list of things I want to search for: a list of first names, emails, URLs, etc. I then select the field I want to search. So, if I am searching for first names, I paste the list of first names I want to look for and select First Name from the drop-down. Each name has == in front of it (==Mike) so the search finds exact matches. I then run the search. The tool searches across all 18M records, checking the First Name field against each of the names in the text box. When it finishes, the found set contains all the matching records.
The search I ran yesterday was on an ID field (again, each entry looked like ==2727 so that I would get an exact match).
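FileMaker builds these as separate find requests rather than running code, but as a rough sketch of what the tool presumably does with the pasted list (the function name and behavior here are my assumptions, not the actual implementation):

```python
def build_exact_match_criteria(pasted_text):
    """Prepend FileMaker's exact-match operator (==) to each
    non-empty line of a pasted list, producing one find
    criterion per value for the selected field."""
    return ["==" + line.strip()
            for line in pasted_text.splitlines()
            if line.strip()]

# Example: a pasted list of IDs, one per line (blank lines ignored)
criteria = build_exact_match_criteria("2727\n2728\n\n2729")
# criteria == ["==2727", "==2728", "==2729"]
```

Each criterion would correspond to one find request on the chosen field, and the found set is the union of all the matches.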
I have done several searches like this in the past and it always takes a long time. The only thing different is that this time I saved the Found Set so that I could (hopefully) more quickly pull the matching records when I need them next.
FileMaker never hung during the search or after the search. It was only after I saved the result, closed it, and opened it again today that I noticed the difference.
I hope that helps.
I mean, that helps explain a little about what your system does, but not much about how it does it. You might need to hire a developer more familiar with handling massive data sets in FileMaker to correct whatever development issues are contributing to the slowness.
Chances are the speed can be greatly improved with some simple work, such as using auto-enter fields instead of calculated ones, and making sure certain fields are always indexed.
I do have a very good developer already and the fields are indexed.
The search does not appear to be the issue (yes, maybe it could be faster); my problem is that FileMaker hangs after saving the very large found set of records.
In the past, I have searched for up to 800k records and worked with the results without any issue. But I had never saved them before.
My guess is that if I go to the backup copy and run the same search, it will complete and FileMaker will work like it should, but once I save the found set, the problem will appear. If I do not save, no issue.
I'm trying to figure out why saving a large found set is causing it to hang.
Try the following:
- create a new database on the desktop
- enter script workspace
- create a new script
- add a Perform Script step; instead of the current file, go select your big file
- don't select a script, just hit Cancel
- now your big file should be selectable in the window menu
- select it and try reaching Records -> Saved Finds -> Edit Saved Finds; if you manage it, nuke 'em all.
UPDATE: I had already started this process before siplus's comment, but I went to Records > Saved Finds > Edit Saved Finds > Delete Saved Find, and instantly my database is working like it should. No hang at all!
So... why is this happening?
Should I just chalk this one up as "can't do that" and move on?
Are saved finds somehow loaded into memory? I'm not sure why deleting one would instantly make my database work like it should again.