This is starting to become a point of frustration with FM. I've noticed that a large database (multiple tables with 10 million records or more each) usually has devastating performance issues.
I'm running a script on 9 million records that looks for duplicates in a specific column, marks a flag column with "1", and then performs a find on that flag column for records with a 1 in it.
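For anyone curious, the logic is roughly the following (a minimal Python sketch of what my FileMaker script does record-by-record, not the actual script; field names are made up). The point is that a single counting pass over the data is linear, whereas a per-record duplicate search is what tends to blow up on millions of rows:

```python
from collections import Counter

def flag_duplicates(records, key):
    """Set flag=1 on every record whose value in `key` appears more than once."""
    # One pass to count values, one pass to flag: O(n) overall.
    counts = Counter(r[key] for r in records)
    for r in records:
        r["flag"] = 1 if counts[r[key]] > 1 else 0
    return records

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": "b@x.com"},
    {"id": 3, "email": "a@x.com"},
]
flagged = flag_duplicates(rows, "email")
dupes = [r["id"] for r in flagged if r["flag"] == 1]
print(dupes)  # → [1, 3]
```

In FM my script loops through records and searches for each value, which I suspect is closer to quadratic behavior on 9 million records.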
I let it run from yesterday morning, almost 24 hours. When I checked on it today, FM was in a "Not Responding" state with memory usage stuck at a static 1,163,454 K, and it remained unresponsive for well over an hour before it became responsive again.
The FM server was recently upgraded with a larger SSD and more RAM. I wouldn't think such a simple script would take this long to process or cause FM to become unresponsive. I've noticed FM becomes unresponsive with other tasks as well: large imports, complex searches, etc.
I know this is a general question, but is this a known issue with FM when dealing with a very large DB?