I don't know for sure, but the slowness probably has to do with how FileMaker updates its field indexes as each record is deleted.
Regularly deleting and recreating over 300,000 records seems like a very unusual database design approach for FileMaker. Perhaps there's a different approach that would avoid the need for this.
Oh, this is unusual usage for this database, and it won't happen regularly outside of development cycles that hose the main data set when I make a mistake. It would still be nice to know how to delete faster, though.
Try the Save a Copy As command with the clone (no records) option. This saves a copy of your database with all tables empty. Even if you only want a few tables emptied, you may find it quicker to save a clone and then import the data you want to keep (hopefully from the tables with fewer records) back into the clone. The import can of course be scripted, so a single script can pull data from multiple tables if needed.
You might also consider the data separation model, which keeps interface and data in separate files, if you don't already use it; that way a bad development cycle only costs you the data file.