One way to do this programmatically would be to use an xDBC approach. Using Java and JDBC, for example, you could read the metadata before each export (if needed) via the DatabaseMetaData interface that's part of JDBC. Using that, you could create a second database and its fields, if needed.
Then, step 2, loop through the FM data and copy from one connection (FMP) to the destination connection (SQL Server, Oracle, MySQL, you name it). Or you can just write to a file using something as basic as a FileWriter.
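A minimal sketch of the write-to-file half of that idea, assuming the rows have already been read from the FileMaker JDBC ResultSet (the class name, sample rows, and file path here are hypothetical stand-ins):

```java
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.StringJoiner;

public class CsvDump {

    // Quote a field if it contains a comma, quote, or newline (RFC 4180 style).
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    // Write rows to a CSV file; in the real job each row would come from
    // rs.getString(i) while looping rs.next() over a "SELECT * FROM myTable"
    // ResultSet on the FileMaker JDBC connection.
    static void writeCsv(Path out, List<String[]> rows) throws IOException {
        try (FileWriter w = new FileWriter(out.toFile())) {
            for (String[] row : rows) {
                StringJoiner line = new StringJoiner(",");
                for (String field : row) line.add(escape(field));
                w.write(line.toString());
                w.write("\n");
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical sample rows standing in for the JDBC loop's output.
        List<String[]> sample = List.of(
            new String[]{"1", "Ada", "Lovelace"},
            new String[]{"2", "O'Brien, Pat", "Smith"}
        );
        Path out = Files.createTempFile("dump", ".csv");
        writeCsv(out, sample);
        System.out.print(Files.readString(out));
    }
}
```

Swapping the FileWriter target for a second JDBC connection (batched INSERTs) follows the same loop shape.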
I do this all the time and FileMaker's JDBC driver is excellent.
Some third-party DB tools, such as Navicat, support multiple databases, and you can literally drag and drop fields between tables of different databases. Navicat has export features as well, but unfortunately it doesn't yet support a generic JDBC connection type.
I'll look forward to seeing the other approaches suggested here.
The export dialog allows you to specify the table (not just the layout), and you can "Move All" fields for the export (whether by layout or by table). You can also export related fields, but for a "dump" into another DB you don't want those.
BTW, "Move All" exports records in the current sort order, if you sort before exporting. Calculated fields can be exported, but the exported value is the value at the time of export. You may also specify formatting as on the layout (if the fields are on the layout) for numbers, dates, and times. This may help you pass a date "3/23/2016" as "2016-03-23" if it has that format on the layout, for example.
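Where layout formatting isn't available, the same date reshaping can be done in code after export; a minimal sketch, assuming US-style M/D/YYYY input (class and method names are hypothetical):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateReformat {

    // Parse a US-style date like "3/23/2016" and emit ISO "2016-03-23",
    // the format SQL back ends generally accept.
    static String toIso(String usDate) {
        LocalDate d = LocalDate.parse(usDate, DateTimeFormatter.ofPattern("M/d/yyyy"));
        return d.format(DateTimeFormatter.ISO_LOCAL_DATE);
    }

    public static void main(String[] args) {
        System.out.println(toIso("3/23/2016")); // prints 2016-03-23
    }
}
```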
Beverly, I appreciate the idea, but I need to do this programmatically, because it will be a repetitive (nightly) task. I am trying to avoid having someone sit in front of a dialog and "Move All" on all the data in our system.
About your SQL script that is slow: did you create a variable from ExecuteSQL and then write it to a file with BE_WriteTextToFile?
It's reasonably quick.
If you did this in Java, outside FileMaker, you could simply schedule the task on whatever system you have (a Windows Scheduled Task, a cron job, etc.).
No user interface. Batch approach. Easy.
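For example, a nightly run could be wired up with a crontab entry like this (the jar, class, and log paths are hypothetical):

```shell
# Run the JDBC dump every night at 2:15 AM (hypothetical jar/class names).
15 2 * * * /usr/bin/java -cp /opt/export/fmdump.jar:/opt/export/fmjdbc.jar com.example.FmDump >> /var/log/fmdump.log 2>&1
```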
Another option might be FileMaker Server (FMS). I don't use FMS myself, but perhaps it has a way to schedule a task like this?
The FileMaker application would need to be running, but that's it. No user required.
Depending upon your data, you might go to a layout with the fields you're interested in, find the records you're after, and run a script that does:
Copy All Records
Go to Layout ["export"]
Export Field Contents [myfile::gExportField]
if you see what I mean.
I think the OP knows this, but wants no user interaction: a scheduled task of some kind.
// calc $path here
Set Field [myTable::gExportField; ExecuteSQL("SELECT * FROM myTable"; ","; "" )]
Export Field Contents [myTable::gExportField; $path]
True, but BE_WriteTextToFile doesn't require a field for that, and it's server-side compatible.
If the SQL part is too slow, and if exporting all fields is faster (which is possible), you could do this (not for the faint of heart):
You'd need a Mac running a FileMaker Pro Advanced client as a robot; for optimal performance it would run on the same Mac that hosts FileMaker Server. You'd need the BaseElements plug-in and AppleScript.
With clever AppleScripting you could create a FileMaker script programmatically: you run an ExecuteSQL query against the FileMaker table schema to get the fields you want to export. You then generate XML that is the same as what you'd get by copying and pasting export script steps, except with the fields from the table schema filled in. With the BaseElements plug-in you transform that XML into pasteable XML for FileMaker. With AppleScript GUI scripting you open the script workspace and paste; that creates the export script. Then you launch it with AppleScript.
That's high-level stuff; make sure it's worth it speed-wise by testing the export-all speed vs. the SQL + BE_WriteTextToFile approach.
And back to the OP's "Quickest Way"...
Using Java, the job is simple and straightforward. No cleverness required. Schedule a task. Done. Database dumped on whatever the schedule is.
And, no plug-ins required.
Assuming the idea above could be automated, perhaps it's a good approach, but from my experience getting data out of FileMaker, I would argue it's not the quickest.
In a simple table holding 100,000 records, with just PK, FirstName, LastName, and FullName (calculated), the following script executed in 1059 ms.
It all depends on the data, but there's no user interaction. The script can be scheduled.
Set Variable [ $Start ; Value: Get(CurrentTimeUTCMilliseconds) ]
Set Variable [ $Path ; Value: Get(DesktopPath) & "test.csv" ]
Set Field [ QuickFindPeople::gDump ; "" ]
Set Field [ QuickFindPeople::gDump ; ExecuteSQL("SELECT * FROM QuickFindPeople"; ","; "") ]
Set Field [ QuickFindPeople::gDump ; "Total time: " & Get(CurrentTimeUTCMilliseconds) - $Start & ¶ & QuickFindPeople::gDump ]
Export Field Contents [ QuickFindPeople::gDump ; "$Path" ]
The question is what's fastest. I don't use FileMaker's JDBC driver, but I suspect it must be slower than a traditional export, and perhaps the same speed as SQL. If you know that JDBC is faster than the native Export Records script step, then I'm very interested.