I'm not a MySQL expert, but I believe MySQL supports stored procedures. I would write a stored procedure that does what you need (should not require any importing/exporting), and then set up a scheduled database job to execute it. Be careful of record locking and update issues with your main app and users. Remember, ESS data changes are not automatically refreshed and pushed out to FileMaker clients, so for any layouts that use this data, be sure you have scripts that refresh the data and present the user with the latest updates.
Refreshing data for the clients is a non-issue. None of the layouts the internal users utilize contain anything from the SQL database; I simply created a layout based on the SQL database for internal use. This is just a one-way exchange of data, FileMaker -> SQL.
Could you elaborate on what you mean by a stored procedure?
Unless I am misunderstanding your new configuration, you can dispense entirely with the export to a file/import from a file workflow.
With your FileMaker database connected directly to your MySQL database via Table Occurrences, I think this now boils down to the simpler FileMaker question: How can I create records in the second table in FileMaker based on the values in the first table?
While you can import directly from one table to another by using the FileMaker database as both the source and the target, you can also create the records one by one in a looping script.
The looping script may be easier to debug.
Find the records you want in the first table.
Navigate to the first record in the first table's found set.
Save the two values you want into script variables (e.g. $web_username and $web_password)
Navigate to the layout containing the second table.
Create a new record
Set the fields to the variable values.
Go to the first layout
Navigate to the next record...[exit after last]
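Assembled into FileMaker script steps, the loop above might look something like this (a sketch only; the layout and field names are taken from this thread and may differ in your solution):

```
Go to Layout ["TechSupport" (TechSupport)]
Perform Find [Restore]                      # the find from step 1
Go to Record/Request/Page [First]
Loop
  Set Variable [$web_username; Value: TechSupport::web_username]
  Set Variable [$web_password; Value: TechSupport::web_password]
  Go to Layout ["web_members" (web_members)]
  New Record/Request
  Set Field [web_members::username; $web_username]
  Set Field [web_members::password; $web_password]
  Go to Layout ["TechSupport" (TechSupport)]
  Go to Record/Request/Page [Next; Exit after last]
End Loop
```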
Since you have a direct connection to the MySQL database you have the luxury of being more sophisticated about only updating the records that need changing instead of deleting and repopulating all of them. You can set up a relationship between the two tables using the web_username field. Do a search directly in the MySQL table (using related fields from the first table) to locate and then delete only the lapsed records. Then locate and create only the new ones from the first table.
In MySQL they are called Stored Routines:
Basically, they are server-side scripts. The advantage is that they are usually much faster for manipulating SQL data than working through FileMaker. If you're updating data from a FileMaker database to SQL, then you will have to stay in the FM environment, or you may be able to query the FileMaker data from MySQL via ODBC. It is all a matter of what you're comfortable with. Sticking with all-FileMaker scripts is relatively simple, but you have to work out a way to run your script on the server. If you have the ability to write a routine in MySQL, it could potentially run much faster and would offload some work from FileMaker.
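As an illustration, a stored routine for this job might look like the sketch below. It assumes the FileMaker data has already been copied into a MySQL-side staging table (here called `techsupport_staging`, a hypothetical name), since a MySQL routine cannot see FileMaker tables directly:

```sql
DELIMITER //

CREATE PROCEDURE refresh_web_members()
BEGIN
  -- Remove members that no longer appear in the staging data
  DELETE FROM web_members
   WHERE username NOT IN (SELECT web_username FROM techsupport_staging);

  -- Add members present in the staging data but not yet in web_members
  INSERT INTO web_members (username, password)
  SELECT s.web_username, s.web_password
    FROM techsupport_staging s
   WHERE NOT EXISTS (SELECT 1 FROM web_members w
                      WHERE w.username = s.web_username);
END //

DELIMITER ;
```

Because both statements run entirely inside MySQL, they avoid the per-record round trips that make the same work so slow through ESS.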
Did some testing, and I've found that deleting all records, followed by finding those on subscription and looping to create new records, is painfully slow.
Here is the script I set up:
I ran this on my local machine and it took upwards of 15 minutes to run based on 4366 records. Not sure if it will be any faster server side but if it takes even half as long, that's going to cause problems with clients who are attempting to access our website during the time the script is running.
I did set up a relationship between the two tables: TechSupport::web_username = web_members::username.
How would I go about doing a compare to add/delete only those records that have changed?
I'll look into this; as you can see from my post below, deleting all and re-creating is extremely slow.
Doing record deletes in ESS is AGONIZING.
If you can write a Stored Procedure or use the Execute SQL _SCRIPT STEP_ against your database directly, you can use this command:
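The command in question is presumably MySQL's TRUNCATE TABLE (the table name below is illustrative):

```sql
-- Removes every row from the table; fast, but it CANNOT be rolled back.
TRUNCATE TABLE web_members;
```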
One command and it clears all data from the table. Needless to say, BE CAREFUL: this is not a reversible command and cannot be rolled back. But it is fast and efficient.
This may work, but unfortunately Execute SQL is not a server-compatible script step.
I'll dig into how to write a stored procedure and see what I can come up with.
If you would like to see it become a Server capable command, send a feature request to FMI. I've been begging for this for years, along with the ability to use the ODBC drivers on the Server to execute SQL on the Clients.
Once you have a stored routine, you can fire it with a scheduled job:
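For example, MySQL's own Event Scheduler can run a routine on a timer (the routine name below is hypothetical; substitute your own):

```sql
-- The event scheduler must be enabled (requires appropriate privileges)
SET GLOBAL event_scheduler = ON;

-- Call the routine every 15 minutes
CREATE EVENT refresh_web_members_event
  ON SCHEDULE EVERY 15 MINUTE
  DO CALL refresh_web_members();
```

A cron job invoking the `mysql` command-line client with the CALL statement would work just as well if you'd rather schedule it outside the database.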
Sorry for the MySQL links, but I'm more familiar with Oracle and SQL Server stored procs.
To find and modify only the changed records when using TechSupport::web_username = web_members::username...
You can do something like the following.
Delete records that need to be deleted in web_members:
Find all TechSupport records that need to be kept.
Go To Related Record[Show only related records; Match found set; From table: "web_members"]
# You are now looking at the web members you want to keep.
Show Omitted Only
# You are now looking at the web members you want to delete
Delete All Records
# Hopefully this will be a relatively small number of records compared to the entire set.
Find all TechSupport records that do not have a corresponding entry in web_members:
Go to Layout["TechSupport" (TechSupport)]
Find (search criteria you use to locate TechSupport records that need to be in web_members, new request to omit "@*" in the related field web_members::username )
# You are now looking at desirable TechSupport records that do not have a corresponding web_members record
Create the new web_members using this found set.
There are a few ways to accomplish the task above.
You can do one Find as shown above.
You can show all records in web_members, go to related record [only related; match found set], show omitted only, then Constrain Found Set to the ones you want to add.
You can add a new unstored calculated field to TechSupport: [ cHas_web_member = If (IsEmpty(web_members::username); 1; 0) ] and use that as part of your search instead of "@*" in a related field.
If you need to spot records that do not have the same value for web_password, then you'll need to modify the above to incorporate that criterion.
Not sure that this will be any quicker. Just performing the first 2 steps takes 4 1/2 minutes. I'm really disliking SQL lol.
Perhaps this may be an opportunity to utilize CWP and have our login area link directly to the FileMaker Database instead of SQL.
Don't put this on SQL. SQL is very fast. Filemaker is very easy. There is a great deal of overhead in how Filemaker interacts with SQL. That's why you need to learn to do as much on the server as possible. Filemaker is a great interface, but a poor conduit for large record operations.
The approach that we take to updating data in (or from) databases outside of FileMaker is to use an Extract, Transform and Load (ETL) tool. Our tool of choice is part of the open-source Pentaho BI project, and is called Kettle. Using the FileMaker JDBC driver, you can easily connect Kettle to a FileMaker database, perform a SQL query against the database to get the records you want to move, perform any data manipulation necessary, and then insert those records into your MySQL database (which can be accessed natively by Kettle).
The best part about using an outside tool like this is that the process of moving the records around is VERY FAST. In the transform step that INSERTs records into a database table, there is a little checkbox that says "truncate table" - it's that easy to zero out the table before doing your insert! Kettle (ETL tools in general) is optimized for and used by databases that move millions of records around on a daily basis. Running your process every 15 minutes can be accomplished by cron, and the data will be kept up to date without much hassle, once the process is set up.
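As a sketch, scheduling the Kettle transformation with cron could be as simple as one crontab line (the paths and the .ktr file name here are hypothetical; `pan.sh` is Kettle's command-line runner for transformations):

```
# Run the FileMaker-to-MySQL transformation every 15 minutes
*/15 * * * * /opt/pentaho/data-integration/pan.sh -file=/path/to/fm_to_mysql.ktr
```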
Granted, this doesn't utilize ESS (which is a very cool technology), but it would solve your speed issue, and would add a tool to your data toolbox that I'm sure you could find other uses for.
Pentaho Kettle is an excellent and fast data transformation tool. Cross-platform, fairly easy to learn, and I've had great results using it with Oracle and FileMaker. It is more or less a standalone tool, but extremely versatile.