I think FMSQLException indicates that you have an error in your query. Try to recreate your query with FileMaker's ExecuteSQL() and see if you get a result.
That won't work. FM's ExecuteSQL() cannot create tables or update records.
I agree it is likely a syntax error. Check the FileMaker ODBC and JDBC Guide for the supported standards and syntax; you may be using a command that is not supported.
I agree that you can't, but I hope mbeck65 means "and write a lot to the existing tables", i.e. updating records rather than creating new tables. So the actual problem would be with a table that already exists, not with creating a new one. I could be wrong; just trying to guess.
Of course, I'm only trying to write to an existing table ...
It seems that the problem isn't a syntax problem but a problem with the data. Let me explain: the table I'm trying to write to contains 110 fields. I have verified that all the fields can be left empty when adding a record, so I don't expect errors from empty fields. I have also verified that the data types in the structure used by executeBatch() are compatible with the types of the FM table's fields. When I start the Java procedure that populates the table, I get the error I posted in my first message.
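To show what I mean, here is a simplified sketch of how I bind each row before addBatch(). The class and helper names (BatchBind, normalize, bindRow) are invented for this post, not my real code; the assumption, which I haven't confirmed, is that binding empty values as empty strings instead of SQL NULL could trip validation on non-text fields:

```java
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

public class BatchBind {
    // Hypothetical helper: treat empty strings as SQL NULL so fields that
    // are allowed to be empty are not sent a zero-length string.
    static Object normalize(Object value) {
        return (value instanceof String s && s.isEmpty()) ? null : value;
    }

    // Bind one row's values onto a prepared INSERT, using setNull for
    // empty/missing values instead of setString("").
    static void bindRow(PreparedStatement ps, Object[] row) throws SQLException {
        for (int i = 0; i < row.length; i++) {
            Object v = normalize(row[i]);
            if (v == null) {
                ps.setNull(i + 1, Types.VARCHAR); // JDBC parameters are 1-based
            } else {
                ps.setObject(i + 1, v);
            }
        }
    }
}
```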
I have just finished a test in which I wrote only 30 of the 110 fields, and the table was populated correctly.
I'm continuing with my tests; as soon as I have news I'll add another comment to this discussion.
In any case, thanks a lot for your suggestions, which are very useful.
It could be a number of things. For example, check the field names in FileMaker: they do not have to adhere to SQL naming conventions, so you may have to quote them in your statement.
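For example, here is a quick sketch of building an INSERT with every name double-quoted, which is the safe option when field names contain spaces or reserved words. The table and field names ("Contacts", "First Name", etc.) are made up for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

public class QuoteFields {
    // Wrap the table name and every field name in double quotes so names
    // that don't follow SQL identifier rules are still accepted.
    static String buildInsert(String table, List<String> fields) {
        String cols = fields.stream()
                .map(f -> "\"" + f + "\"")
                .collect(Collectors.joining(", "));
        String params = fields.stream()
                .map(f -> "?")
                .collect(Collectors.joining(", "));
        return "INSERT INTO \"" + table + "\" (" + cols + ") VALUES (" + params + ")";
    }

    public static void main(String[] args) {
        // Prints: INSERT INTO "Contacts" ("First Name", "Last Name") VALUES (?, ?)
        System.out.println(buildInsert("Contacts", List.of("First Name", "Last Name")));
    }
}
```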
First of all, thanks a lot for your support!
I've solved my problem: as you told me, the issue was in the SQL statement... Actually, I don't know exactly where the statement was wrong, but by modifying it and, most importantly, by simplifying it, I solved the problem and now the integration procedure works correctly.
This can be caused by data validation failing on the records you are trying to write. If one of the records is locked, or fails strict validation, the batch will stop writing with this exception.
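To narrow down which record the batch stopped on, you can catch BatchUpdateException and inspect getUpdateCounts(). A small sketch (the class and method names are mine, not from the FileMaker driver; note also that drivers differ: some return counts only for the rows processed before the failure, others mark the failed row with Statement.EXECUTE_FAILED):

```java
import java.sql.Statement;

public class BatchDiagnostics {
    // Return the index of the first batch entry marked as failed, or the
    // array length if the driver simply stopped processing at the bad row.
    static int firstFailedIndex(int[] updateCounts) {
        for (int i = 0; i < updateCounts.length; i++) {
            if (updateCounts[i] == Statement.EXECUTE_FAILED) {
                return i;
            }
        }
        // No entry is marked EXECUTE_FAILED: many drivers stop at the bad
        // row, so the failing record is the next one after the counts.
        return updateCounts.length;
    }

    // Usage sketch:
    // try {
    //     ps.executeBatch();
    // } catch (java.sql.BatchUpdateException e) {
    //     int bad = firstFailedIndex(e.getUpdateCounts());
    //     System.err.println("Batch stopped at record index " + bad);
    // }
}
```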