7 Replies Latest reply on Jan 11, 2017 2:18 PM by shmert

    xDBC driver issue

    mbeck65

      Hello everybody,

      I'm writing a Java procedure that helps me integrate my FM application, deployed on FM Server, with an AS/400 legacy system.

      Everything has been working well: I'm able to open the JDBC channel, log in to the FM application, and write to many tables. But when I try to do the same thing with a new table, I get the following generic error that I'm not able to understand or explain:

       

      Exception in component tJDBCOutput_1

      com.filemaker.jdbc.FMSQLException: [FileMaker][FileMaker JDBC] Batch entry stopped.

        at com.filemaker.jdbc2.CommonJ2Statement.executeBatch(Unknown Source)

        at mbeck.aaaa_0_1.aaaa.tFileInputDelimited_1Process(aaaa.java:4925)

        at mbeck.aaaa_0_1.aaaa.runJobInTOS(aaaa.java:5263)

        at mbeck.aaaa_0_1.aaaa.main(aaaa.java:5120)

       

      It would be very useful to have something like a detailed/verbose trace of the problem so that, I hope, I'll be able to make changes and correct this strange behaviour.
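Absent a verbose trace from the FileMaker driver itself, standard JDBC already exposes some diagnostics on this failure: executeBatch() throws java.sql.BatchUpdateException, whose getUpdateCounts() reports how many rows were processed before the batch stopped, and whose chained SQLExceptions often carry the driver's underlying complaint. A minimal sketch — the helper name and report format are illustrative, not part of any driver API:

```java
import java.sql.BatchUpdateException;
import java.sql.SQLException;

class BatchDiagnostics {

    // Summarise a failed batch: how many rows the driver processed before
    // stopping, plus every chained SQLException message. Intended to be
    // called from a catch block around stmt.executeBatch().
    static String describeFailure(BatchUpdateException e) {
        StringBuilder report = new StringBuilder(e.getMessage());
        int[] counts = e.getUpdateCounts();
        if (counts != null) {
            report.append("; rows processed before failure: ").append(counts.length);
        }
        for (SQLException next = e.getNextException(); next != null;
                next = next.getNextException()) {
            report.append("; chained: ").append(next.getMessage());
        }
        return report.toString();
    }
}
```

In a job like the one above, wrapping the executeBatch() call in a try/catch and printing BatchDiagnostics.describeFailure(e) would at least show how far the batch got before "Batch entry stopped".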

       

      Thanks a lot for your ideas/suggestions.

       

      Max

       

        • 1. Re: xDBC driver issue
          nicolai

          I think FMSQLException indicates that you have an error in your query. Try to recreate your query in FileMaker's ExecuteSQL function and see if you get a result.

          • 2. Re: xDBC driver issue
            wimdecorte

            nicolai wrote:

            Try to recreate your query in FileMaker's ExecuteSQL function and see if you get a result.

             

            That won't work. FM's ExecuteSQL() cannot create tables or update records.

             

            I agree it is likely a syntax error.  Check the FM ODBC and JDBC guide for the supported standards and syntax.  You may be using a command that is not supported.

            • 3. Re: xDBC driver issue
              nicolai

              I agree, you can't, but I hope mbeck65 means "and write a lot to existing tables", i.e. updating records rather than creating new tables. So the actual problem would be with writing to a new table that has already been created, not with creating one. I could be wrong, just trying to guess.

              • 4. Re: xDBC driver issue
                mbeck65

                Of course, I'm only trying to write to an existing table ...

                It seems that the problem isn't a syntax problem but a problem with the data. Let me explain: the table that I'm trying to write to contains 110 fields. I have verified that all the fields can be left empty when adding a record, so I don't expect errors in the case of empty fields. I also verified the data-type compatibility between the structure used in executeBatch() and the types of the FM table's fields. Nevertheless, when I start the Java procedure to populate the table, I get the error I sent in my first message.

                I have just finished a test in which I wrote only 30 of the 110 total fields, and the table was populated correctly.

                I'm continuing with my tests: as soon as I have news I'll add another comment to this discussion.
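The narrowing mbeck65 is doing by hand (30 fields succeed, the full 110 fail) can be automated as a bisection over the field list. A hypothetical sketch, not from the original job: the predicate stands in for "run a one-row test insert using only the first n fields and report whether it succeeded".

```java
import java.util.function.IntPredicate;

class FieldBisector {

    // Binary-search for the first field whose inclusion makes the insert
    // fail. Returns a zero-based field index, or -1 if inserting with all
    // fieldCount fields succeeds. Runs O(log fieldCount) test inserts
    // instead of re-trying the job once per field.
    static int firstFailingField(int fieldCount, IntPredicate insertSucceeds) {
        if (insertSucceeds.test(fieldCount)) {
            return -1; // the full field list works
        }
        int lo = 1, hi = fieldCount; // invariant: a prefix of length hi fails
        while (lo < hi) {
            int mid = (lo + hi) / 2;
            if (insertSucceeds.test(mid)) {
                lo = mid + 1; // prefix of length mid works; culprit is later
            } else {
                hi = mid;     // prefix of length mid already fails
            }
        }
        return lo - 1; // smallest failing prefix length, as a field index
    }
}
```

With 110 fields this isolates the problematic field in at most eight test inserts, which is much faster than adding the remaining 80 fields back one at a time.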


                In any case, thanks a lot for your suggestions, which are very useful.

                • 5. Re: xDBC driver issue
                  nicolai

                  It could be a number of things. For example, check the field names in FileMaker: they do not have to adhere to SQL naming conventions, so you may have to escape them in your query.
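One way to do the escaping nicolai describes is to wrap every table and field name in double quotes (doubling any embedded quotes) when building the INSERT statement. A small hypothetical helper; the class and method names are illustrative, not from the thread:

```java
class SqlIdentifiers {

    // FileMaker table and field names may contain spaces and other characters
    // that are illegal in bare SQL identifiers; double-quoting them makes the
    // statement valid regardless of the name. Embedded double quotes are
    // escaped by doubling them, per standard SQL quoting rules.
    static String quote(String identifier) {
        return "\"" + identifier.replace("\"", "\"\"") + "\"";
    }
}
```

So a field called Order Date would be emitted as "Order Date", giving a statement like INSERT INTO "Orders" ("Order Date") VALUES (?).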

                  • 6. Re: xDBC driver issue
                    mbeck65

                    First of all, thanks a lot for your support!

                    I've solved my problem: as you told me, the issue was due to the SQL statement... Actually, I don't know exactly where the statement was wrong, but by modifying it and, most importantly, by simplifying it, I solved the problem, and now the integration procedure works well.

                     

                    Thanks,

                     

                    Max

                    • 7. Re: xDBC driver issue
                      shmert

                      This can be caused by data validation failing on the records you are trying to write. If one of the records is locked, or fails strict validation, the batch will stop writing with this exception.