
    ODBC credentials in script steps

    tpolland

      We have a FileMaker solution that pulls data from over 200 tables in a SQL Server database into FileMaker tables with matching table and field names. We do this every few hours through a set of scripts, each using a couple of Import Records script steps and one Execute SQL script step, all with the same ODBC DSN.

      Each individual script step requires me to choose a DSN and enter credentials for it. Whenever an end customer requires a password change, I have to go into every import script and change the credentials on three separate script steps: one Import Records that updates an import log with the number of records available to import, one that actually imports the data into the FileMaker table, and a final Execute SQL that truncates the table in the SQL Server database after I've verified that all the data was imported.

      As we grow our solution to include more tables, this problem gets worse. Since I'm relatively new to FileMaker (I've been at it for just over a year), I'm hoping the problem is due to either a design issue or a gap in my knowledge of how to set the credentials for a particular DSN once, without having to change them everywhere the DSN is used.

      The reason I designed the imports as a separate script for each table is that I could find no way to set the destination table other than the hard-coding allowed by the Import Records script step. So while I can let the SQL query select the source table dynamically, as sketched below, I could not figure out a way to make the destination table dynamic.
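
      For illustration, the source side can be made dynamic along these lines, assuming the table name arrives as a script parameter and that the Import Records ODBC query can be supplied as a calculated SQL statement (Execute SQL does allow one; I'm less sure about Import Records in every version). This is a rough sketch, not something I have working:

      Set Variable [$sourceTable; Value:Get(ScriptParameter)]
      Set Variable [$sql; Value:"select Name, TimeStamp, DataA, DataB from " & $sourceTable]
      Import Records [No dialog; DSN: myDSN; SQL Text: $sql; Add; Windows ANSI]

      The destination table in Import Records, as far as I can tell, has no equivalent option.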

      I'm hoping that more experienced developers have already faced and fixed this problem, either by having a couple of scripts do the job that my 200+ scripts currently do, or by knowing a way for every script step that uses the same DSN to pick up new credentials from a single source.
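
      What I'd love is something like the sketch below, where the credentials come from a one-record settings table so that a password change is a single field edit. The User: and Password: notation is just my shorthand for the calculated user name and password options in the Specify ODBC dialog, and Settings::ODBC User and Settings::ODBC Password are hypothetical fields; I don't know whether calculated credentials are supported there in every version:

      Import Records [No dialog; DSN: myDSN; User: Settings::ODBC User; Password: Settings::ODBC Password; SQL Text: select Name, TimeStamp, DataA, DataB from MyTable; Add; Windows ANSI]
      Execute SQL [No dialog; DSN: myDSN; User: Settings::ODBC User; Password: Settings::ODBC Password; SQL Text: truncate table MyTable]

      Each step would still have to be pointed at those fields once, but after that the credentials would live in one place.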

      I've included an example of one of the scripts in question below:

      Set Error Capture [On]
      Import Records [No dialog; DSN: myDSN; SQL Text: select 'MyTable' as 'Table', count(TimeStamp) as 'TotalRows' from MyTable; Add; Windows ANSI]
      Import Records [No dialog; DSN: myDSN; SQL Text: select Name, TimeStamp, DataA, DataB from MyTable; Add; Windows ANSI]
      Set Variable [$ErrorVal; Value:Get(LastError)]
      Go to Layout ["MyTable" (MyTable)]
      Set Variable [$RecordCount; Value:Get(FoundCount)]
      Go to Layout ["Global Import Log" (Global Import Log)]
      Set Field [Global Import Log::Total Rows Imported; $RecordCount]
      Set Field [Global Import Log::Date Import Completed; Get(CurrentTimeStamp)]
      Set Field [Global Import Log::Error Code; $ErrorVal]
      Commit Records/Requests [Skip data entry validation; No dialog; Force Commit]
      Flush Cache to Disk
      If [(Global Import Log::Total Rows Imported > 0) and (Global Import Log::Total Rows Imported = Global Import Log::Total Rows)]
            Execute SQL [No dialog; DSN: myDSN; SQL Text: truncate table MyTable]
      End If

      The destination table for the first import is the Global Import Log; the destination table for the second import is MyTable. I have a duplicate of this script for each table in the database, and the log-update steps at the end are identical from copy to copy, so perhaps at least that part could be factored out; a sketch follows.
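
      For example, the repeated log-update steps might collapse into a single sub-script, with each import script passing the record count and error code as a parameter. "Update Import Log" is a hypothetical sub-script name, and I haven't tested this:

      # In each per-table import script, after the import:
      Perform Script ["Update Import Log"; Parameter: $RecordCount & ¶ & $ErrorVal]

      # The "Update Import Log" sub-script:
      Go to Layout ["Global Import Log" (Global Import Log)]
      Set Field [Global Import Log::Total Rows Imported; GetValue(Get(ScriptParameter); 1)]
      Set Field [Global Import Log::Error Code; GetValue(Get(ScriptParameter); 2)]
      Set Field [Global Import Log::Date Import Completed; Get(CurrentTimeStamp)]
      Commit Records/Requests [Skip data entry validation; No dialog; Force Commit]

      That wouldn't reduce the number of import scripts, but it would shrink each one to the steps that genuinely differ per table.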

      Thank you for taking the time to read and hopefully reply to my question.

      Travis Polland

      Principal Solution Strategist

      Integrated Research