5 Replies Latest reply on Jul 28, 2016 4:34 PM by cortical

    Import Verification/Clone Files/Best Practices for Files

    bradleyboggs

      Thanks for looking and for any guidance you might provide!

       

      I'm trying to transition to the correct way of doing things with a development copy of the file I'm working on. Finally looking into importing/updating data, I've figured out most of what I need to know to import data from the current production file into the new file. My main question: is there a way to do a sort of checksum/verification to make sure that, after import, the data in the new file (the file that was used for development) matches the data in the previous production file? I've searched the forums and Google to no avail.

       

      I'm getting ready to write an import script to automate the import of our 35 tables (plus 3 other connected files), and I just wanted to see if there's a way to make sure everything imported correctly.
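
       

      For context, the per-table check I have in mind would look roughly like this in FileMaker script steps (the "Orders" table/layout and the external data source reference are placeholders, not my actual schema):

```
# Hypothetical post-import count check for one table; repeat (or loop) for all 35
Go to Layout [ "Orders" (Orders) ]                      # layout in the new file
Show All Records
Set Variable [ $newCount ; Value: Get ( FoundCount ) ]
# Old production file attached as an external data source, occurrence named Orders_Old
Set Variable [ $oldCount ; Value: ExecuteSQL ( "SELECT COUNT(*) FROM Orders_Old" ; "" ; "" ) ]
If [ $newCount ≠ $oldCount ]
    Show Custom Dialog [ "Import mismatch" ; "Orders: " & $oldCount & " vs " & $newCount ]
End If
```

      I realize a count check only catches missing or extra records, not field-level differences; comparing something like a concatenated-field hash per table would be stronger, if that's even feasible.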

       

      Secondarily: I don't have a clean "clone" (I just barely learned about them today). I know I can't make one from nothing, since every copy I have of the file has been in production at one point, but what's the closest I could come? Take a known file with no apparent issues, run Recover to check for problems, then save a clone (which strips out all the data)? How would you handle this?

       

       

      Blathering Background info/request for best practices:

       

      Let me preface this by saying I know that my practices as mentioned below are not "Best practices" (and actually probably pretty heinous in most of your eyes). I'm here because I'm trying to learn how to remedy my errant ways and do things the right way.

       

      I'm an in-house developer at a manufacturing company. Between office employees and iPads in the factory, we have about 30 users. We've been running FM12 since I started the system from scratch in late 2013.

       

      Here's the cringe-worthy part: 99.9% of the time I've spent building the system, I've been working on the production database. Stupid and incredibly reckless? Yep, that's me! This isn't a story of a crashed database, though. However, just yesterday I had what looks to be a corrupted index that affected only one record, seemingly related to an FM Go user. That problem now appears fixed: after trying a delete/reimport to no avail, I stopped the server, copied the database, and rebuilt the index using the Recover tool (rebuilt only the index; disabled everything else in Recover). Seems to have solved the problem.

       

      Best practices: I've found FileMaker's official "Best Practices" regarding not working on production files, the need to keep empty clone copies, etc., and I'm working to be compliant with that. Any other recommendations/guidance?

       

      One note: I'm getting ready to upgrade us from 12 to 14. I have a VM copy of our FileMaker Server and will do a trial run there first before moving forward with the production machine. Any tips/guidance here are also welcome.

       

      In general, I just want to start doing things the right way and protect the business from some catastrophic failure caused by my own stupidity. I've been lucky thus far, but luck will run out eventually.

       

      If you made it this far, thanks!

        • 1. Re: Import Verification/Clone Files/Best Practices for Files
          taylorsharpe

          FYI, every time FileMaker Server opens your file, it does a checksum to verify its validity. You can always run a recovery to see if any problems are found. But between the OS-level checksums and FMS's validation, it's not very likely that you are losing data. You are much more likely to get a corrupted index. Corruption problems most often happen when people do things like virus-scanning or backing up a live (open) FileMaker file.

           

          Best practice is to run a backup on the server, stop the files, save your schedules/groups, then uninstall FMS. Then do any OS upgrades or copy data to a new server, and install the new version of FMS. Make sure you have the licensing info and related details ahead of time. Are there any plugins? Think about upgrading them too.

          • 2. Re: Import Verification/Clone Files/Best Practices for Files
            gdurniak

            Since you seem concerned about corruption, I have collected info here:

             

            http://www.fileshoppe.com/recover.htm

             

            Don't sweat the "clean" clone. If your current file is working, use that.

             

            Your import script could write to a log record, showing the resulting found count in both the source and the destination.
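
             

            Something like this, for example ( the ImportLog table and its fields are made up ):

```
# Right after each Import Records step, the found set is the imported records
Set Variable [ $imported ; Value: Get ( FoundCount ) ]
Go to Layout [ "ImportLog" (ImportLog) ]
New Record/Request
Set Field [ ImportLog::TableName ; "Orders" ]
Set Field [ ImportLog::SourceCount ; $sourceCount ]   # captured in the source file before the import
Set Field [ ImportLog::ImportedCount ; $imported ]
Commit Records/Requests [ With dialog: Off ]
```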

             

            Also, don't sweat Production vs. Development. My clients don't have the luxury of "Development" versions. They need changes immediately, daily, and weekly, so we edit "live" ( horror of horrors ).

             

            Corruption is fickle. Your Golden Clean Master Development Clone can also be corrupted ( I have seen this happen ).

             

            greg

            • 3. Re: Import Verification/Clone Files/Best Practices for Files
              wimdecorte

              gdurniak wrote:

               

               

              Also, don't sweat Production vs Development. My clients don't have the luxury of "Development" versions. They need changes immediately, daily, and weekly, so we edit "Live" ( horror of horrors )

               

              I am going to counter that: do fret over it.

               

              Many things can go wrong when you do live development, mainly the invisible workflow failures because of the schema locks you will be introducing:

              - when you work in Define Database on a table, that table is locked when you commit. Any user trying to create a new record will get error 302, no record will be created, and they will end up on the first record of that table. If you don't trap for that error, your script will continue and modify (or delete!) that first record

              - when you make a layout change and commit it, any user in the process of going to that layout will not get there but will end up on the first layout in the solution. If you don't trap for that error, it is likely that your script is in entirely the wrong context and you may be creating/editing/destroying the wrong data

               

              If live development is a must, then the solution needs to be military-grade when it comes to error trapping and handling. Any script step that changes context needs to be checked, and any step that creates/edits/deletes needs to be error-trapped, ...
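
               

              For example, the 302 case could be trapped with something like this (a minimal sketch; the dialog and exit handling are up to you):

```
# Trap the schema-lock failure described above
Set Error Capture [ On ]
New Record/Request
If [ Get ( LastError ) = 302 ]
    # No record was created and we are sitting on the first record of the
    # table: do NOT edit it, bail out instead
    Show Custom Dialog [ "Table is temporarily locked" ; "Please try again in a moment." ]
    Exit Script [ Result: "302" ]
End If
```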

              • 4. Re: Import Verification/Clone Files/Best Practices for Files
                gdurniak

                yes, we edit "live", but off hours ( whenever possible )

                 

                I sometimes miss the old days, when at least "Define Fields" would ask ( first ) if you would like to kick off all Users

                 

                greg

                 

                > I am going to counter that: do fret over it.

                 

                 > Many things can go wrong when you do live development, mainly the invisible workflow failures because of the schema locks you will be introducing:

                • 5. Re: Import Verification/Clone Files/Best Practices for Files
                  cortical

                  I have taken to building a separate interface export/import file:

                  - a layout for each table

                  - scripts for export, import, and checking the next serial against the current last serial

                  - exporting and importing as merge files

                   

                  It speeds things up.
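
                   

                  The serial check is roughly this ( field and table names just for illustration ):

```
# After importing, confirm the serial sequence is still ahead of the data
Set Variable [ $maxSerial ; Value: ExecuteSQL ( "SELECT MAX(ID) FROM Orders" ; "" ; "" ) ]
Set Variable [ $nextSerial ; Value: GetNextSerialValue ( Get ( FileName ) ; "Orders::ID" ) ]
If [ $nextSerial ≤ $maxSerial ]
    Show Custom Dialog [ "Serial problem" ; "next " & $nextSerial & " vs max imported " & $maxSerial ]
End If
```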