6 Replies Latest reply on Nov 20, 2012 5:33 AM by filemakeris.us

    Size (in Bytes) of each table

    filemakeris.us

      Hello, I have an issue with a database: I need to know the size of each table. The file went from 2 GB to 4 GB in a week, and I have no clue which table is the culprit.

       

      I tried saving a compacted copy and there was no change.

       

      Thanks!

      DD

        • 1. Re: Size (in Bytes) of each table
          melastudio

          I remember very high limits (over 2 GB per container field) and high limits for text fields...

          With every release these limits get less restrictive, so the practical limit on a file (more than on any one table) is close to your hard disk capacity.

           

          If you check the tech specs for your version, you should find all the numbers you're asking about...

           

          But if your database grows this way, there is something else that's wrong.

           

          You probably have a problem with backups and the integrity of the database. If you store images and files in containers, you should look at the new external container storage feature of FileMaker 12; with FileMaker 11, you should split the images out into a separate file or store them by reference...

           

          How did you get into that situation?

          • 2. Re: Size (in Bytes) of each table
            filemakeris.us

            Hi, the file seems to be OK and healthy. I'm afraid some user might have stored something inside a container. I would like to know the data size per table. Is that possible?

             

            DD

            • 3. Re: Size (in Bytes) of each table
              jbante

              Have you tried deleting tables from a copy of the file and seeing what impact it has on the total file size?

              • 4. Re: Size (in Bytes) of each table
                datastride

                DD,

                 

                If you are primarily concerned about the size of the data that might have ended up in some container fields, perhaps you should just focus on that issue (instead of table size). What kind of data is stored in your container fields? If you are importing mostly images, and if you are using FileMaker 12, you could write a script that reads all records in any table with one or more container fields, uses the new GetWidth ( field ) and GetHeight ( field ) functions to get sizes in pixels, multiplies the two values, and reports any result that exceeds some threshold you define. This way you could find any very large images that have been imported.
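
                For example, here is a rough sketch in script steps (the Assets table, its Image container, and the SizeNote text field are just placeholder names, and 4,000,000 pixels is an arbitrary threshold):

                    Go to Layout [ "Assets" ]
                    Show All Records
                    Go to Record/Request/Page [ First ]
                    Loop
                      If [ GetWidth ( Assets::Image ) * GetHeight ( Assets::Image ) > 4000000 ]
                        # Flag the record however suits you; here a text field holds a note
                        Set Field [ Assets::SizeNote ; GetWidth ( Assets::Image ) & " x " & GetHeight ( Assets::Image ) & " px" ]
                      End If
                      Go to Record/Request/Page [ Next ; Exit after last ]
                    End Loop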

                 

                Or, for a quick and dirty (and brute force) approach, write a script to read all records in any table with one or more container fields and export each container to a single folder on your hard drive. Make up a name for each exported file that consists of the table name plus the primary key (so you'll know where each exported item came from). Then view the contents of that folder as a list and sort by file size; you can see at a glance which table and record any very large item came from. Not pretty at all, but you can just delete the folder when you're done.
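
                A sketch of that loop for one table (again, Assets, Assets::Image, and Assets::ID are placeholders; repeat per table and per container field, and create the ContainerAudit folder on your desktop first -- depending on your version you may need a "file:" prefix on the path):

                    Go to Layout [ "Assets" ]
                    Show All Records
                    Go to Record/Request/Page [ First ]
                    Loop
                      # Name each file after the table and primary key so you know where it came from
                      Set Variable [ $path ; Value: Get ( DesktopPath ) & "ContainerAudit/Assets_" & Assets::ID ]
                      Export Field Contents [ Assets::Image ; "$path" ]
                      Go to Record/Request/Page [ Next ; Exit after last ]
                    End Loop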

                 

                Or you could use a plug-in like ScriptMaster or Troi File, along with a scripted import triggered by a button on each layout, to determine the size of an object BEFORE you actually import it into the database. You could then disallow importing anything larger than a threshold you define.
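
                With Troi File the check might look something like this (I'm writing the function name and switch from memory, so verify them against the plug-in's documentation; the 10 MB threshold and the Assets::Image field are made up):

                    # $filePath holds the file the user picked in an earlier step
                    Set Variable [ $size ; Value: TrFile_GetFileSize ( "-Unused" ; $filePath ) ]
                    If [ GetAsNumber ( $size ) > 10485760 ]
                      Show Custom Dialog [ "File too large" ; "Files over 10 MB may not be imported." ]
                      Exit Script [ ]
                    End If
                    Insert File [ Assets::Image ; "$filePath" ]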

                 

                Just some ideas to get you thinking about the alternatives. I hope this is helpful ...

                • 5. Re: Size (in Bytes) of each table
                  mbraendle

                  You can use a calculation field with the formula Length ( your_container_field ) to get the size in bytes. Then sort the records by this calculation field.
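
                  For example, in each table that has a container (field names here are just examples):

                      // unstored calculation field, number result
                      cImageBytes = Length ( Assets::Image )

                  A summary field defined as Total of cImageBytes then gives the container bytes for the whole table in one number. Note that for containers stored by reference, Length will not tell you the size of the referenced file on disk.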

                  • 6. Re: Size (in Bytes) of each table
                    filemakeris.us

                    Thanks, guys. I'm going to write an export script for each table until I find where the issue is. The DB is growing at a rate of 1 GB per week!!!

                     

                    There's something funky going on!