3 Replies Latest reply on Feb 3, 2016 9:22 PM by bigtom

    Image database query


      Hi there


      Long-time stalker, first-time poster.


      I am using FileMaker Server 13 and FileMaker Pro 13.


      I am looking for some advice on creating a FileMaker image database hosted on the server. The database needs to update automatically.


      The image database will be used across the company, potentially 60+ concurrent users.


      Basically, we will have about 500 images trickling through in various stages in any given month. These will be automatically processed and saved in various locations across the network.


      I have a few questions that I would really appreciate some help with:


      1) Should the images be hosted on the FileMaker Server or elsewhere? (There may be 100 GB+ of data and over 100,000 images.)

      2) Given that the server can't import folders on a scheduled script, what is the best way to automate image imports into the database? I would like to have no human interaction in this process. Would it be setting up the file with an OnTimer script, or using a file as a bot? How would this script work? Would I need to create a log of what images have been processed, so FileMaker knows what to import?


      Thanks very much for your time



        • 1. Re: Image database query

          Hi Dom,


          I have done something similar with around 140,000 images and 30 GB.

          I host the file on FileMaker Server 14, with the images stored externally using secure container storage.


          A server-side script imports the images into the database.

          In my case the images come from a scan folder, where I read the list of files using the BaseElements plugin.

          I then use a looping script with the Insert from URL step to pull the images into the database.

          Afterwards I delete the images from the scan folder, again using the BaseElements plugin.
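
          As a rough sketch, that server-side script might look like the steps below. The folder path, layout name, and field names are placeholders, and the BaseElements function names vary by plugin version (recent releases use BE_FileListFolder and BE_FileDelete), so check your copy of the plugin:

              Set Variable [ $folder ; Value: "/path/to/scan_folder" ]         # placeholder path
              Set Variable [ $files ; Value: BE_FileListFolder ( $folder ) ]   # return-delimited list of file names
              Go to Layout [ "Images" ]                                        # placeholder layout
              Loop
                  Set Variable [ $i ; Value: $i + 1 ]
                  Exit Loop If [ $i > ValueCount ( $files ) ]
                  Set Variable [ $path ; Value: $folder & "/" & GetValue ( $files ; $i ) ]
                  New Record/Request
                  Insert from URL [ Images::container ; "file://" & $path ]
                  Commit Records/Requests [ With dialog: Off ]
                  Set Variable [ $ignore ; Value: BE_FileDelete ( $path ) ]    # clear the scan folder
              End Loop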


          Hope that helps

          • 2. Re: Image database query

            #1 - Sounds like you want your FileMaker Server to do double duty as a file server. That's OK! Look into secure external storage of your container data. FileMaker manages the links to the data, and it's stored in a location that's secure and doesn't count against the file size of your database. Also, make sure your server has a drive big enough to handle that load.


            #2 - That isn't necessarily true. There are things like Insert from URL, and plugins that retrieve folder listings, that make it quite feasible. While a robot machine running an OnTimer script may work as well, what I've normally used in this case is a Windows server performing the following actions:


            a) A system scheduled task runs a .bat script that copies all of the data I need using robocopy. This lets me copy files into the FileMaker Server Documents folder, where they are more accessible. My batch script is set to pull only documents modified in the last x days, and to auto-clear any documents older than x days from the Documents folder. Basically, it assembles all the data into a single, manageable location.
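
            A sketch of what that scheduled .bat task could look like; the source share, destination path, file mask, and the 7-day window are all placeholder assumptions, not values from my actual setup:

                @echo off
                rem Copy images modified in the last 7 days into the FileMaker Server Documents folder
                robocopy "\\fileserver\images" "C:\Program Files\FileMaker\FileMaker Server\Data\Documents\incoming" *.jpg /MAXAGE:7 /R:2 /W:5
                rem Clear anything older than 7 days out of the destination folder
                forfiles /P "C:\Program Files\FileMaker\FileMaker Server\Data\Documents\incoming" /M *.jpg /D -7 /C "cmd /c del @path"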


            b) Using the Get ( DocumentsPathListing ) function, I can retrieve a list (on the server, too!) of all the documents in the Documents folder. I store that in a variable in my FileMaker scheduled script, e.g. $list.


            Look at the example on that page; it shows you what is returned to FileMaker Server.


            c) Using a scheduled script, I loop through $list and pull the documents into FileMaker:

                Set Variable [ $list ; Get ( DocumentsPathListing ) ]
                Go to Layout [ "documents" ]
                Loop
                    Set Variable [ $i ; $i + 1 ]
                    Set Variable [ $iDoc ; GetValue ( $list ; $i ) ]
                    Perform Find [ match for $iDoc ]  # find any record already holding this path
                    If [ Get ( FoundCount ) = 0 ]
                        New Record/Request
                        Insert from URL [ documents::container ; $iDoc ]
                    End If
                    Exit Loop If [ $i ≥ ValueCount ( $list ) ]
                End Loop


            Of course your script will differ depending on what you want to do with the documents, but that's at least a starting point to look into.

            • 3. Re: Image database query

              I have had something similar running for a number of years, with a few hundred images per month. I keep all of the images in internal containers. When someone needs an image, they simply download it to their client machine so they can process it; when they are done, they drop it back into the container. This works out well, as I can mark an image as "in process" by a certain user and not allow processing by anyone else until it is released.

              The database size limit is 8 TB, so at 100 GB you are still nowhere close to that. If you use RAID SSD or PCIe storage, your file access and backups will be pretty quick compared to HDD.
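
              The "in process" flag can be as simple as a check-out script run when a user grabs an image; the Images table, locked_by field, and dialog text here are hypothetical names, not my actual setup:

                  If [ IsEmpty ( Images::locked_by ) ]
                      Set Field [ Images::locked_by ; Get ( AccountName ) ]
                      Commit Records/Requests [ With dialog: Off ]
                      Export Field Contents [ Images::container ]   # user processes the local copy
                  Else
                      Show Custom Dialog [ "Image is in process by " & Images::locked_by ]
                  End If

              A matching release script clears Images::locked_by when the processed file is dropped back into the container.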


              You would need 60+ FileMaker clients, though. Not sure if that is part of your plan.


              Getting all the images into FM can be done with a client running a script to import the folders. After that, the storage structure is built into FM. Depending on what you need, the images can be exported to folders as needed and then moved.
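
              Exporting images back out to a folder can be sketched as a loop over the found set; the layout, table, and filename field are assumptions:

                  Go to Layout [ "Images" ]
                  Go to Record/Request/Page [ First ]
                  Loop
                      Set Variable [ $path ; Value: Get ( DesktopPath ) & Images::filename ]
                      Export Field Contents [ Images::container ; "$path" ]
                      Go to Record/Request/Page [ Next ; Exit after last: On ]
                  End Loop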