I have done something similar with around 140,000 images and 30 GB.
I host the file on FMS 14 with the images stored externally using secure storage.
A server-side script imports the images into the database.
In my case the images come from a scan folder: I read the list of files in that folder using the BaseElements plugin.
I then use a looping script with the Insert from URL step to pull the images into the database.
Afterwards I delete the images from the scan folder using the BaseElements plugin.
Hope that helps
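In outline, that server-side loop looks something like this. This is a Python sketch of the logic, not the actual FileMaker script; the folder paths are placeholders, and the copy stands in for the Insert from URL step:

```python
import shutil
from pathlib import Path

def import_scan_folder(scan_dir: str, imported_dir: str) -> list[str]:
    """Read every file in the scan folder, 'import' it (here: copy it
    aside, standing in for Insert from URL), then delete the original,
    as the BaseElements delete step does."""
    imported = []
    Path(imported_dir).mkdir(parents=True, exist_ok=True)
    for path in sorted(Path(scan_dir).iterdir()):  # BaseElements-style folder listing
        if not path.is_file():
            continue
        shutil.copy2(path, Path(imported_dir) / path.name)  # stands in for the DB insert
        path.unlink()  # clear the scan folder after import
        imported.append(path.name)
    return imported
```

The key point is that the scan folder acts as a queue: anything left in it is not yet imported, so the script is safe to re-run.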
#1 - Sounds like you want your FileMaker Server to do double duty as a file server. That's OK! Look into secure external storage for your container data: FileMaker manages the links to the data, it's stored in a secure location, and it doesn't count against your database's file size. Also, make sure your server has a drive big enough to handle that load.
#2 - That isn't necessarily true. Script steps like Insert from URL, plus plugins that retrieve folder listings, make it quite feasible. A robot machine running a timer script can work too, but what I've normally used in this case is a Windows server performing the following actions:
a) A system scheduled task runs a .bat script that copies all of the data I need using robocopy. This lets me copy files into the FileMaker Server Documents folder, where they are more accessible. My batch script is set to pull only documents modified in the last x days, and to auto-clear any documents older than x days from the Documents folder. Basically, it assembles all the data into a single manageable location.
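The copy-recent-and-purge-old step can be sketched like this. This is a Python analogue of what the robocopy batch script does, under the assumption that both filters key off file modification time; the real setup uses robocopy's own age switches:

```python
import shutil
import time
from pathlib import Path

def sync_recent(source: str, staging: str, max_age_days: int) -> None:
    """Rough analogue of the scheduled robocopy task: pull files
    modified within the last max_age_days into the staging folder,
    and purge staged copies older than that."""
    cutoff = time.time() - max_age_days * 86400
    Path(staging).mkdir(parents=True, exist_ok=True)
    # Copy only recently modified files (like robocopy's max-age filter).
    for f in Path(source).iterdir():
        if f.is_file() and f.stat().st_mtime >= cutoff:
            shutil.copy2(f, Path(staging) / f.name)
    # Auto-clear anything in staging that has aged out.
    for f in Path(staging).iterdir():
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
```

Keeping the staging folder self-pruning means the Documents folder stays a small rolling window rather than growing without bound.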
b) Using the Get(DocumentsPathListing) function, I can retrieve a list (on the server too!) of all the documents in the Documents folder, as a return-delimited list of file paths. I store that in a variable in my FileMaker scheduled script, e.g. $list.
Look at the example on that function's documentation page; it shows you exactly what is returned on FileMaker Server.
c) Using a scheduled script, I loop through the $list and pull the documents into filemaker:
Set Variable [ $list ; Get ( DocumentsPathListing ) ]
Go to Layout [ "documents" ]
Loop
  Set Variable [ $i ; $i + 1 ]
  Set Variable [ $iDoc ; GetValue ( $list ; $i ) ]
  Perform Find [ match for $iDoc ]
  If [ Get ( FoundCount ) = 0 ]
    New Record/Request
    Insert from URL [ documents::container ; $iDoc ]
  End If
  Exit Loop If [ $i ≥ ValueCount ( $list ) ]
End Loop
Of course your script will differ depending on what you want to do with the documents, but that's at least a starting point to look into.
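The skip-already-imported logic from that loop can be sketched on its own. This is a Python sketch, with a plain set of known paths standing in for the Perform Find against existing records:

```python
def import_new_documents(listing: str, already_imported: set[str]) -> list[str]:
    """Walk a return-delimited document listing (like $list from
    Get(DocumentsPathListing)) and import only paths that have no
    matching record; the set stands in for the Perform Find."""
    imported = []
    for doc in listing.splitlines():  # equivalent of GetValue ( $list ; $i )
        if doc and doc not in already_imported:
            imported.append(doc)       # stands in for Insert from URL
            already_imported.add(doc)
    return imported
```

Checking for an existing record before inserting is what makes the scheduled script idempotent: re-running it against the same folder doesn't create duplicates.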
I have had something similar running for a number of years with a few hundred images per month. I keep all of the images in internal containers. When someone needs an image, they simply download it to their client machine so they can process it. When they are done they drop it back into the container. This works out well because I can mark an image as "in process" by a certain user and not allow processing by anyone else until it is released. The database size limit is 8 TB; at 100 GB you are nowhere close to that. If you use RAID SSD or PCIe storage, your file access and backups will be pretty quick compared to HDD.
You would need 60+ FM clients though. Not sure if that is part of your plan.
Getting all the images into FileMaker can be done with a client running a script to import the folders. After that, the storage structure is built into FileMaker. Depending on what you need, the images can be exported to folders as needed and then moved.
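The export-back-to-folders step amounts to writing each container's contents out under its filename. A minimal sketch, assuming you already have the (name, bytes) pairs pulled from the records; the record structure here is a placeholder:

```python
from pathlib import Path

def export_images(records: list[tuple[str, bytes]], out_dir: str) -> list[str]:
    """Write container contents back out to a folder, one file per
    record; (name, blob) pairs stand in for the real container data."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for name, blob in records:
        target = out / name
        target.write_bytes(blob)
        written.append(str(target))
    return written
```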