I am getting ready to kick off a database that will have container fields holding large files. These files are PDFs averaging 15-30 MB each. Thousands of these files will be stored in the database over time, resulting in a very large database (~150 GB to ~500 GB).
I am concerned that as the database grows, performance will degrade and processes will become sluggish.
We use FileMaker 13 and FileMaker Server 13, hosted on a dedicated server with nothing else competing for its resources.
At any point in time we should have 25-40 desktop client users, 10-15 FileMaker Go users, and 10-20 web users.
This database will be multi-function, serving many different departments at once and drawing on shared core data throughout.
What are the best practices for this type of database?
Thanks in advance,