1) There is no one "best way", but do keep your design to a single database shared by all departments. Your post clearly describes data that needs to be available to more than one department. Note that a "database" can consist of one database file with many tables or can be made of many files; there are trade-offs inherent to each approach. The data modeling (deciding how many tables to define and what fields should be defined in each) is critical to getting a good database design, and it requires knowledge not available to people responding to your post.
3) You are more likely to run out of hard drive space before your database will be "too big" for FileMaker, but very large data sets (say, more than 10,000 records) require careful handling, and there does come a point (with many more records than that) where other database systems, better designed to handle such large amounts of data, should be evaluated as a better option.
Thanks for your response
So if I am understanding things correctly, the 'best' approach is to have one database, with different tables/files covering the range of information.
Are you also saying that it is too big a project for a newbie?
I cannot judge from here whether or not this project is too big for you. I don't know your capabilities and I don't have a full understanding of the scope of your project.
I can suggest that you start small, Saving Sequential Back Ups During Development, and add new features and modules a bit at a time with lots of testing. Be prepared to modify your design, even if you have to discard days of work, if you figure out a better approach after the fact.
And I can also strongly recommend that you acquire FileMaker Advanced if you have not already done so. The script debugger and data viewer tools alone will save you many hours of effort when a part of your database doesn't work as expected.
Thank you very much, I will take all that into consideration.
Cheers for your help
Did you mean 10,000,000 records perhaps? Or 100 million? 10,000 seems pretty small.
Any number we give is necessarily vague, as there are many more variables at play than just the total number of records.
Note that I am NOT saying you should change database apps if you have more than 10,000 records in a table. What I AM saying is that by the time your record counts start getting that large, sorts and finds on unstored/unindexed fields and aggregate/summary calculations over the entire table start generating noticeable delays, and you have to start employing scripts and other tools to help avoid or minimize those delays. I have one file where a LineItems table exceeds 2 million total records. It works just fine, but there are certain reports in that file where I am very careful not to do a Show All Records during regular business hours...
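The indexing point above isn't unique to FileMaker. As a rough illustration only (using SQLite in Python rather than FileMaker, with a made-up `line_items` table and `sku` field), a find on an indexed field can jump straight to matching records, while a find on an unindexed field must scan every record in the table, which is where the delays come from as record counts grow:

```python
import sqlite3

# Illustration only: SQLite stands in for FileMaker here, and the
# table/field names (line_items, sku) are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE line_items (id INTEGER PRIMARY KEY, sku TEXT)")
conn.executemany(
    "INSERT INTO line_items (sku) VALUES (?)",
    ((f"SKU-{n:07d}",) for n in range(100_000)),
)

query = "SELECT * FROM line_items WHERE sku = 'SKU-0099999'"

# Without an index on sku, the query plan is a full-table scan:
# every one of the 100,000 records is examined.
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(row[-1])  # plan detail mentions a SCAN of line_items

# With an index, the same find becomes an index lookup.
conn.execute("CREATE INDEX idx_sku ON line_items (sku)")
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(row[-1])  # plan detail mentions a SEARCH using idx_sku
```

FileMaker builds and uses its indexes automatically for stored fields; unstored calculation fields can't be indexed at all, which is why finds and sorts on them are the first operations to slow down.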
In reading over the responses, there is one point I would consider: access to the databases. Are the divisions geographically separated? If so, would response time be improved if the divisions accessed the data locally rather than remotely?
A very good point. My assumption had been that the departments are all on the same LAN, but that is purely an assumption.