This question is for anyone who has worked on large, complex solutions with high table and record counts.
I am using and loving the data separation model. My solution runs over the WAN and is decently performant.
I have about 110 data tables, some small, some large. I am staring down the barrel of another module that will add up to 50 tables, with more down the road.
I am considering splitting the data file into 2 or more files. My operations team does not need much accounting data, and the accountants don't look at operations very much, so I could try to split it up by division. Inevitably, though, the two data sets are going to need to access each other once in a while.
The question is: how big is too big? My data file already takes some time to open over the WAN. Should I start splitting it now?
For this scenario, let's assume that the solution has been developed with performance in mind, with minimal unstored calcs, and with all schema in the data file.
Veterans and pros: do you have any stories of high-table-count solutions that still perform well when properly designed?