Hi, what you described is very intriguing. It cannot be ruled out that the development version is optimized... Thanks again.
Assuming it's the same version (13.09?) - what happens if you lower the cache to the default value?
I never tested Pro against Pro Advanced - I never use Pro when Advanced is available (and I only have a VM for Windows). Will set a reminder now and test tomorrow on a site where both versions are installed; I have a test file there that writes benchmark results.
I consistently saw Advanced outperforming Pro by an average of 40% in milliseconds.
Therefore this version is called "Advanced".
Sorry for the poor joke.
I never thought of running a speed comparison this way. I wonder what the reason for the difference might be.
Have you compared the runtime too?
Trick question? Creating runtimes requires Advanced.
Advanced has added features such as the ability to create runtimes, custom functions, and the debugger. I would not think there would be a speed difference, and if there were, I would expect Advanced to be slower because of the added features. I would like to see more tests to verify siplus's results. Maybe it's a conflict from both applications being installed on the same computer. It would be nice to see testing on two otherwise identical computers, one with FMPA and the other with FMP.
Here is the comparison of FMP & FMPA. Advanced has added features, but nothing that should cause a difference in speed when running the database - only features that speed up development and help track down bugs.
siplus... are you seeing any variance when comparing Advanced to Advanced, or Pro vs Pro? While I wasn't testing Advanced vs Pro, in my ExecuteSQL tests the performance was all over the place, even with the same data set and calculations.
Especially when you add in aggregate functions or GROUP BY. I was seeing a 25%-45% variance in performance. It's not something you would normally notice, because you hide the lag in places where users don't see it. But I definitely saw a variance in the ExecuteSQL itself.
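For what it's worth, here is roughly how I summarize the timing samples. This is a minimal Python sketch; the timing values are made up for illustration, and in practice they would come from a FileMaker script that logs each ExecuteSQL call's duration in milliseconds:

```python
# Summarize repeated query timings (milliseconds) and report the
# spread between the fastest and slowest run as a percentage.
# The sample values below are illustrative, not real measurements.

def spread_percent(samples):
    """Return (min, max, spread%) where spread% = (max - min) / min * 100."""
    lo, hi = min(samples), max(samples)
    return lo, hi, (hi - lo) / lo * 100.0

# Hypothetical timings from ten runs of the same ExecuteSQL statement:
timings_ms = [412, 398, 455, 520, 401, 489, 396, 431, 547, 405]

lo, hi, spread = spread_percent(timings_ms)
print(f"fastest {lo} ms, slowest {hi} ms, spread {spread:.0f}%")
```

A spread in the high 30s, like this sample set produces, is the kind of run-to-run variance I was describing.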
Why does it interest us? Because we develop in Advanced, but the users we sell our products to use the normal version. So what appears acceptable during development and testing (with Advanced) becomes unacceptable at the client's site, where the Pro version is used instead of Pro Advanced.
There are no variances Pro vs Pro or Advanced vs Advanced, only Pro vs Advanced.
That is a weird problem. I'll have to do some performance testing to see if we are seeing the same thing.
Can you post your ExecuteSQL statement or something similar so I can try to run the same query?
Tested it this morning - forgot that v11 is on that site )-:
No differences between Pro and Advanced; there were quite a few (simple) SQL statements in the script I tested.
That is what I would expect. It will be interesting to see if Joshua gets the same results.
I've also run some testing and *not* found a difference between FMP v13 and FMPA v13 (Windows), at least given the following parameters.
Environment: FMS v12. OS: Windows 7. FMP v13.09 (32-bit), FMPA v13.09 (32-bit).
This is on a large network. I tested a couple of our solutions. One is very structurally simple but text heavy, with around 900k records with a number of indexed fields (some with several thousand characters). QFs and Finds - all very good. I also used another one - a photo library so lots of images. Also pretty good. No SQL statements involved.
I did get variations along the way, but I think they were always related to entirely unpredictable network traffic. I would open one version, use Open Remote to access the solution, run a single test, and quit. Then I would repeat the process with the other version. I ran through a number of tests. As I said, I think any variation I found was related to the network, not the app version.
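One way to smooth out that kind of network noise (a sketch of the idea, not what I actually ran) is to repeat each test several times per version and compare medians rather than single runs, since the median is far less sensitive to one network-delayed outlier:

```python
# Compare two sets of repeated timings by their medians, since a single
# network-delayed run can badly skew a mean. Numbers are illustrative.
from statistics import median

pro_runs_ms = [1210, 1185, 2940, 1202, 1198]    # one run hit network congestion
fmpa_runs_ms = [1190, 1204, 1188, 1215, 1196]

print(f"Pro median:  {median(pro_runs_ms)} ms")
print(f"FMPA median: {median(fmpa_runs_ms)} ms")
# The medians stay close even though the Pro mean is inflated by the outlier.
```

With single-run testing, that one congested run would have made Pro look more than twice as slow; the medians show the two clients performing essentially the same.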