Some FileMaker shops run like the wild west, full of villains and vigilantes. The sheriff should be able to set some basic rules to help everyone get along.
The current unregulated approach to server resources means that any one client or process can hammer the server with expensive disk or CPU work. An expensive task could conceivably take seconds, minutes, or even hours, and the perceived performance for every other client suffers; they might even behold the dreaded beachball. If that resource hog happens to be a real villain, we would call this a denial-of-service attack.
One feature that might help would be resource throttling, configured as global settings in the server's admin console. For example:
Limit a single client's database engine resources to:
__90%_ CPU utilization
__200_ I/O operations per second (per volume?)
_50.0_ MB transferred per second
Limit a single client's PSoS resources to:
___20_ scripts executed per second
____5_ concurrent executing scripts
Blacklist an IP address for __5_ minutes after _10_ failed logins.
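The last setting above is straightforward to sketch. Here is a minimal, hypothetical in-memory version of that lockout policy (the function names, dictionaries, and thresholds are all illustrative, not anything FileMaker Server actually exposes):

```python
import time

# Hypothetical policy matching the setting above:
# block an IP for 5 minutes after 10 failed logins.
MAX_FAILURES = 10
LOCKOUT_SECONDS = 5 * 60

failures = {}      # ip -> consecutive failed-login count
locked_until = {}  # ip -> epoch seconds when the block expires

def is_blocked(ip, now=None):
    """True while the IP's lockout window is still open."""
    now = time.time() if now is None else now
    return locked_until.get(ip, 0) > now

def record_login(ip, success, now=None):
    """Update counters after a login attempt; returns True if ip just got blocked."""
    now = time.time() if now is None else now
    if success:
        failures.pop(ip, None)  # a good login clears the streak
        return False
    failures[ip] = failures.get(ip, 0) + 1
    if failures[ip] >= MAX_FAILURES:
        locked_until[ip] = now + LOCKOUT_SECONDS
        failures.pop(ip, None)
        return True
    return False
```

A real server would want this state shared across worker processes and expired periodically, but the counting logic is this simple.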
Obviously a blocking operation that locks out other users from a table or record should run at full speed until it completes. Keeping a per-connection scoreboard of recent usage means the penalty can be applied later, by delaying that client's next request.
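One common way to implement that scoreboard idea is a token bucket per client connection: each request spends tokens, tokens refill at the configured rate (say, 200 I/O ops per second), and when the bucket runs dry the server delays the next request rather than interrupting the current one. This is a generic sketch, not FileMaker's actual mechanism, and the class and parameter names are made up:

```python
import time

class TokenBucket:
    """Per-client scoreboard: requests spend tokens; an empty bucket
    delays the client's NEXT request instead of aborting the current one."""

    def __init__(self, rate, capacity, now=None):
        self.rate = rate          # tokens refilled per second (e.g. 200 IOPS)
        self.capacity = capacity  # burst allowance
        self.tokens = float(capacity)
        self.last = time.monotonic() if now is None else now

    def _refill(self, now):
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now

    def delay_for(self, cost, now=None):
        """Seconds the client must wait before its next request of `cost` ops.
        Tokens may go negative: the debt is charged to the next request."""
        now = time.monotonic() if now is None else now
        self._refill(now)
        self.tokens -= cost
        if self.tokens >= 0:
            return 0.0
        return -self.tokens / self.rate
```

For example, a client limited to 200 ops/second that burns its whole burst allowance ends up owing time, and only its own later requests pay that debt; other clients' buckets are untouched.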
Some might also like to see resource limits for WebDirect. I imagine that can be done already in Apache or at a firewall, but it would probably be most convenient to include it in the same console as well.
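For the interim, something like Apache's stock mod_ratelimit can cap per-connection bandwidth at a reverse proxy sitting in front of WebDirect. A hedged sketch (the `/fmi/webd` path is WebDirect's usual URL, but the location and the 512 KiB/s figure here are just placeholders for whatever an admin would choose):

```apache
# Requires: LoadModule ratelimit_module modules/mod_ratelimit.so
# rate-limit is expressed in KiB/s per connection.
<Location "/fmi/webd">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 512
</Location>
```

This only throttles response bandwidth, though; it knows nothing about CPU, IOPS, or scripts, which is why doing it natively on the server would be better.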