TongKe Xue wrote:
I think there is a misunderstanding.
My goal is not to enforce these limits at compile time.
My goal is to enforce these limits at run time.
You exceed the memory limit? I don’t allow you to allocate more memory.
You exceed your CPU slice? I swap you out and run another thread for a while.
As another poster pointed out, this is an operating-system-dependent
function, not one that belongs in a portable high-level language. So you
now need a system administrator to detect when a process is being a CPU
or memory hog and take some kind of action.
This is exactly what an OS does for a living! I don’t know how to do
this on a Windows server, but on Linux and most other Unix-like systems,
there is a gizmo called “ulimit”. This allows you to set a policy on how
big a process can get or how much total CPU time it can accumulate. The
bad news is that, as far as I know, when the process does violate its
ulimit, it is unceremoniously terminated and expunged from the system.
If that kind of behavior is compatible with your game, great. Otherwise,
you’ll have to find a way to trap the ulimits and deal with them.
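For what it's worth, Ruby exposes setrlimit directly as Process.setrlimit,
so on Linux a rough sketch might look like the following. The specific
numbers (one second of CPU, 256 MB of address space) and the
run_simulation call are just placeholders, not a tested recipe:

  # Cap total CPU time at 1 second; the kernel sends SIGXCPU when it's exceeded.
  Process.setrlimit(Process::RLIMIT_CPU, 1)

  # Cap the address space at 256 MB; once it's hit, further allocations fail
  # and Ruby raises NoMemoryError instead of the process being killed outright.
  Process.setrlimit(Process::RLIMIT_AS, 256 * 1024 * 1024)

  # Trap SIGXCPU so a CPU overrun doesn't just terminate the process.
  Signal.trap('XCPU') do
    warn 'CPU limit exceeded -- shutting down gracefully'
    exit 1
  end

  begin
    run_simulation   # placeholder for your actual game loop
  rescue NoMemoryError
    warn 'memory limit exceeded'
  end

Note that setrlimit applies to the whole process, so it catches a runaway
script but doesn't by itself give you the per-thread swapping described
above; that bookkeeping would still have to live in your own scheduler.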
In any event, I think you’re making this hard for yourself. There are
existing simulation game frameworks that work very well, and I think
there are even some with Ruby bindings. It sounds to me like you’re
trying to reinvent some wheels rather than designing a user experience.
I’d recommend getting a good handle on the “whats” first before delving
much deeper into the “hows”.