Many apps == Many processes?

Hi

I’m running several smallish apps each with small audiences and few
users, all on the same server. We’re currently deploying on
Apache+FastCGI with few problems, except the fact that each app starts
a new dedicated Ruby process. The combined performance demand on the
server shouldn’t be very high, but since the number of apps is in the
thirties and each of them uses its own Ruby process, this
results in quite a bit of overhead.

My question is whether this can be avoided. Is it possible to run many apps
that reuse the same Ruby processes and thus result in less pressure on
the server? What are my options?

Best regards,
Tomas J., Sweden

Tomas J. wrote:


This part is pretty hard. I just ran into the same situation and I
still haven’t found an answer.

I was able to lessen the load.

In lighttpd you can set the number of FCGI processes per app. I set
this to 1 for each app and was able to push the server (Athlon XP 2600,
1.5GB RAM) to about 90 sites with decent traffic. I think you can do
this with Apache also.
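
For reference, the lighttpd side looks roughly like this per app, inside
that app’s virtual-host section (the app name and paths are just examples,
not my actual setup):

  fastcgi.server = ( ".fcgi" => (
    "app1" => (
      # exactly one dispatch.fcgi process for this app
      "socket"    => "/tmp/app1.fcgi.socket",
      "bin-path"  => "/var/www/app1/public/dispatch.fcgi",
      "max-procs" => 1
    )
  ))

You still end up with one Ruby process per app -- this only caps how many
processes each app can spawn, it doesn’t let several apps share one.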

The problem is that each app needs its own FCGI process.

With the current Capistrano type deployment, you have one app with many
users. This type of installation is much easier, because all of your
users go to one app, but in your case, you would need to separate them
and that’s when stuff gets a little crazy.

Sorry,

–jake

Yes, there is. Take a look at LiteSpeed. It creates processes as
necessary, that is, dynamically.

/ Peter

Right now I don’t think there is. You have to run at least 1 process per
Rails app. This is true with FCGI and Mongrel. CGI runs and completes its
process after Rails does its business (I believe).

The only hope for situations like this might be the jRuby project.
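
To make the “one process per app” constraint concrete, with Mongrel you’d
start a separate instance on its own port for every application, roughly
like this (the ports and paths here are made up for the example):

  # one Mongrel -- and therefore one Ruby process -- per application
  mongrel_rails start -d -e production -p 8001 -c /var/www/app1
  mongrel_rails start -d -e production -p 8002 -c /var/www/app2
  # ...and so on for each of the thirty-odd apps

There’s no way to point two of those application directories at a single
running instance.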


seth at subimage interactive

http://www.subimage.com
http://sublog.subimage.com

http://dev.subimage.com/projects/substruct

That’s what I was afraid of…

Peter, I think you misunderstood the question; I believe LiteSpeed too
creates one process per app (key word: per app) and THEN creates more
as necessary, as per your description. If you, indeed, didn’t
misunderstand and aren’t mistaken, I’d love it if you could confirm
it.

However, I don’t think it’s possible, because I suspect this is an
issue with Rails itself, not with the various dispatching solutions.
Since each process caches the model classes in memory, how could one
process possibly contain several apps? Severe changes to Rails itself
would be necessary for a single Ruby process to be used by multiple
Rails apps at once. Again, I’m only almost certain of this; if somebody
with definite knowledge on the subject can confirm or deny it, I’d
really appreciate it.
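
To illustrate what I suspect the problem is (a made-up minimal example,
not code from any real app): if two apps each define a top-level User
model, they collide as soon as both are loaded into the same interpreter:

  # app_one/app/models/user.rb
  class User
    def greeting; "Hello from app one"; end
  end

  # app_two/app/models/user.rb -- reopens the very same User constant
  class User
    def greeting; "Hello from app two"; end
  end

  # In one Ruby process that has loaded both apps:
  User.new.greeting   # => "Hello from app two" -- app one's version is gone

Unless every app were wrapped in its own namespace, I don’t see how a
single process could keep thirty sets of models apart.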

Does anyone know if this issue is being investigated at all, if anyone
has addressed this concern before? As it stands now, Rails is an
excellent choice if you only run one or a couple of big apps on the
same server… but for many smallish apps, it’s barely usable, which
IMHO is a damn shame.

Best regards,
Tomas J.

On Dec 28, 2006, at 3:58 PM, subimage interactive wrote:

Right now I don’t think there is. You have to run at least 1
process per Rails app. This is true with FCGI and Mongrel. CGI runs
and completes its process after Rails does its business (I believe)

The only hope for situations like this might be the jRuby project.

Headius: The Beginning of JRuby on Rails

_why has been doing some interesting work with sandboxing in the ruby
interpreter that may help with this someday too:

http://redhanded.hobix.com/inspect/theThrillingFreakyFreakySandboxHack.html

-JD Harrington

What does the development speed have to do with being able to run
multiple apps per process, or not being able to? I fail to see how the
two have any relation whatsoever.

On Dec 28, 2006, at 4:58 PM, Tomas J. wrote:

Does anyone know if this issue is being investigated at all, if anyone
has addressed this concern before? As it stands now, Rails is an
excellent choice if you only run one or a couple of big apps on the
same server… but for many smallish apps, it’s barely usable, which
IMHO is a damn shame.

It’s only a usability problem with respect to resources, and the only
way to measure that is against the entire pool of resources required
to create and maintain the application as well.

For instance, let’s say that Ruby is 16 times as expensive in terms
of memory as something else. What is the cost of that memory compared
to the cost advantage (if any!) of development speed and maintainability
of the application itself?

Memory is not expensive these days, and it’s getting less expensive
each and every day.


– Tom M., CTO
– Engine Y., Ruby on Rails Hosting
– Reliability, Ease of Use, Scalability
– (866) 518-YARD (9273)

So, in what way are they mutually exclusive? How would it negatively
impact the development speed if Rails apps could share processes?

I’m not a newbie to Rails, I’ve been using it for over two years and
I’m quite familiar with the productivity gains. Nowhere have I claimed
otherwise. I’m just having a performance problem even though I have
small audiences and few users. If apps could share processes, it would
decrease costs significantly, since each server could be used with
less overhead.

Tomas, did you consider trying out mod_ruby?

Andre

Tomas J. wrote:

What does the development speed have to do with being able to run
multiple apps per process, or not being able to? I fail to see how the
two have any relation whatsoever.

Both contribute to the total cost of ownership of the application.

Rails may use more memory than other web technologies, but memory is
cheap - if Rails saves development and maintenance effort, and gets you
to market faster, that’s an overall win.

regards

Justin F.

Tomas J. wrote:

So, in what way are they mutually exclusive? How would it negatively
impact the development speed if Rails apps could share processes?

In principle they are orthogonal. In practice, they are constrained by
the way Rails is at present.

Nobody has suggested that it would be a bad thing if a process could
host more than one Rails application - Dave T. has suggested it
would be good for Rails to have a ‘container’ that applications could be
deployed into, and _why’s sandbox appears to be a step in that direction.

I’m not a newbie to Rails, I’ve been using it for over two years and
I’m quite familiar with the productivity gains. Nowhere have I claimed
otherwise. I’m just having a performance problem even though I have
small audiences and few users. If apps could share processes, it would
decrease costs significantly, since each server could be used with
less overhead.

I picked up on the memory aspect, because it is frequently discussed and
because Tom gave memory use as an example of resource use - but you seem
to be saying that you have performance problems relating to the number
of processes running, before memory becomes a problem. Can you say more
about this?

regards

Justin

I’m using it for internal applications here and it’s working fine.
Version 1.2.6 has a RailsDispatcher that addresses the problems that
occurred with the shared interpreters in previous versions.

I still haven’t made it available for our customers, but it might be
worth giving it a try, since at least in theory it should do what you
need.
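
For reference, the Apache side of my setup looks roughly like the
following. I’m reconstructing this from memory, so treat the directive
and option names as approximate and check them against Shugo’s post and
the mod_ruby documentation; the paths are just examples:

  RubyRequire apache/rails-dispatcher
  <Location /app1>
    SetHandler ruby-object
    RubyHandler Apache::RailsDispatcher.instance
    # which Rails tree this location serves, and in which environment
    RubyOption rails_root /var/www/app1
    RubyOption rails_env production
  </Location>

You add one such <Location> block per application, and they all share the
Ruby interpreter embedded in each Apache process instead of each app
spawning its own FCGI processes.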

Regards,
Andre

but you seem to be saying that you have performance problems relating to
the number of processes running, before memory becomes a problem. Can you
say more about this?

I’m the same guy. I’m asking questions related to how to improve the
per-server performance of Rails. Since I’m running several apps, the
server starts a new Ruby process for each app. This means I have
thirty or more Ruby processes running. However, just one or two
processes would suffice to handle the combined load. That means the
server uses 15 to 30 times more processes than necessary, simply
because of how Rails internals work. If Rails apps could share
processes, I would get 15-30 times more performance or more PER
SERVER. That’s one of them… “limitations”, to misquote Bush.

So I’m here to ask if anyone else has run into this particular problem
of running many small apps and having the performance sucked out of
the server NOT due to overwhelming numbers of visitors, but because
apps can’t share processes.

That’s all. I’m already aware of the productivity gains of using
Rails, have been for over two years.

Is mod_ruby even a viable alternative? I’ve never heard of anyone
using it for RoR. Actually, I did think about it, but wrote it off
based on what I just said. Was that in haste?

That’s very interesting. I just read a blog post by Shugo regarding
the topic:
http://blog.shugo.net/articles/2005/08/03/running-rails-on-mod_ruby

Running Rails on mod_ruby, does it come with any pitfalls or
limitations?

Regards,
Tomas

Yes, I decided to try it after reading Shugo’s blog.

The only pitfall I found so far is that everything in your application
is “packed” into the Apache::RailsDispatcher::CURRENT_MODULE module. I
had some code that depended on the class of a given object to
choose the action it would take, and I had to fix it by using
“self.class.to_s.demodulize” instead of just “self.class.to_s” to get
just “Foo” instead of “Apache::RailsDispatcher::CURRENT_MODULE::Foo”.
It’s no big issue after you realize what’s going on, I guess.
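
For example (Foo is just a stand-in for whatever class the code switches
on, and the handle_* helpers are hypothetical):

  # Under RailsDispatcher the app's constants live inside a wrapper module,
  # so the fully qualified class name is not what the old code expected.
  obj.class.to_s             # => "Apache::RailsDispatcher::CURRENT_MODULE::Foo"
  obj.class.to_s.demodulize  # => "Foo"   (String#demodulize comes with ActiveSupport)

  # So the dispatch-on-class code becomes:
  case obj.class.to_s.demodulize
  when "Foo" then handle_foo(obj)   # whatever the Foo branch actually did
  when "Bar" then handle_bar(obj)   # whatever the Bar branch actually did
  end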

Andre

Thanks, that’s very interesting. I’ll have to investigate this further.

Best regards,
Tomas J.

I wasn’t suggesting that they are mutually exclusive or that I wouldn’t
like the situation to be different.

But, right now, they are mutually exclusive: There’s no way to use
Rails (and therefore gain its benefits) without incurring this cost.


– Tom M., CTO
– Engine Y., Ruby on Rails Hosting
– Reliability, Ease of Use, Scalability
– (866) 518-YARD (9273)

Tomas J. wrote:

but you seem to be saying that you have performance problems relating to
the number of processes running, before memory becomes a problem. Can you
say more about this?

I’m the same guy.

Tom = Tom M., the person whose answer caused you to ask why he
brought up the issue of speed of development.

I’m asking questions related to how to improve the
per-server performance of Rails. Since I’m running several apps, the
server starts a new Ruby process for each app. This means I have
thirty or more Ruby processes running. However, just one or two
processes would suffice to handle the combined load. That means the
server uses 15 to 30 times more processes than necessary, simply
because of how Rails internals work. If Rails apps could share
processes, I would get 15-30 times more performance or more PER
SERVER. That’s one of them… “limitations”, to misquote Bush.

You would get the same performance with much less memory use.

Having 30 idle processes should have negligible impact on performance,
unless their memory use is causing paging or swapping.
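
To put rough numbers on that (the per-process figure is a guess, not a
measurement): if each Rails FCGI process sits at, say, 30 MB resident,
then

  30 apps x 1 process x ~30 MB  =  roughly 900 MB

of RAM doing mostly nothing, against perhaps 60 MB for the one or two
shared processes you say could handle the combined load. The performance
hit only shows up once that idle footprint pushes the machine into
swapping.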

So I’m here to ask if anyone else has run into this particular problem
of running many small apps and having the performance sucked out of
the server NOT due to overwhelming numbers of visitors, but because
apps can’t share processes.

That’s all. I’m already aware of the productivity gains of using
Rails, have been for over two years.

Has anyone suggested you are not?

regards

Justin