We’re starting to notice a pattern in our Rails app that there are
asynchronous actions we’d like to take after the response is rendered
and shipped to the browser. Examples include queuing email and
writing statistics to the database.
It would be cool if there were a hook in Rails that would allow us to
throw this work onto a queue and have it done in the same process but
just after the response is sent back to the browser.
Anyone know of a hook like this, or a plugin that creates one?
-Colin
Hi Colin,
Colin Kelley wrote:
Anyone know of a hook like this, or a plugin that creates one?
If I understand your question correctly, you’re looking for a distributed
Ruby library / plugin. The standard library includes dRuby. The choices
re: plugins include BackgrounDRb, which was the standard choice for quite a
while. There are other options available today which may suit your needs
better, depending. Google ‘distributed ruby plugins’.
WRT doing the work in the same process that handled the request, the Rails
architecture pretty much precludes that, at least as a ‘best practice.’
HTH,
Bill
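[For context: dRuby from the standard library takes very little code to wire up. A minimal sketch follows — the TaskQueue class, port, and task names are illustrative, not from the thread; server and client would normally be separate processes.]

    require 'drb/drb'

    # A trivial object to expose over dRuby; clients call its methods remotely.
    class TaskQueue
      def initialize
        @tasks = []
      end

      def push(task)
        @tasks << task
      end

      def pop
        @tasks.shift
      end
    end

    # Server side: publish the queue object on a loopback port.
    DRb.start_service('druby://localhost:8787', TaskQueue.new)

    # Client side (normally a separate process): a proxy to the remote queue.
    queue = DRbObject.new_with_uri('druby://localhost:8787')
    queue.push('deliver_signup_email')
    popped = queue.pop  # => "deliver_signup_email"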
Colin Kelley wrote:
Thanks Bill for the quick response.
Distributed Ruby is pretty heavyweight for what I have in mind. It has
a lot of moving parts. I was thinking of something lightweight that would
just allow me to time-shift a few operations that can take a second or two,
so that the page is returned first and those operations are performed
afterward. (Example: queue an email.) I’d like to have access to the full
cached model when those post_processing operations run.
BackgrounDRb gives you access to your app’s models. I haven’t used any of
the available alternatives, but I’d expect them to do the same.
I poked around the source a little today and it seems feasible to patch
in at the RailsHandler::process(request, response) level. That method
could be aliased and replaced with a new process method:

    alias :old_process :process
    def process(request, response)
      old_process(request, response)
      post_process
    end

where post_process was something like

    within_timer_block do
      while task = @@background_tasks.shift
        task.call
      end
    end

(The controllers would queue their slow operations onto @@background_tasks.)
The timer supervision is to keep these tasks from taking too much time
away from serving the next pages. And it would be best to not allow more
than one post_processing mongrel at a time for the same session (or
possibly IP:port), lest a runaway Ajax loop consume all the mongrels, for
example.
Does this sound feasible? Anyone know of some work along these lines
already?
I would absolutely recommend against this approach. You’re adding at least
three levels of complexity when one would do. And the ones you’re
contemplating are dangerous. Using a distributed process like BackgrounDRb
solves your problem with a tested and supported solution. I was using the
‘old’ one, primarily because it also ran on Windows. And I didn’t find it
‘heavyweight’ at all. I’d strongly recommend you try out the existing
solutions to this very common problem before you go hacking Rails’ core.
Best regards,
Bill
Bill W. wrote:
BackgrounDRb gives you access to your app’s models.
Excellent–I wasn’t aware of that.
I’d strongly recommend you try out the existing
solutions to this very common problem before you go hacking Rails’ core.
I will follow your advice and try out BackgrounDRb and perhaps one of
the alternatives as well.
Thanks again,
-Colin
Bill W. wrote:
If I understand your question correctly, you’re looking for a distributed
Ruby library / plugin. The standard library includes dRuby.
…
WRT doing the work in the same process that handled the request, the Rails
architecture pretty much precludes that, at least as a ‘best practice.’
Thanks Bill for the quick response.
Distributed Ruby is pretty heavyweight for what I have in mind. It has
a lot of moving parts. I was thinking of something lightweight that would
just allow me to time-shift a few operations that can take a second or two,
so that the page is returned first and those operations are performed
afterward. (Example: queue an email.) I’d like to have access to the full
cached model when those post_processing operations run.
I poked around the source a little today and it seems feasible to patch
in at the RailsHandler::process(request, response) level. That method
could be aliased and replaced with a new process method:
    alias :old_process :process
    def process(request, response)
      old_process(request, response)
      post_process
    end
where post_process was something like
    within_timer_block do
      while task = @@background_tasks.shift
        task.call
      end
    end
(The controllers would queue their slow operations onto @@background_tasks.)
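[The queue-and-drain idea above can be sketched as self-contained Ruby. Note within_timer_block is not a real method; the stdlib Timeout module stands in for it here, and the class and task names are illustrative.]

    require 'timeout'

    # Stand-in for the proposed post_process hook: drain queued tasks
    # after the response has been sent, under a fixed time budget.
    class DeferredQueue
      def initialize(budget_seconds)
        @budget = budget_seconds
        @tasks  = []
      end

      # Controllers would call this to time-shift slow work.
      def enqueue(&block)
        @tasks << block
      end

      # Called after the response goes out; stops early if over budget,
      # leaving any remaining tasks queued for the next cycle.
      def post_process
        Timeout.timeout(@budget) do
          while task = @tasks.shift
            task.call
          end
        end
      rescue Timeout::Error
        nil
      end
    end

    done  = []
    queue = DeferredQueue.new(2)
    queue.enqueue { done << :email_queued }
    queue.enqueue { done << :stats_written }
    queue.post_process
    done  # => [:email_queued, :stats_written]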
The timer supervision is to keep these tasks from taking too much time
away from serving the next pages. And it would be best to not allow more
than one post_processing mongrel at a time for the same session (or
possibly IP:port), lest a runaway Ajax loop consume all the mongrels, for
example.
Does this sound feasible? Anyone know of some work along these lines
already?
The Rails Wiki lists all possibilities -
http://wiki.rubyonrails.org/rails/pages/HowToRunBackgroundJobsInRails
The simplest (and most low-level) solution I found was to use
script/runner (from
http://www.ahabman.com/blog/2008/05/rails-background-process-simple-fast-easy/)
    class MyController
      def in_background
        # your normal code here
        system "RAILS_ENV=#{RAILS_ENV} ruby #{RAILS_ROOT}/script/runner 'MyModel.some_method_to_run_in_background' &"
      end
    end
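[For completeness, the shell-out above can also be done from Ruby itself with fork on Unix-like systems. A hedged sketch — the pipe is only there to demonstrate the child ran; real code would not wait on it.]

    # Spawn the slow work in a child process so the request returns
    # immediately; Process.detach reaps the child to avoid zombies.
    r, w = IO.pipe

    pid = fork do
      r.close
      # In the app this would be e.g. MyModel.some_method_to_run_in_background
      w.write('background work done')
      w.close
    end

    w.close
    Process.detach(pid)
    result = r.read  # a real handler would skip this and return at once
    result  # => "background work done"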