Hi all,
I've recently been trying to figure out a way to make a long-running
background task scale across all CPUs in Ruby on Rails. Multi-processing
(not multi-threading) seems like the way to go.
A theoretical example would be:
code in controller:

def do_something
  @mydata = Hash.new
  # Start 4 background processes
  # and have them add data to @mydata.
  # The page loads while the processes are working, displaying
  # something like "We are currently processing your data".
end
The question is how to handle reading from and writing to @mydata safely
when four processes are all accessing it. What is the best way to
approach this problem?
Thanks,
Petr
I wouldn't run a long-running process directly from your controller:
Rails first renders everything that should be sent to the client
(HTML/XML/JS/etc.) before it actually sends it, so the complete
controller action has to finish before anything shows up in your
browser.
I think that the best way to go is to use something that is able to
spawn background processes (like Starling).
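To illustrate the fire-and-forget part (not Starling itself, which is a queue server that separate workers would poll), here is a minimal sketch of spawning a detached background process from Ruby. The command and the `run_in_background` name are illustrative, not from the thread:

```ruby
# Spawn a command as a background process and detach it, so the
# current (e.g. Rails) process doesn't block and the child doesn't
# become a zombie when it exits.
def run_in_background(*cmd)
  pid = Process.spawn(*cmd, out: File::NULL, err: File::NULL)
  Process.detach(pid)  # reaps the child's exit status in a watcher thread
  pid
end
```

The caller gets the PID back immediately and can render the "we are processing your data" page while the child works.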
Multiple processes can't access the same in-memory Hash; multiple
threads within one process can. Maybe a dirty solution is to keep the
data in the database and use pessimistic locking on a record.
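The same pessimistic-locking idea can be sketched without a database, using `File#flock` as the cross-process exclusive lock and a Marshal'd Hash on disk as the shared store. The `SharedHash` class and its file layout are made up for illustration; with Rails you would more likely use ActiveRecord's row locking instead:

```ruby
# A Hash shared between processes via a file, guarded by flock.
# All names here are illustrative, not from the thread.
class SharedHash
  def initialize(path)
    @path = path
    # NOTE: creation itself isn't race-safe; create the file once,
    # before forking any workers.
    File.write(@path, Marshal.dump({})) unless File.exist?(@path)
  end

  # Read-modify-write under an exclusive cross-process lock.
  def update
    File.open(@path, File::RDWR) do |f|
      f.flock(File::LOCK_EX)        # blocks until this process holds the lock
      data = Marshal.load(f.read)
      yield data
      f.rewind
      f.write(Marshal.dump(data))
      f.flush
      f.truncate(f.pos)             # shrink the file if the Hash got smaller
    end                             # lock released when the file is closed
  end

  def read
    File.open(@path, 'rb') do |f|
      f.flock(File::LOCK_SH)        # shared lock: readers don't block readers
      Marshal.load(f.read)
    end
  end
end
```

Each worker process opens the same path and calls `update { |h| ... }`; the lock serializes the read-modify-write cycles, so no increments are lost.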
Wouter de Bie wrote:
I wouldn't run a long-running process directly from your controller:
Rails first renders everything that should be sent to the client
(HTML/XML/JS/etc.) before it actually sends it, so the complete
controller action has to finish before anything shows up in your
browser.
I think that the best way to go is to use something that is able to
spawn background processes (like Starling).
I'm aware of different solutions for running a background job. The
problem, however, is how to make 4 background jobs MP-safe (safe across
multiple processes) when all of them are working on the same Hash. Any
ideas?
Petr
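One way to sidestep the MP-safety question entirely, sketched here under the assumption that the per-key work is independent: don't share the Hash at all. Each forked worker computes its own partial Hash and sends it back to the parent over a pipe; the parent merges the partials, so no process ever writes to another's data. The `compute_in_parallel` name and the upcasing "work" are placeholders:

```ruby
# Split the keys across N forked workers; each worker sends its partial
# result back through a pipe, and the parent merges the pieces.
def compute_in_parallel(keys, workers: 4)
  slices = keys.each_slice((keys.size.to_f / workers).ceil).to_a
  readers = slices.map do |slice|
    reader, writer = IO.pipe
    fork do
      reader.close
      partial = slice.each_with_object({}) do |key, hash|
        hash[key] = key.to_s.upcase   # stand-in for the real per-key work
      end
      writer.write(Marshal.dump(partial))
      writer.close                    # child exits at end of block
    end
    writer.close                      # parent keeps only the read end
    reader
  end
  result = {}
  readers.each { |r| result.merge!(Marshal.load(r.read)); r.close }
  Process.waitall
  result
end
```

Because each worker only ever touches its own slice, there is nothing to lock; the merge happens in a single process after the children finish.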