Best way to start a new Ruby script?

I have a long running process that I want to start from a Rails request.
But, because it takes so long to execute, I want to start it in a
separate process and then return. What is the best way to do that?

Thanks

/Marcus

2006/3/8, Marcus A. [email protected]:

I have a long running process that I want to start from a Rails request.
But, because it takes so long to execute, I want to start it in a
separate process and then return. What is the best way to do that?

You have several options - at least these:

  • do it in a thread

  • create a new process using fork’s block form, which executes the
    block in the child

  • create a completely new process via a standard fork call

Which one is best depends on the problem you are trying to solve, the
number of CPUs available, personal taste, etc.
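
A minimal sketch of the first two, with a hypothetical
do_long_running_work standing in for the actual job:

# option 1: a thread inside the current process
t = Thread.new { do_long_running_work }
# the request can return now; call t.join later if you ever need to wait

# option 2: fork's block form - the block runs only in the child
pid = fork { do_long_running_work }
Process.detach(pid)   # reap the child so it doesn't linger as a zombie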

Kind regards

robert

Robert K. wrote:

  • create a new process using fork’s block form, which executes the
    block in the child

  • create a completely new process via a standard fork call

Which one is best depends on the problem you are trying to solve, the
number of CPUs available, personal taste, etc.

Portably, the following works, as long as you don’t mind not having
access to its stdin/out:

t = Thread.new { system "something" }

You can use t.value to wait for the process to finish and determine the
result code of the system call (true or false).
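
For example (the command name is only a placeholder):

t = Thread.new { system("some_long_command") }
# ... finish handling the request ...
ok = t.value   # blocks until the command exits; true on success,
               # false on a non-zero exit status, nil if it couldn't be run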

If you want to read/write to the process, look into the various pipe
openers: Kernel#open("|something"), Open3, etc.
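
A rough sketch of both styles, again with a placeholder command:

# Kernel#open with a leading pipe yields an IO connected to the command
open("|some_command") { |io| puts io.read }

# Open3 also gives you access to stderr
require 'open3'
Open3.popen3("some_command") do |stdin, stdout, stderr|
  stdin.close
  puts stdout.read
end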

Keep in mind that Rails and fork don’t get along very well. If you
just plain fork like you would in a normal Ruby script you will get
the “MySQL server has gone away” error and your db connection will be
broken. Here is a little hack that will let you get away with forking
without losing your db connection in Rails:

fork do
  LongProcess.do_something(that_takes_a_long_time)
  Kernel.exec "echo -n"
end

That last Kernel.exec "echo -n" is the hack that will keep your db
connection from “going away”

Cheers-
-Ezra

On Thu, 9 Mar 2006, Ezra Z. wrote:

Keep in mind that Rails and fork don’t get along very well. If you
just plain fork like you would in a normal Ruby script you will get
the “MySQL server has gone away” error and your db connection will be
broken. Here is a little hack that will let you get away with forking
without losing your db connection in Rails:

fork do
  # all open file handles duped
  LongProcess.do_something(that_takes_a_long_time)
  Kernel.exec "echo -n"
  # all duped file handles flushed and closed!
end

That last Kernel.exec "echo -n" is the hack that will keep your db
connection from “going away”

Cheers-
-Ezra

also, forking while in a db transaction is a bad idea. not to mention
fastcgi…

best to start an external job runner daemon and use it via drb. one is
included in the rq source but it’s a bit bundled… i coded it for exactly
the reasons the op has expressed.

kind regards.

-a

[email protected] wrote:

best to start an external job runner daemon and use it via drb. one is
included in the rq source but it’s a bit bundled… i coded it for exactly
the reasons the op has expressed.

I ended up doing a simple drb service external to the Rails application.
I didn’t want to do it at first since I’ve been doing similar things
with Java RMI…

It starts a new thread (apart from the threads drb starts itself) on
every request in order to return immediately. Works well. It was
extremely simple code. Now I only have to make it a daemon/service.
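
For reference, a rough sketch of that kind of service (the class name,
the port, and the LongProcess call reused from earlier in the thread are
all placeholders):

# job_server.rb - standalone DRb service, run outside Rails
require 'drb'

class JobRunner
  def run(param)
    # do the work in a background thread so the DRb call returns immediately
    Thread.new { LongProcess.do_something(param) }
    true
  end
end

DRb.start_service("druby://localhost:9000", JobRunner.new)
DRb.thread.join

# and in the Rails action:
require 'drb'
runner = DRbObject.new_with_uri("druby://localhost:9000")
runner.run(some_param)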

/Marcus

Doing it in a thread seems like the simplest solution, but I don’t know
if it works in this setup with Rails and SCGI (Rails may clean up
resources that the thread uses when the request returns while the thread
is still running).

I just want to be able to invoke the external script in an async manner
(passing along a parameter as well somehow), and in a way that doesn’t
interfere with Rails’ way of doing threads (or not doing threads…).
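
If fork is off the table, something along these lines might do; the
script name and parameter are placeholders, and passing the arguments
separately to system keeps the parameter out of the shell:

Thread.new do
  system("ruby", "long_job.rb", the_parameter.to_s)   # runs in its own process
end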

Thanks for the answer.

/Marcus

Robert K. wrote: