Non-blocking calls to external APIs

hi -

Sure, this is becoming a more common issue for many people: from within a
web app, how do you interact with other systems in a non-blocking way?
I.e., I don’t need to wait for a response from the other system within
the cycle of the same user request/response.

I need to send some logging information off to another system via
REST/HTTP. In a Rails app we’re currently using DelayedJob, which stores
the events to the DB while another process sends them. This is a pretty
heavyweight process.

A simplistic way would be just shelling out to “curl &”, but that will
spawn a system process. I’m not sure about the risks involved with that:
it means one Nginx/Rails instance will have to wait for it to complete,
and it may run into memory issues.

Are there any ways to use the net/http library to “fire and forget” a
call? Or perhaps a simple implementation using threads that would be
less resource-intensive than “curl &”?

thanks…

/dc

On Sat, Mar 27, 2010 at 11:29 PM, David C. [email protected] wrote:


Posted via http://www.ruby-forum.com/.

EventMachine sounds like it fits your needs. This presentation says it
can handle 5-10k concurrent connections in a single Ruby process:
http://timetobleed.com/eventmachine-scalable-non-blocking-io-in-ruby/

Here is the RubyGems page: http://rubygems.org/gems/eventmachine
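For the fire-and-forget case, a minimal EventMachine sketch might look
like this. It assumes the em-http-request gem for the HTTP client; the
URL, the payload, and the fire_and_forget name are all made up for
illustration:

```ruby
# Sketch only: fire an HTTP POST from the EventMachine reactor and stop
# as soon as the request completes or fails, ignoring the response.
def fire_and_forget(url, body)
  require 'eventmachine'
  require 'em-http-request'

  EM.run do
    req = EventMachine::HttpRequest.new(url).post(body: body)
    req.callback { EM.stop }   # success: we don't inspect the response
    req.errback  { EM.stop }   # failure: a lost log line is acceptable
  end
  :sent
rescue LoadError
  :gems_missing   # degrade gracefully when the gems aren't installed
end

fire_and_forget('http://logs.example.com/events', 'event=signup')
```

In a real Rails process you would start the reactor once in a dedicated
thread rather than spinning it up per call; the compressed form above
just keeps the sketch self-contained.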

On 03/28/2010 07:29 AM, David C. wrote:

I need to send some logging information off to another system via
REST/HTTP. In a Rails app we’re currently using DelayedJob, which stores
the events to the DB while another process sends them. This is a pretty
heavyweight process.

Are there any ways to use the net/http library to “fire and forget” a
call? Or perhaps a simple implementation using threads that would be
less resource-intensive than “curl &”?

The simplest that comes to mind: stuff your operations in a queue and
have one or more threads process them.
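The queue-and-worker idea can be sketched with nothing but the standard
library. The event hashes and the processed array below are made-up
stand-ins for the real HTTP delivery:

```ruby
# Queue is built into modern Rubies; on 1.8 you'd `require 'thread'` first.
queue     = Queue.new   # thread-safe FIFO
processed = []          # stands in for the remote logging system

# One background worker; start several for more throughput.
worker = Thread.new do
  while (event = queue.pop)   # a nil sentinel shuts the worker down
    # In the real app this would be a Net::HTTP POST of the event to the
    # logging service; here we just record it to keep the sketch runnable.
    processed << event
  end
end

# Request handlers enqueue and return immediately -- no blocking:
queue << { event: 'signup', user: 42 }
queue << { event: 'login',  user: 42 }

queue << nil    # shut the worker down
worker.join
```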

Cheers

robert

David C. wrote:

hi -

Sure, this is becoming a more common issue for many people: from within a
web app, how do you interact with other systems in a non-blocking way?
I.e., I don’t need to wait for a response from the other system within
the cycle of the same user request/response.

I need to send some logging information off to another system via
REST/HTTP. In a Rails app we’re currently using DelayedJob, which stores
the events to the DB while another process sends them. This is a pretty
heavyweight process.

A simplistic way would be just shelling out to “curl &”, but that will
spawn a system process. I’m not sure about the risks involved with that:
it means one Nginx/Rails instance will have to wait for it to complete,
and it may run into memory issues.

Are there any ways to use the net/http library to “fire and forget” a
call? Or perhaps a simple implementation using threads that would be
less resource-intensive than “curl &”?

thanks…

/dc

All the other suggestions are great, and you should give 'em some
consideration.
I’d mention fork() as well, though, since most UNIX-like systems (if not
all? I’m not 100% sure) implement a copy-on-write fork(), reducing the
overhead people sometimes associate with fork().

I’d add as a note that sharing data between two processes isn’t
straightforward; you might need to look at IO.pipe. But if you don’t
need to send data back, or to share data with your subprocess after the
call to fork(), you should be fine.

Don’t forget Process.wait and/or Process.detach to avoid collecting
zombies.
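A stdlib-only sketch of that fork-and-detach pattern, where the sleep
stands in for the real HTTP call:

```ruby
# Fork a child to do the slow work; the parent returns immediately.
# UNIX-like systems only -- fork is not available on Windows.
pid = fork do
  sleep 0.1   # e.g. Net::HTTP.post_form(uri, params) in the real app
  exit!       # skip at_exit hooks inherited from the parent process
end

# Reap the child automatically so it never lingers as a zombie.
Process.detach(pid)
```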

Thanks,
Rob

David C. wrote:

Are there any ways to use the net/http library to “fire and forget” a
call?

Thread.new do
  Net::HTTP.whatever…
end
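Fleshing that out into something runnable (the URL and the log_async
helper name are made up), with a rescue so a failed call can never take
down the request that spawned it:

```ruby
require 'net/http'
require 'uri'

# Hypothetical helper: POST a payload from a background thread and
# return to the caller immediately.
def log_async(payload)
  Thread.new do
    begin
      Net::HTTP.post_form(URI('http://logs.example.com/events'), payload)
    rescue StandardError => e
      warn "logging failed: #{e.message}"   # never let logging kill the app
    end
  end
end

log_async('event' => 'signup')
```

One caveat: if the process exits before the thread finishes, that log
call is silently lost, and each in-flight call costs a thread.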

On Sunday 28 March 2010 03:31:54 am Robert G. wrote:

I’d mention fork() as well, though, since most UNIX-like systems (if not
all? I’m not 100% sure) implement a copy-on-write fork(), reducing the
overhead people sometimes associate with fork().

It has issues with Ruby, though – any time Ruby’s garbage collector
runs, pretty much the entire process will have bits flipped in it. So
it’s copy-on-write, or copy everything on GC.

There have been some efforts to make a COW-friendly GC for Ruby, but
nothing I know of that works on 1.9.

David M. wrote:

On Sunday 28 March 2010 03:31:54 am Robert G. wrote:

I’d mention fork() as well, though, since most UNIX-like systems (if not
all? I’m not 100% sure) implement a copy-on-write fork(), reducing the
overhead people sometimes associate with fork().

It has issues with Ruby, though – any time Ruby’s garbage collector
runs, pretty much the entire process will have bits flipped in it. So
it’s copy-on-write, or copy everything on GC.

There have been some efforts to make a COW-friendly GC for Ruby, but
nothing I know of that works on 1.9.

Thanks for the information! I hadn’t a clue about that.

David M. wrote:

On Sunday 28 March 2010 03:31:54 am Robert G. wrote:

I’d mention fork() as well, though, since most UNIX-like systems (if not
all? I’m not 100% sure) implement a copy-on-write fork(), reducing the
overhead people sometimes associate with fork().

It has issues with Ruby, though – any time Ruby’s garbage collector
runs, pretty much the entire process will have bits flipped in it. So
it’s copy-on-write, or copy everything on GC.

There have been some efforts to make a COW-friendly GC for Ruby, but
nothing I know of that works on 1.9.

I did some rooting around on Google, and it looks like Hongli has done
some work making the Ruby garbage collector COW-friendly. A patch for
1.8.6 is available here:

http://izumi.plan99.net/blog/index.php/2008/01/14/making-ruby’s-garbage-collector-copy-on-write-friendly-part-7/

I’m not sure if he has merged his work into Ruby Enterprise Edition, but
there was some talk of getting it merged into Ruby 1.9. It looks like it
was turned down for performance reasons.

Thanks,
Rob

Robert G. wrote:

I’m not sure if he has merged his work into Ruby Enterprise Edition

REE does have COW-friendly garbage collection. I don’t know where the
code originated though.