Streaming stdout with open4 to prevent parent/child buffer lock

I have to fork a process and read a large amount of stdout; however, the
child's output buffer fills up and everything hangs. How do you read
stdout as a stream?

The only ways I know how to do this are the two below, but neither
works. How can you get hold of stdout when the method starts, so you can
read it as it gets written by the child process?

If not, what is an alternative?

Below are two sample scripts to show the use case; a rough sketch of the
buffer lock itself follows them.

#######################################
open4_test.rb
#######################################

require 'open4'

command = 'ruby test_run.rb'

# try it with the return values
pid, stdin, stdout, stderr = Open4::popen4(command)

while !stdout.eof? do
  puts stdout.gets
end

# try it with a block
status = Open4::popen4(command) do |pid, stdin, stdout, stderr|
  puts stdout.gets if !stdout.eof?
end

###################################

########################
test_run.rb is
########################

(0...100).each do |i|
  puts "helloworld #{i}"
  sleep 1
  puts "goodbye world #{i}"
  sleep 1
end

#######################
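As an aside, the lock in question can be reproduced directly with a plain
pipe. This is a minimal sketch, assuming a Unix fork; the 64-byte payload
and iteration count are arbitrary.

# A pipe has a fixed capacity, so a child that writes more than the parent
# has read blocks on the write until the parent drains the pipe.
reader, writer = IO.pipe

pid = fork do
  reader.close
  # Far more data than the pipe can hold; the child stalls here until the
  # parent starts reading.
  5_000.times { writer.puts 'x' * 64 }
  writer.close
end

writer.close                  # the parent only reads
sleep 2                       # while the parent is idle, the child sits blocked in write
reader.each_line { |line| }   # draining the pipe lets the child run to completion
Process.wait(pid)
puts 'child finished'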

Subject: streaming stdout with open4 to prevent parent/child buffer lock
Date: Fri 02 Nov 12 06:06:52AM +0900

Quoting Matthew P. ([email protected]):

I have to fork a process and read a large amount of stdout; however, the
child's output buffer fills up and everything hangs. How do you read
stdout as a stream?

I am not very sure I understand what you want to do. But of one thing
I am sure:

stdout is to write (OUT)
stdin is to read (IN)

In other words, you cannot read from stdout. stdout.gets will never
give you anything of interest.

Try to get your input from stdin, instead.

Carlo

Okay, sorry; I wasn't very clear.

I have an autonomous process that writes to stdout (my example is called
child.rb). I want to fork this process and read the stdout of the child
process.

The problem is that the process produces so much stdout that it fills up
the buffer. How can you read the stdout before the child process ends?

parent calls child
child writes stdout
parent reads the stdout

Here is my example to demo it: if you run parent.rb you will see what I
am trying to do (one alternative is also sketched after the example).

#######################################

parent.rb

#######################################

require 'open4'

command = 'ruby child.rb'

# try it with the return values
pid, stdin, stdout, stderr = Open4::popen4(command)

while !stdout.eof? do
  puts stdout.gets
end

###################################

########################

child.rb is

########################

(0...100).each do |i|
  puts "helloworld #{i}"
  sleep 1
  puts "goodbye world #{i}"
  sleep 1
end

#######################
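As one possible alternative (a minimal sketch, not something settled on in
this thread), the standard library's IO.popen also hands the parent the
child's stdout as a readable stream, so each line can be handled as the
child produces it:

# Sketch of the parent -> child -> parent flow above using IO.popen from
# the standard library; the parent reads each line as the child writes it.
IO.popen(['ruby', 'child.rb'], 'r') do |child_out|
  child_out.each_line do |line|
    puts line   # appears roughly once per sleep in child.rb
  end
end

Since IO.popen in 'r' mode only creates a pipe for the child's stdout and
leaves stderr attached to the parent, there is no second pipe to fill up in
this sketch.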

On Thu, Nov 1, 2012 at 10:06 PM, Matthew P. [email protected]
wrote:

I have to fork a process and read a large amount of stdout; however, the
child's output buffer fills up and everything hangs. How do you read
stdout as a stream?

The only ways I know how to do this are the two below, but neither
works. How can you get hold of stdout when the method starts, so you can
read it as it gets written by the child process?

You can use a thread:

$ ruby x.rb
1
2
3
4
5
6
7
8
9
10
$ cat -n x.rb
     1
     2
     3  require 'open3'
     4
     5  Open3.popen3("seq", "1", "10") do |s_in, s_out, s_err, th|
     6    # eat stderr
     7    th_err = Thread.new { s = nil; while s_err.read(1024, s); end }
     8
     9    # copy stdout
    10    s_out.each_line {|line| puts line}
    11
    12    # wait until finished
    13    th_err.join
    14  end
    15

Kind regards

robert
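
The same idea carries over to the open4 gem from the subject line. This is
a sketch under that assumption, not code posted in the thread; it reuses
the child.rb script from above.

require 'open4'

# A background thread drains stderr so neither pipe can fill and block the
# child, while the main thread streams stdout line by line.
status = Open4::popen4('ruby child.rb') do |pid, stdin, stdout, stderr|
  stdin.close                                      # nothing to send to the child
  err_thread = Thread.new { stderr.each_line { |line| warn line } }
  stdout.each_line { |line| puts line }            # printed as the child writes
  err_thread.join
end
puts "child exited with status #{status.exitstatus}"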