Net::SSH performance question

I am using Net::SSH to execute scripts on a remote server. The script
it executes takes a long time to complete and dumps a lot of output to
stdout. If I allow the script to run as is, Ruby on the local machine
eats up massive amounts of CPU cycles as it parses the stdout.
Redirecting stdout to /dev/null on the remote box fixes the performance
issue, but then I lose my SSH session due to inactivity.

Is there a way to optimize the following code (maybe by giving STDOUT
a small buffer size)?
Keep in mind:
A: I need the SSH session to be persistent.
B: I cannot modify the settings on the server.
C: I really don't want to rewrite the Perl script (~2000 lines).
D: I would prefer that the script does not move past the shell.perl
line until the script has completed on the remote box.

Net::SSH.start('foo', :username => 'bar', :password => '') do |session|
  copy_file(session, '/tmp/file')
  shell = session.shell.sync
  shell.mv "/tmp/file ."
  shell.perl "test"
  puts "done"
end
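
For reference, the /dev/null redirect I mentioned above was roughly
this, with just stdout thrown away on the remote box (the "test" name
is the same placeholder as in the code above):

# Roughly the variant that fixes the local CPU usage: the remote box
# discards the output itself, so nothing crosses the connection while
# the script runs and the session eventually times out as idle.
shell.perl "test > /dev/null"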

On 5/7/07, knohr wrote:

> keep in mind:
> shell.mv "/tmp/file ."
> shell.perl "test"
> puts "done"
> end

Hmm, there is one question I would have: what do you need as the
result/output? If I understood correctly, nothing?
In that case, not using the shell at all should help. The following
works perfectly on my box:

Net::SSH.start('localhost') do |session|
  session.open_channel do |channel|
    channel.on_data do |ch, data|
      puts data
    end
    # Note: the semicolon needs a double backslash so that find,
    # not the remote shell, sees "\;".
    channel.exec "find /home/robert -exec ls -ltr {} \\; >/dev/null"
  end
  session.loop
end

It waits until the channel is closed.
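
Applied to your case, a rough sketch could look like the following. It
only reuses the calls shown above (open_channel, on_data, exec,
session.loop); the host, credentials and the mv/perl steps are the
placeholders from your post, and the empty on_data block just throws
the remote output away instead of parsing it, while that traffic keeps
the connection from sitting idle:

require 'net/ssh'

Net::SSH.start('foo', :username => 'bar', :password => '') do |session|
  # your copy_file(session, '/tmp/file') step would still go here

  session.open_channel do |channel|
    # Discard remote stdout locally: no parsing work, but the data
    # still flows over the connection, so the session is never idle.
    channel.on_data { |ch, data| }

    # Move the uploaded file and run the script in one exec request.
    channel.exec "mv /tmp/file . && perl test"
  end

  # Blocks until the channel closes, i.e. until the remote command
  # has finished.
  session.loop
  puts "done"
end

Since session.loop only returns once the channel has closed, "done" is
not printed until the perl run has completed on the remote box (your
point D), and the session stays busy the whole time (point A).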

HTH
Robert
