Hello,
Fair warning: I'm extremely new to Ruby, so please pardon any
ignorance on my part.
Anyway, the first thing I started playing around with after
reading through Programming Ruby is the socket library. Now, I
realize Ruby has really nice abstractions and I should rarely need
to use the Socket class directly for a simple server, but I
tend to be a bottom-up learner. So I was attempting to run the
following code from the ruby-doc website:
# In one script, start this first
require 'socket'
include Socket::Constants
socket = Socket.new( AF_INET, SOCK_STREAM, 0 )
sockaddr = Socket.pack_sockaddr_in( 2200, 'localhost' )
socket.bind( sockaddr )
socket.listen( 5 )
client, client_sockaddr = socket.accept
puts "The client said, '#{socket.readline.chomp}'"
client.puts "Hello from script one!"
socket.close
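For what it's worth, while poking at this I also sketched a self-contained variant of my own (not the ruby-doc code) that runs both ends in one file, with the server half on a thread, binding explicitly to 127.0.0.1 and reading from the accepted socket rather than the listener:

```ruby
require 'socket'
include Socket::Constants

# My own loopback sketch: server and client in one script so I can test
# the flow without two terminals. Binding to 127.0.0.1 explicitly
# sidesteps whatever the 'localhost' lookup is doing.
server = Socket.new(AF_INET, SOCK_STREAM, 0)
server.bind(Socket.pack_sockaddr_in(0, '127.0.0.1'))  # port 0: let the OS pick
server.listen(5)
port, = Socket.unpack_sockaddr_in(server.getsockname)

heard = nil
acceptor = Thread.new do
  client, = server.accept
  heard = client.readline.chomp   # read from the accepted socket, not the listener
  client.puts 'Hello from script one!'
  client.close
end

sock = Socket.new(AF_INET, SOCK_STREAM, 0)
sock.connect(Socket.pack_sockaddr_in(port, '127.0.0.1'))
sock.puts 'Hello from script two!'
reply = sock.readline.chomp
acceptor.join

puts "The server heard, '#{heard}'"
puts "The client heard, '#{reply}'"
sock.close
server.close
```

That version runs cleanly for me, which is what pushed me toward suspecting the 'localhost' lookup rather than the socket calls themselves.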
Now, after running this script under both Ruby 1.8.4 and 1.8.5 on Mac OS
X, I received the error "`bind': Invalid argument - bind(2)
(Errno::EINVAL)". I then ran the script on Linux under 1.8.4
and 1.8.5 and discovered that bind works there, but that
"socket.readline.chomp" should really be "client.readline.chomp".
Anyway, the point is that the code runs just fine on Linux but not on OS
X. A quick investigation finds that the problem is the call
"Socket.pack_sockaddr_in( 2200, 'localhost' )". On Linux this call
always returns a string that is 16 bytes long. On Mac OS X it also
returns a 16-byte string, as long as I don't use 'localhost' for the
hostname. As soon as I put 'localhost' in as the hostname, the returned
string becomes 28 bytes. I searched around on Google to see if this is
a known bug, but I couldn't find anything.
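My current guess (and I could be way off) is that 'localhost' resolves to the IPv6 loopback ::1 first on OS X, so pack_sockaddr_in builds a 28-byte sockaddr_in6 instead of the 16-byte sockaddr_in, and an AF_INET socket refuses to bind to that, hence the EINVAL. Here's the little check I ran:

```ruby
require 'socket'

# A plain IPv4 address always packs into the 16-byte sockaddr_in:
ipv4 = Socket.pack_sockaddr_in(2200, '127.0.0.1')
puts ipv4.size                                 # => 16
puts Socket.unpack_sockaddr_in(ipv4).inspect   # => [2200, "127.0.0.1"]

# An IPv6 address packs into the 28-byte sockaddr_in6, which matches
# the size I'm seeing on OS X when I pass 'localhost':
ipv6 = Socket.pack_sockaddr_in(2200, '::1')
puts ipv6.size                                 # => 28
```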
Anyone have any words of advice?
I figure I am just doing something wrong, seeing as how
this is the first bit of real Ruby I have attempted.
Thanks,
Patrick