Does anyone have experience with using Ruby for analysis (lots of
maths), on a machine with a ridiculous amount of RAM? For example, a
hip 64-bit Linux kernel on a machine with 32 or 64 GB of physical RAM.
Are there any “gotchas” I should be aware of? Would all the RAM be
addressable by a given Ruby process? Or would I still have to be
forking a number of processes, each allocated a bit of the address
space (blech)?
Thanks, oh Ruby masters.
I haven’t done this myself, so take everything I say with a grain of
salt, but since Ruby allocations (eventually) go through malloc, how
much of that massive address space a process gets strikes me as
something that is entirely up to the operating system. (Excepting
things in C extensions, which may use mmap or whatever.)
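One quick empirical check is just to ask for more memory than a 32-bit
address space could hold and see what happens (a throwaway sketch,
assuming a 64-bit Ruby build and enough free memory; the 5 GB figure is
arbitrary, just something past the 4 GB limit):

  # Try an allocation that cannot fit in a 32-bit address space.
  begin
    big = "x" * (5 * 1024 ** 3)          # a single ~5 GB string
    puts "Got #{big.size / 1024 ** 3} GB in one process"
  rescue NoMemoryError
    puts "Allocation failed -- address space or RAM exhausted"
  end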
i’ve had issues using mmap with files larger than 32 GB - i’m not sure
if the latest release has fixed this or not… in general you can run
into issues with extensions, since ruby fixnums reserve a tag bit that
marks them as immediate values, leaving them one bit narrower than the
machine word…
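you can see that tag bit from plain ruby, by the way (this is MRI
before 2.4, when Fixnum and Bignum were still separate classes):

  # On 64-bit MRI one bit of the machine word is a tag, so Fixnum
  # holds 63 signed bits and the Bignum boundary sits at 2**62.
  puts (2 ** 62 - 1).class   # => Fixnum
  puts (2 ** 62).class       # => Bignum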
assuming you have two or four cpus, forking a number of processes might
not be a bad idea - ipc is so dang easy with ruby that it’s trivial to
coordinate processes. i have a slave class i’ve used for this before; a
sketch of the idea follows:
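(this isn’t the actual class - just a minimal sketch of the same
fork-plus-socketpair idea, with made-up names)

  require 'socket'

  # Minimal master/slave pair: the child applies a block to whatever
  # the parent sends it, one Marshal-framed message per round trip.
  class Slave
    def initialize(&work)
      @parent, @child = UNIXSocket.pair
      @pid = fork do
        @parent.close
        begin
          loop do
            job = Marshal.load(@child)
            Marshal.dump(work.call(job), @child)
          end
        rescue EOFError
          # parent hung up; exit quietly
        end
      end
      @child.close
    end

    def call(job)
      Marshal.dump(job, @parent)
      Marshal.load(@parent)
    end

    def shutdown
      @parent.close
      Process.wait(@pid)
    end
  end

  squarer = Slave.new { |n| n * n }
  p squarer.call(21)   # => 441
  squarer.shutdown

Marshal frames each message, so parent and child can ship arbitrary
ruby objects over the socket without any extra protocol work.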
Again, you can contact me off list for some ideas … without knowing
your goal, it’s difficult for me to know what steps you should take to
reach it.
Assume a properly working state-of-the-art 64-bit dual-core AMD or
Intel hardware platform with 64 GB of RAM and an appropriate SAN for
storage from a major vendor like IBM. That severely limits your OS
choices; last time I looked you need to be running either RHEL or SUSE
Enterprise Linux. I don’t know about the other vendors, but IBM has a
marvelous document on performance tuning humongous servers at
OK, now you’ve purchased a high-end server and a supported
enterprise-grade Linux, and you want to do some serious number crunching
on it, and you want to do it in Ruby, possibly augmented by libraries in
C, Fortran or assembler for speed. You will need to recompile everything – Ruby, the math libraries, and the compiler itself – to
use 64-bit addressing. There are some hacks and workarounds, but pretty
much this is required. If you end up with an Intel server, you might
want to have a look at the Intel compilers instead of GCC. Intel also
has some highly-tuned math libraries, as does AMD.
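Once everything is rebuilt, it’s worth confirming that the interpreter
you actually run is a 64-bit build before benchmarking anything. Two
quick checks (note: on older 1.8 releases the rbconfig constant was
Config rather than RbConfig):

  # Sanity-check the build: machine word size and target CPU.
  require 'rbconfig'
  puts 1.size                         # => 8 on a 64-bit build, 4 on 32-bit
  puts RbConfig::CONFIG['host_cpu']   # => "x86_64" on AMD64/EM64T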
My point here is that you are “exploring” realms in Ruby that are
“usually” addressed using “more traditional” techniques, so you’re going
to need to do a fair amount of testing. That kind of server costs a lot
of money, and for that kind of money, you’ll get lots of support from
the vendor, coupled with strong incentives to do your job in ways that
are tested and proven to work and supported by said vendor. That may or
may not include Ruby, and if it does include Ruby, it may or may not
involve a small number of monolithic Ruby scripts directly addressing a
large address space.
There is a lot of help available on the Internet from people like me who
love challenges like this.
Ah, someone has done some of this! What compiler did you use to
recompile Ruby for 64-bit addressing? Did it work out of the box?
What’s the bottleneck in Ruby’s built-in IPC? Network traffic to
“localhost” and to the other hosts? System V IPC? Something else?
I haven’t really looked at the whole “lots of coordinated tiny
processes” thing in Ruby, since Erlang seems to have nailed that
approach and made it the core of the Erlang way to do things. I’m not a
big fan of re-inventing wheels; I’d much rather just get my numbers
crunched.
No… but I’m always willing to help out! Just send me one such
workstation and I’ll send you the results post haste!
Regards,
Jordan