Loading a program that uses gems on a network share is incredibly slow.
Because of Rubygems, every require makes Ruby search a long list of
directories for the file to load, and that gets very slow on, for
example, an SMB share. Loading about a hundred files with a
$LOAD_PATH of 20 directories adds up to an enormous number of
filesystem accesses. Factor in network latency and you’ve got
unworkable delays at startup.
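To put rough numbers on the argument above (the per-probe latency is my
assumption; the file and directory counts are from my setup):

```ruby
# Back-of-the-envelope startup cost over the share. 100 files and 20
# $LOAD_PATH entries are from the situation described; 10 ms per probe
# is an assumed SMB round trip, so scale it to your own network. Ruby
# also tries several extensions (.rb, .so, ...) per directory, so
# reality can be worse than this.
files      = 100
dirs       = 20
latency_ms = 10

worst_case_probes = files * dirs                      # each require may stat every dir
startup_s = worst_case_probes * latency_ms / 1000.0   # 2000 probes, ~20 s of lookups
```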
I was wondering if anyone has had a similar experience and, if so,
what they did to solve it.
One solution I thought of would be to delay as many of the requires as
possible, only requiring files when their functionality is actually needed.
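The deferred-require idea maps onto Ruby’s built-in Kernel#autoload: the
require only happens the first time the constant is referenced, so
startup touches the share far less. A minimal sketch ("json" here is
just a stand-in for one of your own libraries):

```ruby
# Register the constant without loading anything yet; the actual
# require fires on first reference to JSON.
autoload :JSON, "json"

autoload?(:JSON)     # => "json"  (registered, file not loaded yet)
JSON.generate(a: 1)  # first reference triggers the real require
autoload?(:JSON)     # => nil     (now loaded)
```

The catch is that you need to know your constants up front, and code
that requires for side effects rather than for a constant doesn’t fit
this pattern.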
Another thought I had was to force Rubygems to look first in the gem
that the require is coming from, perhaps by inspecting
Kernel#caller. That would cut the number of directories searched
considerably, but it seems it would require monkeypatching Rubygems.
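A rough sketch of that caller-first idea, under some assumptions (plain
.rb files, absolute $LOAD_PATH entries, no require_relative edge
cases). Rather than patching Rubygems internals, it prepends a module
so our #require runs first, probes the $LOAD_PATH entry that the
*calling* file was loaded from, and falls back to the normal scan:

```ruby
module CallerFirstRequire
  def require(name)
    calling_file = caller_locations(1, 1)&.first&.path
    if calling_file
      # Find the $LOAD_PATH entry the calling file came from and try
      # it first -- an intra-gem require then costs one lookup instead
      # of a probe per directory.
      home = $LOAD_PATH.find { |dir| calling_file.start_with?(File.join(dir, "")) }
      if home
        rel = name.end_with?(".rb") ? name : "#{name}.rb"
        candidate = File.join(home, rel)
        return super(candidate) if File.file?(candidate)
      end
    end
    super  # normal $LOAD_PATH scan (Rubygems' require)
  end
end

Object.prepend(CallerFirstRequire)
```

Because the module is prepended rather than Kernel#require being
redefined, `super` still reaches the Rubygems version, so gem
activation keeps working; but I haven’t battle-tested this against
compiled extensions or unusual load-path setups.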
Is there any way to treat gems more like jars, so that a .gem would
stay packaged until loaded? In my case that would greatly reduce the
network traffic and filesystem traversal.
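As far as I know Ruby can’t require from inside a .gem archive the way
Java loads from jars, but one crude approximation is to pack a
library’s files into a single .rb so startup costs one network read
instead of many lookups. A sketch (`pack` is my own hypothetical
helper; it assumes the files don’t depend on load order or on their own
internal requires):

```ruby
# Concatenate every .rb file under lib_dir into one bundle file.
# Loading the bundle then needs a single read from the share.
def pack(lib_dir, out_file)
  sources = Dir.glob(File.join(lib_dir, "**", "*.rb")).sort
  File.open(out_file, "w") do |bundle|
    sources.each do |src|
      bundle.puts "# --- #{src} ---"   # provenance marker for debugging
      bundle.puts File.read(src)
    end
  end
end
```

A real solution would also have to rewrite or swallow the requires
inside the packed files, which is where this stops being trivial.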
I don’t want to debate the merits of storing files on a network
share. Deploying programs on an SMB share at work instead of on each
user’s computer saves me so much trouble.