Question about Ruby philosophy

Hello, when I compare Ruby to Java there is something I don’t understand.

For example, let’s say that I create instances of JDBCFooBaseDriver in
Java - the methods and their behaviors will never change unless I
update the JDBCFooBaseDriver jar file. I can be sure that this part of
my environment will stay the same even if new third-party libraries are
added and used by my application.

Now, in Ruby I can use a new library (FancyDates) that will alter the
behavior of some JDBCFooBaseDriver methods - some methods of
JDBCFooBaseDriver will be redefined by FancyDates and new methods will
be added. All of this will occur without giving me a chance to be
informed of what is going on behind the scenes.
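
To make my worry concrete, I imagine it could look something like this
(FancyDates and the method names are imaginary, of course):

  # jdbc_foo_base_driver.rb
  class JDBCFooBaseDriver
    def connect
      "original connect"
    end
  end

  # fancy_dates.rb - loaded later, reopens the very same class
  class JDBCFooBaseDriver
    def connect
      "patched connect"    # silently replaces the original method
    end

    def fancy_timestamp    # and quietly adds a new one
      Time.now
    end
  end

  JDBCFooBaseDriver.new.connect  #=> "patched connect"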

So,

  1. Is what I say right?
  2. Why should I not be scared by that?
  3. Why do most C#, Java, and C++ developers think that this approach
    is dangerous and leads to bad practices?

Thanks

If this question has been answered many times, feel free to just
provide some links to references.

As a Java developer still learning the Ruby way, here’s my
understanding:

  1. Yes
  2. If you’re forced to integrate with code from bad programmers, maybe
    you should be worried (perhaps some Rubyists will suggest ways this
    can be mitigated…). If not, then:
  3. The Ruby philosophy is that it’s better to give you, the developer,
    the power to do things like this, even if you COULD should yourself in
    the foot; Ruby trusts you to be smart and not make changes that will
    break things.

I guess the only way to protect yourself is to have automated
functional tests that give you confidence that everything works
even with all the loaded code that may make these kinds of
alterations.
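
Even something as small as this would catch the worst clobbering - a
minimal sketch using the standard test/unit library (the test itself is
made up; pick whatever behavior your app actually relies on):

  require 'test/unit'

  class CoreSanityTest < Test::Unit::TestCase
    # Run this after requiring every third-party library the app uses;
    # it fails fast if anything clobbered core behavior we depend on.
    def test_array_each_still_iterates
      seen = []
      [1, 2, 3].each { |n| seen << n }
      assert_equal [1, 2, 3], seen
    end
  end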

-Stephen

Zouplaz wrote:

JDBCFooBaseDriver will be redefined by FancyDates and new methods will
be added. All of this will occur without giving me a chance to be
informed of what is going on behind the scenes.

So,

  1. Is what I say right?
  2. Why should I not be scared by that?
  3. Why do most C#, Java, and C++ developers think that this approach
    is dangerous and leads to bad practices?

You are correct and your worry is not uncommon. And yes, it does seem
scary. But in practice it turns out not to be such a problem. Most
C/Java programmers don’t know this simply because they can’t do it.
With a bit of good sense, open classes can be a great productivity
booster. The reasons for this are surprisingly simple. As with any
library, you don’t use it unless a) you need it and b) you know what it
does. Combine that with unit testing and there’s no need to be so
worried.

Also, it’s generally accepted practice not to alter pre-existing
methods, and instead add new ones if you need additional functionality
(though there are exceptions, of course).
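
In code, the distinction looks like this (the method name is made up
for illustration):

  # Additive and generally accepted: a brand-new method, nothing clobbered.
  class String
    def shout
      upcase + "!"
    end
  end

  # Redefining an existing method such as String#upcase would be the
  # frowned-upon variant - every caller in the process sees the change.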

T.

Stephen D. wrote:

As a Java developer still learning the Ruby way, here’s my understanding:

  1. Yes
  2. If you’re forced to integrate with code from bad programmers, maybe
    you should be worried (perhaps some Rubyists will suggest ways this
    can be mitigated…). If not, then:
  3. The Ruby philosophy is that it’s better to give you, the developer,
    the power to do things like this, even if you COULD should yourself in
    the foot; Ruby trusts you to be smart and not make changes that will
    break things.

A typo, I know, but that’s a perfect way to put it: you can “should
yourself in the foot” with such worries :slight_smile:

T.

On 12/4/06, Trans [email protected] wrote:

2) Why should I not be scared by that?
3) Why do most C#, Java, and C++ developers think that this approach is
dangerous and leads to bad practices?

You are correct and your worry is not uncommon. And yes, it does seem
scary.

It is scary unless you have a very good testing environment, but that
is not the most important point I want to make.

Look at Object#freeze - maybe you’ll feel better now ;). I am not sure
whether it can be circumvented, but it is better than nothing.

HTH
Robert

But in practice it turns out not to be such a problem. Most
C/Java programmers don’t know this simply because they can’t do it.
With a bit of good sense, open classes can be a great productivity
booster. The reasons for this are surprisingly simple. As with any
library, you don’t use it unless a) you need it and b) you know what it
does. Combine that with unit testing and there’s no need to be so
worried.

Yup sorry, you said it before.

Also, it’s generally accepted practice not to alter pre-existing
methods, and instead add new ones if you need additional functionality
(though there are exceptions, of course).

T.

R.


“The real romance is out ahead and yet to come. The computer revolution
hasn’t started yet. Don’t be misled by the enormous flow of money into
bad de facto standards for unsophisticated buyers using poor
adaptations of incomplete ideas.”

  • Alan Kay

On 04.12.2006 13:04, Zouplaz wrote:

Others have given good answers that I cannot improve on. However:

  3. Why do most C#, Java, and C++ developers think that this approach
    is dangerous and leads to bad practices?

Are you sure this is true? Granted, people frequently are scared of
Ruby’s dynamic nature - or criticize it. But I would not go as far as
to claim that the majority of C# / Java developers think this approach
is dangerous.

If this question has been answered many times, feel free to just
provide some links to references.

You will find discussions of these issues in threads dealing with
“dynamic typing” - there’s a ton of them. Also, IIRC there are
discussions of these issues that revolve around “rational”, which used
to change the behavior of Fixnum#/ and could cause problems for some
programs.
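
If I remember the behavior correctly, it looked roughly like this (the
“rational” change came in via the mathn standard library):

  1 / 2            #=> 0, plain Fixnum integer division

  require 'mathn'  # pulls in rational and redefines Fixnum#/
  1 / 2            #=> now a Rational printing as 1/2; every caller is affected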

Kind regards

robert

On 12/4/06, Zouplaz [email protected] wrote:

JDBCFooBaseDriver will be redefined by FancyDates and new methods will
be added. All of this will occur without giving me a chance to be
informed of what is going on behind the scenes.

Not really. Presumably if you’re using a third-party library, you
have a clue as to why you’re using it. You’ve probably looked at the
documentation, and you definitely have access to the source. I’m not
saying you should need to examine the source to use a lib, but you
should at least read the documentation before blindly throwing it into
a project.

So,

  1. Is what I say right?
  2. Why should I not be scared by that?
  3. Why do most C#, Java, and C++ developers think that this approach
    is dangerous and leads to bad practices?

There’s potential for things to go wrong, I guess. Just like there’s
potential for mistakes in dynamic typing. Those opposed to dynamic
typing will say stuff like, “Well, what if someone calls
refrigerator.meow? It blows up at run time!” Those kinds of questions
basically boil down to “What if I’m stupid?” Well, can’t help you
there…

Basically, just give Ruby a shot for a while. Use it in a real
project. If it doesn’t work for you, drop it.

Pat

On 12/4/06, Zouplaz [email protected] wrote:

JDBCFooBaseDriver will be redefined by FancyDates and new methods will
be added. All of this will occur without giving me a chance to be
informed of what is going on behind the scenes.

So,

  1. Is what I say right?
  2. Why should I not be scared by that?
  3. Why do most C#, Java, and C++ developers think that this approach
    is dangerous and leads to bad practices?

I can only speak from a Java perspective here, but if you look at what
most frameworks and (shudder) containers out there do, you’ll find
there is an enormous amount of reflection going on. The more modern
frameworks also make use of runtime bytecode manipulation. Both of
these techniques introduce the same type of dynamics into a project
(and negate the “advantages” of static typing), but - unlike Ruby - it
is almost impossible to understand what is going on under the hood.
And every framework does things in a different way. At least with
Ruby, you can recognise this kind of metaprogramming easily, and if
used in libraries its use is normally well documented.
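
For comparison, a typical bit of Ruby metaprogramming is short and sits
right in the source where it happens - a made-up illustration:

  class Driver
    # Generates two stub methods at class-definition time. The loop is
    # ordinary Ruby, so the metaprogramming is visible in the class body.
    [:connect, :query].each do |name|
      define_method(name) do |*args|
        puts "#{name} called with #{args.inspect}"
      end
    end
  end

  Driver.new.connect("db://example")  # prints: connect called with ["db://example"]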

Ruby programmers are simply more aware of these features and are
therefore able to use them effectively.

Cheers,
Max

On 12/5/06, Robert K. [email protected] wrote:

“dynamic typing” - there’s a ton of them. Also, IIRC there are
discussions of these issues that revolve around “rational”, which used
to change the behavior of Fixnum#/ and could cause problems for some
programs.

First I’ve heard of the ‘used to’! When did that change?

martin

Robert K. wrote:

dangerous.

Aaactually, yes, they/we do. The Ruby way presumes reasonably skilled
coders, not going overboard on clobbering, sufficient documentation,
and test coverage. After having seen my odd share of undocumented,
untested messes (usually cobbled together by generations of interns)
that are barely maintainable -with- the security blanket of type
checks, tool support, and no possibility of out-of-sight clobbering
changes, I pray /those people/ don’t discover Ruby while I still have
to set eyes on their code.

(I’m not accepting that the blame be laid on the language; the problems
are usually plain newbiedom, laziness, or copy-pasting snippets of
“what everyone before used and what worked” instead of cleaning up the
design. I’ve seen it with Java, I’ve seen it with PHP, and I have very
little doubt I’ll eventually see it with Ruby when it reaches the
popularity / hype threshold. Some of that approach I already notice in
some of the newbie posts here, though in as yet negligible amounts.)

Combine that with Ruby’s low barrier to entry, followed by a (IMO)
steep learning curve to mastery, and the tendency of people in their
intern phase to be infatuated with shiny sparkly objects (like, oh,
witty eval meta hacks), and this scares the heck out of me. (Hopefully,
the harmful features are too advanced for the
pointy-haired-boss-acting-hip sort of programmers.)

Basically, the messiest Java codebase I can imagine isn’t nearly as
scary as the messiest Ruby one. FUD? Maybe. Personal experience?
Certainly - the code I have to tackle would invariably be much worse if
it abused the worst of Ruby. But as usual, just a language comparison
without context doesn’t really say anything, and the above quoted point
3 is an overgeneralising blanket statement; i.e. more often than not,
patent nonsense, and at least in this case, horribly phrased. Lack Of A
Clue ™ is what leads to bad practices; the important question is “How
bad can your practices get?”. Or scratch that, the only practically
relevant question is “How bad are your practices right now?” - anything
more general tends toward speculation, and in programming discussions,
speculation leads to a vapid cloudfest really fast.

Provided the assumptions mentioned hold for a project during its
lifetime, the dangers of the approach are a non-issue. Except they
don’t necessarily, and then the security blanket helps a lot - not
everyone can afford to wrinkle his nose in disgust at ugly code and
quit on a whim. (I need just a little more time with the global Fortune
100 entries spearheading the CV before I can consider moving on from
“grin and bear the soul-draining horror 75% of the time, read bash.org
for the rest of it” employment to being more picky with regards to
personally rewarding experiences.) Ruby’s critical point of mass
adoption is yet to come, and if the blogosphere noise is even remotely
representative, I’m halfway sure it -will- come (the mindshare Ruby
gets despite not having a single notable sugar daddy is rather
spectacular), and I expect Interesting Things ™ to happen if/when it
does. Hopefully of the good sort (job openings, development funding),
but the opposite (widespread language-altering libraries becoming a
tangled mess that requires effort to get working in a nontrivial
project) isn’t entirely impossible.

David V.
Sleep Deprived

The bottom line seems to be that the current crop of Ruby programmers
are responsible enough not to mess with things under the hood unless
they understand the workings.

Is this an accurate assessment?

What scares me a bit is when Ruby becomes extremely popular and every
second programmer is using it, and less-disciplined developers start
providing libraries that tinker with core classes willy-nilly. At some
point you are going to get clashes between two same-named add-on
methods in (say) Array that do different things. That could be fun to
debug.
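
For example, two gems could each open up Array behind your back
(hypothetical code, but nothing stops it - the last one loaded wins):

  # lib_a.rb
  class Array
    def sum
      inject(0) { |total, x| total + x }
    end
  end

  # lib_b.rb - same name, very different meaning
  class Array
    def sum
      join
    end
  end

  [1, 2, 3].sum  #=> 6 with lib_a alone, "123" once lib_b is required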

Sure, you can decide not to use such a library, but you have to get
burned first before you realize it’s not good to use. I’m probably
showing my age here, but I remember cases where people used to code
macros in C header files (like MIN and MAX) that would clash with other
uses of MIN and MAX and cause untold havoc. Some of these were from
mainstream companies.

It’s one thing to say “What if I’m stupid?”. It’s quite another thing to
say, “What if lots of stupid/undisciplined people start using Ruby?”
This tends to be the price of language popularity, I fear.

Probably, evolutionary forces will get rid of the poorly designed
libraries in time. I’m more worried about the beleaguered corporate IT
developer who often does not have much choice and has to use in-house
code.

I dunno. Thoughts?

On Dec 4, 2006, at 6:42 PM, Edwin F. wrote:

What scares me a bit is when Ruby becomes extremely popular and every
second programmer is using it, and less-disciplined developers start
providing libraries that tinker with core classes willy-nilly. At some
point you are going to get clashes between two same-named add-on
methods in (say) Array that do different things. That could be fun to
debug.

Perl has had the same issue for a long time now and it probably was
used by “every second programmer” in its prime. It works when people
stay conscious of what they are doing and remember to document.
Other times it degrades into an unusable mess and we hope natural
selection will kill off those libraries.

I’m with Ruby’s natural tendency on this issue: trust the programmer.

James Edward G. II

On Tue, 5 Dec 2006, Edwin F. wrote:

in (say) Array that do different things. That could be fun to
debug.

Sure, you can decide not to use such a library, but you have to get
burned first before you realize it’s not good to use. I’m probably
showing my age here, but I remember cases where people used to code
macros in C header files (like MIN and MAX) that would clash with other
uses of MIN and MAX and cause untold havoc. Some of these were from
mainstream companies.

harp:~ > irb
irb(main):001:0> Array.freeze
=> Array

irb(main):002:0> class << Array; def each() 'ha-ha'; end; end
TypeError: can't modify frozen object
from (irb):2
from :0

It’s one thing to say “What if I’m stupid?”. It’s quite another thing to
say, “What if lots of stupid/undisciplined people start using Ruby?”
This tends to be the price of language popularity, I fear.

Probably, evolutionary forces will get rid of the poorly designed
libraries in time. I’m more worried about the beleaguered corporate IT
developer who often does not have much choice and has to use in-house
code.

I dunno. Thoughts?

what if people start coding in c and leave dangling pointers lying
around, double free pointers, corrupt the heap in their lib, forget to
clean up resources in at_exit handlers, or don’t prefix each and every
var/function/macro with something like my_lib_XXX?

where would we be? :wink:

i can hear people thinking ‘java’ out there already - but those guys
are manipulating byte code to subvert their handcuffs already! anyone
know a thing or two about boost::any? on a related note, it seems the
most useful ocaml code uses the type system in a way that makes the
promise of ‘safe’ programs more difficult or impossible for the
compiler to ensure…

history has shown that there is exactly one re-usable component of
code: the shared library. at least in the *nix world, nearly all of
them are written in c, and it is plagued by issues at least two orders
of magnitude worse, imho, than clobbering Array#each! yet, the internet
continues to be powered by *nix servers running said c libraries :wink:

(ducking)

-a

Edwin F. wrote:

The bottom line seems to be that the current crop of Ruby programmers
are responsible enough not to mess with things under the hood unless
they understand the workings.

Ruby: You’ll Shoot Your Eye Out!

http://www.cafepress.com/rubyshootout.46707105

(Yes, a shameless plug. But hey, Christmas is coming!)


James B.

“Blanket statements are over-rated”

Max M. wrote:

I can only speak from a Java perspective here, but if you look at what
most frameworks and (shudder) containers out there do, you’ll find
there is an enormous amount of reflection going on. The more modern
frameworks also make use of runtime bytecode manipulation.

Though to me it seems like runtime bytecode manipulation is still
rather rare; I see load-time enhancement much more often. While still
magical, changes made in the scope of a given class don’t directly
propagate into other classes. If you can mention a counterexample,
though, do so, so that I know what to avoid.

Both of
these techniques introduce the same type of dynamics into a project
(and negate the “advantages” of static typing)

This is not completely analogous. Reflective method calls still fail
early on type errors. I would prefer it if path expressions and their
host documents were precompiled where possible instead of handled
reflectively, though. JSP + JSTL + EL must be the single worst
combination of technologies to debug and generally maintain.

, but - unlike Ruby - it
is almost impossible to understand what is going on under the hood.

Depends on the definition of “understand”. If you mean tracing the
(possible) inner workings in your head, then you’re right; however,
usually just knowing the actions and corresponding results is enough to
use the code. Which is another reason why learning programming
languages is a valuable hobby - if you’ve seen a mechanism in one
language (and understood it because that language allowed for a clear
executable notation), you won’t have many problems using effectively
the same thing in another, even if the implementation jumps through
hoops. You just don’t need to care as long as it works.

And every framework does things in a different way. At least with
Ruby, you can recognise this kind of metaprogramming easily, and if
used in libraries its use is normally well documented.

In Ruby, metaprogramming alterations aren’t anywhere near perfectly
consistent between libraries; only the low-level implementation
methods, which you usually don’t need to care about, are the common
denominator. Valuable as a learning resource, but I maintain that it’s
not quite relevant in practice.

As for recognition, I don’t think it’s too language-specific once you’re
familiar with the high-level concept being realised; same for the
documentation.

Ruby programmers are simply more aware of these features and are
therefore able to use them effectively.

This is a double-edged sword. Java runs less risk of the features being
used maliciously simply because the programmers are unaware of them.
I’m not stating this is a linearly better state of affairs; it just
happens to result in Java metaprogramming being contained to a select
few frameworks, where destructive conflicts are less likely to occur.
It’s still possible, and might yet show up as an issue, just not in the
near future.

However, greater skill in metaprogramming does NOT follow from greater
awareness. You can be able to recite five ways of dynamically defining
a method by heart, and still mess up someone’s number-munging script
because you absolutely had to require mathn instead of just using the
other operators.

David V.

David V. wrote:

Robert K. wrote:

dangerous.

Aaactually, yes, they/we do. The Ruby way presumes reasonably skilled
coders, not going overboard on clobbering, sufficient documentation,
and test coverage. After having seen my odd share of undocumented,
untested messes (usually cobbled together by generations of interns)
that are barely maintainable -with- the security blanket of type
checks, tool support, and no possibility of out-of-sight clobbering
changes, I pray /those people/ don’t discover Ruby while I still have
to set eyes on their code.

I'm with David V. on this.

I’ve had to deal with a vast number of “messes” in too many languages
to mention, so this isn’t a Ruby-specific phenomenon. Ruby simply adds
one more unique way to shoot yourself in the foot (I refer specifically
to the open classes). Speaking of shooting feet, if you haven’t seen
it, there is an old but hilarious list of ways to shoot yourself in the
foot with various programming languages:

http://www-users.cs.york.ac.uk/susan/joke/foot.htm

I don’t know if Ruby has an entry for this somewhere on the Internet.
How about a Ruby Q. for the funniest way to shoot yourself in the foot
with Ruby?

James G. wrote:

I’m with Ruby’s natural tendency on this issue: trust the programmer.

Sorry, but I don’t think it’s enough to trust the programmer - at
least, most of them. For every thoughtful, careful, circumspect
programmer out there, there are 100 or more clueless, undisciplined,
lazy, and/or just plain stupid programmers (of course, none of the
people on this forum fall into this category :slight_smile:)

I’m not Ruby-bashing. I think Ruby is great, and consider it in general
to be the best language I have worked with; it’s certainly the most
enjoyable to use.

I just don’t look forward to having to debug some horribly-written Ruby
code that is doing something bizarre because someone thought they were
clever and added something inadvisable to some core class somewhere. I
have seen this kind of thing way too often. (I have seen, for example,
standard C library header files in /usr/include modified to make
something work in a project, and then the whole application stopped
working when the header file was overwritten as the OS (I think it was
AIX) got a patch kit.)

unknown wrote:

On Tue, 5 Dec 2006, Edwin F. wrote:

in (say) Array that do different things. That could be fun to
debug.

Sure, you can decide not to use such a library, but you have to get
burned first before you realize it’s not good to use. I’m probably
showing my age here, but I remember cases where people used to code
macros in C header files (like MIN and MAX) that would clash with other
uses of MIN and MAX and cause untold havoc. Some of these were from
mainstream companies.

harp:~ > irb
irb(main):001:0> Array.freeze
=> Array

irb(main):002:0> class << Array; def each() 'ha-ha'; end; end
TypeError: can't modify frozen object
from (irb):2
from :0

It’s one thing to say “What if I’m stupid?”. It’s quite another thing to
say, “What if lots of stupid/undisciplined people start using Ruby?”
This tends to be the price of language popularity, I fear.

Probably, evolutionary forces will get rid of the poorly designed
libraries in time. I’m more worried about the beleaguered corporate IT
developer who often does not have much choice and has to use in-house
code.

I dunno. Thoughts?

what if people start coding in c and leave dangling pointers lying
around, double free pointers, corrupt the heap in their lib, forget to
clean up resources in at_exit handlers, or don’t prefix each and every
var/function/macro with something like my_lib_XXX?

where would we be? :wink:

i can hear people thinking ‘java’ out there already - but those guys
are manipulating byte code to subvert their handcuffs already! anyone
know a thing or two about boost::any? on a related note, it seems the
most useful ocaml code uses the type system in a way that makes the
promise of ‘safe’ programs more difficult or impossible for the
compiler to ensure…

history has shown that there is exactly one re-usable component of
code: the shared library. at least in the *nix world, nearly all of
them are written in c, and it is plagued by issues at least two orders
of magnitude worse, imho, than clobbering Array#each! yet, the internet
continues to be powered by *nix servers running said c libraries :wink:

(ducking)

-a

Ah, but here’s the difference. You have to know a fair amount of C to
even build a shared library, using malloc() and friends for memory
(mis)management. You have to compile and link the code.

It’s a lot more work than writing

  class Array
    def clobbersomething
    end
  end

In Java, it’s not trivial to manipulate bytecode. It’s usually done by
experts, although there are frameworks out there that make it easier.

It’s a lot more work than writing

  class Array
    def clobbersomething
    end
  end

You are not going to get rank beginners to use Boost::anything. It
doesn’t even compile on all major compilers (e.g., only the most recent
versions of HP-UX aCC support partial template specialization), and I
defy any average intern to decipher C++ compile-time error messages
relating to template problems.

It sure is a lot more work than writing

  class Array
    def clobbersomething
    end
  end

Please, I am not Ruby-bashing. I know there are far worse languages out
there from the point of view of shooting yourself in the foot (I’ve
used enough of them that I’m lucky even to have any feet left :). Any
language that provides a lot of power can be misused. Ruby simply makes
this one thing so easy to do, with such potentially dire consequences.
If anything, I’d just want an optional way (besides .freeze) to control
what can and can’t be extended at run-time, to prevent accidental,
ill-advised, or even malicious tampering, or to help identify it. Hey,
maybe there’s a way to do it already?
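
For the identification part, I suppose even a rough sketch like this
would be a start - Module#method_added is a standard hook, though this
is hardly a complete tamper-proofing scheme:

  class Array
    # From now on, log every method (re)definition on Array together
    # with the file and line responsible for it.
    def self.method_added(name)
      warn "Array##{name} defined or redefined at #{caller.first}"
    end
  end

  class Array
    def clobbersomething; end  # warns with the offending file:line
  end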

On Tue, 5 Dec 2006, Edwin F. wrote:

Ah, but here’s the difference. You have to know a fair amount of C to
even build a shared library, using malloc() and friends for memory
(mis)management. You have to compile and link the code.

It’s a lot more work than writing

  class Array
    def clobbersomething
    end
  end

  gcc -shared a.c

??

not too hard. but i understand your point. still, i disagree that the
people releasing rubygems are of any lower quality than the people
releasing shared libraries. remember, if the code is not available the
question is rather moot and, in any language, you’ll notice only 1% of
coders releasing libraries. this works in our favour with respect to
robustness.

In Java, it’s not trivial to manipulate bytecode. It’s usually done by
experts, although there are frameworks out there that make it easier.
It’s a lot more work than writing

  class Array
    def clobbersomething
    end
  end

the same can be said of meta-programming and manipulating built-ins in
ruby. i’ve spoken on both subjects a few times and have many libraries
out there that do some of each or both - my perception has never been
that even the above-average ruby hacker is doing tons of either,
especially without thinking about it.

You are not going to get rank beginners to use Boost::anything. It
doesn’t even compile on all major compilers (e.g., only the most recent
versions of HP-UX aCC support partial template specialization), and I
defy any average intern to decipher C++ compile-time error messages
relating to template problems.

no argument there. the point, though, was that boost and many other
powerful tools do indeed subvert the safety systems of their respective
languages. as in ruby, great power is dangerous.

It sure is a lot more work than writing

  class Array
    def clobbersomething
    end
  end

one thing to consider, however, is also how easy it would be to debug
such an error. it’d literally be

  Array.freeze
  require 'clobbersomething.rb' #=> beautiful stack-trace

this is no small point. as someone who doesn’t even write c code
without firing up gcc, i can assure you that, in 6 years of full-time
ruby hacking, i’ve never pulled those week-long
sinking-feeling-in-the-pit-of-your-stomach marathon gdb sessions i used
to routinely pull when i worked with c-- and c more often!

we’ve all corrupted the heap before… :wink:

Please, I am not Ruby-bashing. I know there are far worse languages out
there from the point of view of shooting yourself in the foot (I’ve
used enough of them that I’m lucky even to have any feet left :). Any
language that provides a lot of power can be misused. Ruby simply makes
this one thing so easy to do, with such potentially dire consequences.
If anything, I’d just want an optional way (besides .freeze) to control
what can and can’t be extended at run-time, to prevent accidental,
ill-advised, or even malicious tampering, or to help identify it. Hey,
maybe there’s a way to do it already?

i think everyone is on the same page with you here, and your comments
are certainly well taken. i’ll leave it to others to comment on
potential solutions.

kind regards.

-a

On 12/4/06, Edwin F. [email protected] wrote:

James G. wrote:

I’m with Ruby’s natural tendency on this issue: trust the programmer.

Sorry, but I don’t think it’s enough to trust the programmer - at
least, most of them. For every thoughtful, careful, circumspect
programmer out there, there are 100 or more clueless, undisciplined,
lazy, and/or just plain stupid programmers (of course, none of the
people on this forum fall into this category :slight_smile:)

You either trust the programmer or you don’t. There’s no in-between,
no way to trust only some programmers. Last I checked, no compiler or
interpreter has an IQ test during the installation.

If you think the computer is smarter than you or your coworkers, then
Ruby isn’t for you. If you’re smarter than the computer, then it’s
probably a good match.

Pat

Such shameless self-promotion… Go to

as penance :wink:

Fred
(well… Guilty as charged :wink: )