I’m fairly new to Ruby and don’t do much programming, but when I saw
this example I was surprised that the default behavior is that 3.2 -
2.0 != 1.2.
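Here is what I see in irb (standard Ruby Floats; the inspect output shows the shortest digits that round-trip):

  3.2 - 2.0          # => 1.2000000000000002
  3.2 - 2.0 == 1.2   # => false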
To me, this violates the “Principal of least surprise”,
Do you mean the “principle of least surprise”? One word means primary; the
other means a standard or canon. They sound the same, but that’s all they
have in common.
but I guess it
isn’t a big deal because I don’t remember it being discussed in the
Programming Ruby book (but it certainly may have been).
Do other languages work this way?
All of them that use binary internal storage, yes. A decimal value like
3.2 is a repeating (non-terminating) fraction in binary and cannot be
precisely represented in a finite number of binary digits. Sort of like
1/3 in either binary or decimal.
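You can see the rounded values Ruby actually stores, and one way to get exact decimal math, with something like this (a sketch using the standard BigDecimal library; the printed digits are what IEEE 754 doubles give):

  require "bigdecimal"

  # Each Float literal is rounded to the nearest representable binary
  # value, so the two sides of the comparison land on different doubles.
  printf("%.20f\n", 3.2 - 2.0)   # 1.20000000000000017764
  printf("%.20f\n", 1.2)         # 1.19999999999999995559

  # BigDecimal keeps decimal digits exactly, so this comes out true.
  puts BigDecimal("3.2") - BigDecimal("2.0") == BigDecimal("1.2")   # true

Float is still the default because hardware floats are far faster; BigDecimal trades speed for exact decimal behavior.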