Hello,
Just playing around with some .NET interop and the Decimal type. When
I do puts, I expected it to give me the underlying number, or at
least to call to_string. Instead I got the following behaviour, with
the object type being output.
@d = System::Decimal.MinValue
=> #<System::Decimal:0x000005c>
puts @d
#<System::Decimal:0x000005e>
=> nil
puts @d.to_string
-79228162514264337593543950335
=> nil
Is this by design?
Thanks
Ben
Also, how would I compare decimals in Ruby, as Ruby doesn’t have a
decimal type built in?
I wanted to do something like
@d > '12312.123123'.to_d
However, that doesn’t appear to be possible. I attempted to use to_f,
but I got the following error:
ArgumentError: Object must be of type Decimal.
    from mscorlib:0:in `CompareTo'
    from E:\IronRuby\r156\src\IronRuby.Libraries\Extensions\IComparableOps.cs:24:in `<=>'
Any ideas?
Thanks
Ben
Originally, I had hacked in “decimal” to be our BigDecimal
representation. Now that we have a real BigDecimal (courtesy of Peter
Bacon D.), the CLR decimal type has become an orphan – it doesn’t
really have any direct support in IronRuby, so you just get the same
behavior that you’d get for any other random CLR object.
Let me pose this question to the list: which classes in the BCL (if any)
do you think should get the “Rubification” treatment?
Here are some helper methods I’ve used for making System::Decimal a
little more palatable:
class Object
  # Convert any object to a System::Decimal via its string form.
  def to_clr_dec
    System::Decimal.parse(self.to_s)
  end

  # Convert any object to a System::Int32 via its string form.
  def to_clr_int
    System::Int32.parse(self.to_s)
  end
end

class System::Decimal
  # Arithmetic delegates to the BCL's static Decimal methods.
  def +(other)
    System::Decimal.Add(self, other)
  end

  def -(other)
    System::Decimal.Subtract(self, other)
  end

  def *(other)
    System::Decimal.Multiply(self, other)
  end

  # Go through the CLR string representation to get a Ruby Float.
  def to_f
    self.to_string.to_s.to_f
  end

  # inspect should return a String, so convert the Float here; this is
  # what irb shows after "=>".
  def inspect
    to_f.to_s
  end
end
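Along the same lines, here is a rough, untested sketch of how one might
cover the comparison case from the first message. It assumes the static
System::Decimal.Compare and the instance ToString member are reachable
from IronRuby the same way Add/Subtract/Multiply are above, and that
including a Ruby module into a CLR class works as it does for ordinary
Ruby classes:

class System::Decimal
  include Comparable

  # Delegate ordering to the BCL's Decimal.Compare, which returns -1, 0
  # or 1; Comparable then supplies <, <=, ==, >= and > on top of this.
  def <=>(other)
    System::Decimal.Compare(self, other)
  end

  # Make puts print the numeric value instead of the default CLR object
  # dump.
  def to_s
    self.to_string.to_s
  end
end

With something like that in place, the earlier examples would become
puts System::Decimal.MinValue and @d > '12312.123123'.to_clr_dec.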
FWIW I’d personally rather see “rubification” done this way than in C#
libraries…