Forum: Ruby
Topic: Ruby performance

Announcement (2017-05-07): www.ruby-forum.com is now read-only since I unfortunately do not have the time to support and maintain the forum any more. Please see rubyonrails.org/community and ruby-lang.org/en/community for other Rails- and Ruby-related community platforms.
Keith Sader (Guest)
on 2006-03-22 14:51
(Received via mailing list)
I'm considering using Ruby to rewrite an extract, transform, and
load (ETL) process for an online database.  This will replace an
existing VB system that does most of the T of the ETL in T-SQL
(don't ask).

My replacement choices come down to .NET (C# or VB), Perl, and Ruby.
Since we can have up to 3 million updates to the database during the
day, performance is an issue.  Does Ruby perform as well as Perl at
large text-file transformations?  Does C#, for that matter?

At this point my gut feeling is to write the ETL in Ruby and port it
to Perl if performance becomes an issue.

Any thoughts?

thanks,
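For a sense of what the Ruby side of such a job might look like, here is a minimal streaming-transform sketch; the pipe-delimited layout and the upcase/tab-join transform are made-up stand-ins, not anything from the actual system:

```ruby
# Minimal sketch of a line-by-line transform; the pipe-delimited
# input layout and the upcase/tab-join output are assumptions.
def transform_line(line)
  line.chomp.split('|').map { |f| f.strip.upcase }.join("\t")
end

# Sequential read/write keeps memory use flat even on very large files:
# File.open('out.txt', 'w') do |out|
#   File.foreach('in.txt') { |line| out.puts transform_line(line) }
# end
puts transform_line("acme corp | 42 | active\n")
```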
pat eyler (Guest)
on 2006-03-22 15:06
(Received via mailing list)
On 3/22/06, Keith Sader <ksader@gmail.com> wrote:
> it to Perl if performance becomes an issue.
Write it in Ruby, then if you have performance issues -- profile,
benchmark, and optimize.   If you really get stuck RubyInline will
be your friend *far* more than Perl ever would.
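For the benchmarking step, Ruby's standard Benchmark library is the usual first stop. A minimal sketch of comparing two candidate implementations of the same step (the split-vs-regexp one-liners here are illustrative stand-ins for real transformation code):

```ruby
require 'benchmark'

# Time two candidate implementations of the same transformation
# over the same synthetic input (stand-ins for real ETL code).
lines = Array.new(10_000) { "alpha|beta|gamma" }

t_split  = Benchmark.measure { lines.each { |l| l.split('|') } }
t_regexp = Benchmark.measure { lines.each { |l| l.scan(/[^|]+/) } }

puts format('split:  %.4fs', t_split.real)
puts format('regexp: %.4fs', t_regexp.real)
```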
Sascha Ebach (Guest)
on 2006-03-22 15:09
(Received via mailing list)
> Write it in Ruby, then if you have performance issues -- profile,
> benchmark, and optimize.   If you really get stuck RubyInline will
> be your friend *far* more than Perl ever would.

Unless he uses Perl::Inline :)

(Not advocating for Perl here, just remembered where RubyInline comes
from)

-Sascha Ebach
pat eyler (Guest)
on 2006-03-22 15:19
(Received via mailing list)
On 3/22/06, Sascha Ebach <se@digitale-wertschoepfung.de> wrote:
> > Write it in Ruby, then if you have performance issues -- profile,
> > benchmark, and optimize.   If you really get stuck RubyInline will
> > be your friend *far* more than Perl ever would.
>
> Unless he uses Perl::Inline :)

You'd still have to use Perl  ;^)

>
> (Not advocating for Perl here, just remembered where RubyInline comes from)
>

Well, zenspider's implementation is cool enough that it got Ingy to
look twice (and, iirc, to pull some ideas into the Perl version).
Robert Klemme (Guest)
on 2006-03-22 15:28
(Received via mailing list)
Keith Sader wrote:
> it to Perl if performance becomes an issue.
>
> Any thoughts?

Usually client tool performance is shadowed by DB performance and
network communication times.  Due to the nature of the DB, it has to
do more complex IO operations to store the data than the client,
which just reads or writes a plain text file sequentially.  I'd go
with Ruby.

Kind regards

	robert
Randy Kramer (Guest)
on 2006-03-22 20:26
(Received via mailing list)
Keith Sader wrote:
> Any thoughts?

Of course ;-)

3 million updates per day doesn't mean much to me.  With a little
arithmetic, that looks like a sustained average load of ~33 TPS.

Having said that, I don't know if that's reasonably within Ruby's
capabilities or not--I just think TPS (transactions per second) is a
more common metric for the performance of database applications.
(But, I could be wrong ;-)
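Spelled out, the arithmetic (assuming the updates are spread evenly over 24 hours, which gives a figure a shade above the ~33 quoted):

```ruby
# 3 million updates spread evenly over a 24-hour day.
updates_per_day = 3_000_000
tps = updates_per_day / (24.0 * 60 * 60)
puts format('~%.1f TPS sustained', tps)  # ~34.7 TPS
```

In practice the load is rarely uniform, so the peak rate the system must handle is likely higher than this average.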

As a Ruby newbie, unless it's grossly out of the ballpark, I'd
probably try to build it in Ruby first, then optimize, then consider
upgrading hardware, and then consider switching to another language.

Randy Kramer
Eric Kidd (Guest)
on 2006-03-22 20:50
(Received via mailing list)
On Mar 22, 2006, at 2:26 PM, Randy Kramer wrote:
> 3 million updates per day doesn't mean much to me.  With a little
> arithmetic,
> that looks like a sustained average load of ~ 33 TPS.

By an interesting coincidence, Rails sites tend to support about 30
hits/second/server on decent hardware, assuming they have to go all
the way to the database and render views. With action caching (which
bypasses the database and view rendering, but still runs Ruby code),
I've seen benchmarks in the 500 hits/second range.

So Ruby might very well be a plausible solution, depending on a
number of factors. Given the sweet simplicity of ActiveRecord, you
could even spend a couple of days building a prototype and seeing how
fast it goes. :-)

Cheers,
Eric
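One sketch of how a quick prototype can be turned into hits-per-second figures like the ones above; the block body is only a stand-in for one real insert-and-render cycle, not actual ActiveRecord code:

```ruby
require 'benchmark'

# Crude throughput harness: time n iterations of the work under test
# and report the rate. The hash construction below is a placeholder
# for one real unit of work (e.g. a database insert).
n = 1_000
elapsed = Benchmark.realtime do
  n.times { |i| { :title => 'Hello', :body => 'World', :id => i } }
end
puts format('~%.0f ops/sec', n / elapsed)
```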
Reid Thompson (Guest)
on 2006-03-22 22:57
(Received via mailing list)
Eric Kidd wrote:
>
> So Ruby might very well be a plausible solution, depending on a number
> of factors. Given the sweet simplicity of ActiveRecord, you could even
> spend a couple of days building a prototype and seeing how fast it
> goes. :-)
>
> Cheers,
> Eric
Simple test, run from within the RDE editor on Windows XP, with
119-odd processes running (Windows, Cygwin, etc.) and 1 GB RAM:

testog=# select version();
                                         version
------------------------------------------------------------------------------------------
 PostgreSQL 8.1.3 on i686-pc-mingw32, compiled by GCC gcc.exe (GCC) 3.4.2 (mingw-special)
(1 row)

10000 inserts in < 60 seconds = 167 tps.

testog=# truncate ogcomment;
TRUNCATE TABLE
testog=# select * from ogcomment;
 title | body | author | create_time | oid
-------+------+--------+-------------+-----
(0 rows)

I, [2006-03-22T16:44:48.183000 #3220]  INFO -- : Og uses the Psql store.
Wed Mar 22 16:44:49 Eastern Standard Time 2006
Wed Mar 22 16:45:44 Eastern Standard Time 2006
D, [2006-03-22T16:44:49.464000 #3220] DEBUG -- : Table ogcomment already exists
D, [2006-03-22T16:44:49.495000 #3220] DEBUG -- : PostgreSQL processing foreign key constraints
D, [2006-03-22T16:44:49.495000 #3220] DEBUG -- : PostgreSQL finished setting constraints. No action was taken in 0.00 seconds.
Completed(0)



testog=# select count(*) from ogcomment;
 count
-------
 10000
(1 row)

testog=# select min(create_time),max(create_time) from ogcomment;
         min         |         max
---------------------+---------------------
 2006-03-22 16:44:49 | 2006-03-22 16:45:44
(1 row)



require 'og'

class Comment
  property :title, String
  property :body, String
  property :author, String
  property :create_time, Time
end

og_psql = {
  :destroy => true,
  :store => :psql,
  :user => 'rthompso',
  :password => 'rthompso',
  :name => 'testog'
}

Og.setup(og_psql)

puts Time.now
# save 10000 objects in the database
1.upto(10000) do |i|
  c = Comment.new
  c.title = 'Hello'
  c.body = 'World'
  c.create_time = Time.now
  c.author = 'tml'
  c.save
end
puts Time.now
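For what it's worth, the min/max create_time values in the query output above pin the run down more precisely than the "< 60 seconds" estimate. Recomputing the rate from those logged timestamps:

```ruby
require 'time'

# Timestamps copied from the min(create_time)/max(create_time)
# query output above.
t0 = Time.parse('2006-03-22 16:44:49')
t1 = Time.parse('2006-03-22 16:45:44')
tps = 10_000 / (t1 - t0)
puts format('10000 inserts in %ds = ~%d tps', t1 - t0, tps)
```

That 55-second window works out to roughly 180 tps rather than 167, so the quoted figure is, if anything, conservative.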