AR-extensions didn't speed things up -- why not?

I was excited to try AR-extensions (http://www.continuousthinking.com/tags/arext) to do a bunch of data importing that had been really slow. But then I was disappointed to see that it actually didn't help, and wondered if I was missing something.

I’m finding that it actually goes faster to do

batch.each { |l| Quote.create(:date => l[0], :company_id => l[1], :daily_return => l[2]) }

than to do

Quote.import([:date, :company_id, :daily_return], batch)

In the script/server log when I do Quote.import I get an INSERT
statement for each line of the input, like this:

SQL (0.005698) INSERT INTO "quotes" ("date","company_id","daily_return") VALUES('20040803', 3210, '0.002108')
SQL (0.008398) INSERT INTO "quotes" ("date","company_id","daily_return") VALUES('20040804', 3210, '-0.004628')

and when I do a bunch of iterated #create calls I get statements like
this:

Quote Create (0.000774) INSERT INTO "quotes" ("daily_return", "price", "date", "company_id", "abnormal_return", "volume") VALUES (0.020922, NULL, '2008-11-28', 428, NULL, NULL)
Quote Create (0.000689) INSERT INTO "quotes" ("daily_return", "price", "date", "company_id", "abnormal_return", "volume") VALUES (-0.026083, NULL, '2008-12-01', 428, NULL, NULL)

So the import method is issuing the same number of SQL INSERT statements; here the iterated #create statements happen to be faster, although I'm not sure why that is. I expected the import method to speed things up by combining the inserts into one statement, or at least one commit.

Is this the expected behavior?

Thanks,
Andy

On Feb 25, 10:25 pm, andy [email protected] wrote:

Is this the expected behavior?

What database are you using?

On Mar 4, 11:32 am, Andy E. [email protected] wrote:

Thanks for getting back to me, Zach.

I’m using sqlite3. I had seen somewhere that ar-extensions worked okay
with sqlite. Was I wrong?

It works fine with sqlite, but AFAIK sqlite doesn’t provide a
mechanism for multiple inserts, so it defaults back to one record per
insert.

If you know of a way to mold sqlite for efficient importing of data I
am all ears.

Zach
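To illustrate Zach's point, here is a hypothetical sketch of the kind of single multi-row INSERT that ar-extensions can generate on adapters that support the syntax (e.g. MySQL). The column names and values are taken from the original post, but the SQL-building code is purely illustrative, not ar-extensions' actual implementation, and naive string interpolation like this is not injection-safe; it only shows the statement shape. (Later SQLite versions, 3.7.11 and up, do accept multi-row VALUES, but that postdates this thread.)

```ruby
# Illustrative only: build the single multi-row INSERT that an adapter
# with multi-value support could execute, instead of one INSERT per row.
columns = [:date, :company_id, :daily_return]
batch   = [['20040803', 3210, 0.002108],
           ['20040804', 3210, -0.004628]]

# Quote strings, leave numbers bare, and wrap each row in parentheses.
values = batch.map { |row|
  '(' + row.map { |v| v.is_a?(String) ? "'#{v}'" : v.to_s }.join(', ') + ')'
}.join(', ')

sql = "INSERT INTO quotes (#{columns.join(', ')}) VALUES #{values}"
puts sql
# => INSERT INTO quotes (date, company_id, daily_return) VALUES ('20040803', 3210, 0.002108), ('20040804', 3210, -0.004628)
```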

Well, that solves it. And no, I don't know how to do more efficient importing of data with sqlite.

Thanks,
Andy

On Wed, Mar 4, 2009 at 10:56 AM, zdennis [email protected] wrote:

If you know of a way to mold sqlite for efficient importing of data I
am all ears.

Wrap your inserts in a transaction.


Greg D.
http://destiney.com/
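In ActiveRecord terms, Greg's suggestion might look like the following sketch (Quote and batch are from the original post; this fragment assumes a Rails/ActiveRecord environment, so it won't run standalone):

```ruby
# Sketch of Greg's suggestion: wrap the whole batch in one transaction,
# so sqlite performs a single commit instead of one per inserted row.
Quote.transaction do
  batch.each do |l|
    Quote.create(:date => l[0], :company_id => l[1], :daily_return => l[2])
  end
end
```

The speedup comes from sqlite syncing to disk once per transaction rather than once per statement; each INSERT is still issued separately.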

If you're inserting a lot of rows, it can also be helpful to wrap the
whole operation in an ActiveRecord::Base.benchmark call, like so:

Quote.benchmark("Inserting stuff") do
  # insert stuff
end

This makes your log file more readable, and saves a lot of time (not
generating all the log entries).

--Matt J.

Greg D. wrote:

Wrap your inserts in a transaction.

Reference for future readers: a transaction does the trick. An order
of magnitude faster.
