Slow ActiveRecord - MySQL

I use find_by_sql to execute a query that takes around 120 seconds to
run in Ruby. When I execute the same request from SQLyog (a GUI client),
it takes around 15 seconds.

Here’s roughly what the query looks like:

SELECT DAY(date_time) "day", COUNT(*) "count"
FROM table
WHERE (id = some_id)
AND (YEAR(date_time) = 2006)
AND (MONTH(date_time) = 8)
AND (some_string IN ('some short string'))
GROUP BY DAY(date_time);

I’m querying about a million records.
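One thing worth checking on the MySQL side: wrapping the column in YEAR()/MONTH() prevents MySQL from using an index on date_time, while an equivalent half-open date range does not. A small sketch of building that range in Ruby — the query text is adapted from the post above, and `some_id` stays a placeholder:

```ruby
require "date"

year, month = 2006, 8
month_start = Date.new(year, month, 1)
month_end   = month_start.next_month   # exclusive upper bound

# Same aggregation as the original query, but with an index-friendly
# half-open range on date_time instead of YEAR()/MONTH().
sql = <<~SQL
  SELECT DAY(date_time) "day", COUNT(*) "count"
  FROM table
  WHERE (id = some_id)
  AND (date_time >= '#{month_start}' AND date_time < '#{month_end}')
  AND (some_string IN ('some short string'))
  GROUP BY DAY(date_time);
SQL
```

Whether this explains the 120-second figure is a separate question, but it should help the raw 15 seconds too if date_time is indexed.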


Heist wrote:

AND (MONTH(date_time) = 8)
AND (some_string IN ('some short string'))
GROUP BY DAY(date_time);

I’m querying about a million records.

Do you need all of the million records in memory at once? You’re
incurring the overhead of ActiveRecord instantiating one million objects
and keeping them in memory, and your program is probably ending up in
swap, which is forcing it to take so long.

I often query a few million records, but I don’t pull them all into
memory at once; it’s too inefficient, and not just in Ruby.

Zach
ActiveRecord::Extensions - http://www.continuousthinking.com/are


On 11/9/06, zdennis [email protected] wrote:

Do you need all of the million records in memory at once? You’re
incurring the overhead of ActiveRecord instantiating one million objects
and keeping them in memory, and your program is probably ending up in
swap, which is forcing it to take so long.

I often query a few million records, but I don’t pull them all into
memory at once; it’s too inefficient, and not just in Ruby.

I think he’s saying there are a million rows in the table. A million
results grouped by day would mean he has records spanning nearly three
millennia :wink:

jeremy

On Nov 9, 2006, at 11:14 PM, zdennis wrote:

I often query a few million records, but I don’t pull them all into
memory at once; it’s too inefficient, and not just in Ruby.

What do you do then? Paginate by hand?

– fxn

On Nov 9, 2006, at 11:14 PM, zdennis wrote:

I often query a few million records, but I don’t pull them all into
memory at once; it’s too inefficient, and not just in Ruby.

What do you do then? Paginate by hand?

Yes :slight_smile:
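For anyone curious what paginating by hand looks like in practice, here’s a minimal sketch of a LIMIT/OFFSET batch loop. The database call is stubbed with an in-memory array so the sketch runs standalone; in a real app `fetch_batch` would be something like `Model.find(:all, :limit => limit, :offset => offset)`, or a `find_by_sql` with LIMIT/OFFSET appended.

```ruby
BATCH_SIZE = 1000

# Stand-in for the real table; with ActiveRecord, fetch_batch would issue
# a query with LIMIT/OFFSET instead of slicing an array.
ALL_ROWS = (1..2500).to_a

def fetch_batch(limit, offset)
  ALL_ROWS[offset, limit] || []
end

def process_in_batches(batch_size = BATCH_SIZE)
  processed = 0
  offset = 0
  loop do
    rows = fetch_batch(batch_size, offset)
    break if rows.empty?
    rows.each { |_row| processed += 1 }  # work on one small batch at a time
    offset += rows.size
  end
  processed
end

process_in_batches  # => 2500
```

Only one batch of objects is alive at a time, so memory stays flat no matter how many rows the table holds.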


Jeremy K. wrote:

I think he’s saying there are a million rows in the table. A million
results grouped by day would mean he has records spanning nearly three
millennia :wink:

My bad. =)

Zach