Heist wrote:
SELECT DAY(date_time) "day", COUNT(*) "count"
FROM table
WHERE (id = some_id)
AND (YEAR(date_time) = 2006)
AND (MONTH(date_time) = 8)
AND (some_string IN ('some short string'))
GROUP BY DAY(date_time);
Around 6. select_all gave results similar to SQLyog (around 15
seconds).
It’s not possible to use YEAR(date_time) as an index, is it?
Nope. You're losing the ability for the query optimizer to take advantage
of indexes by wrapping your column in a function. Functions should always
go on the right side of a comparison in a query, so you can take advantage
of indexes.
The way you've written the query above forces a FULL table scan every time.
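To illustrate, one way to keep the comparison index-friendly is to compare
the bare date_time column against a date range instead of wrapping it in
YEAR()/MONTH(). This is just a sketch, assuming date_time is a DATETIME
column with an index on (id, date_time):

```sql
-- Compare the raw column against a range so an index on
-- (id, date_time) can be used; no function on the left side.
SELECT DAY(date_time) "day", COUNT(*) "count"
FROM table
WHERE id = some_id
  AND date_time >= '2006-08-01'
  AND date_time <  '2006-09-01'
  AND some_string IN ('some short string')
GROUP BY DAY(date_time);
```

This returns the same per-day counts for August 2006 but gives the
optimizer a plain range predicate to work with.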
If I alter
my table to have different columns for the year, month and day, will
there be a good performance gain?
Yes, and add indexes on those columns.
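If you go that route, the schema change might look something like the
sketch below. The column and index names here are hypothetical; adjust
them to your actual schema:

```sql
-- Hypothetical split-date columns plus a composite index that
-- covers the id + year + month filters from the query above.
ALTER TABLE table
  ADD COLUMN yr SMALLINT,
  ADD COLUMN mo TINYINT,
  ADD COLUMN dy TINYINT;

CREATE INDEX idx_table_id_yr_mo ON table (id, yr, mo);
```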
And while this is all good advice, I still don’t understand why Rails
would have such problems with the query.
ActiveRecord doesn't add much overhead for running a query.
ActiveRecord thinly wraps a MySQL database connection. The
overhead that ActiveRecord adds is in instantiating all of the results
into ActiveRecord model instances. This is why in the first
response to your post I commented on NOT trying to query results where
the result set was a million objects.
I don't think you're seeing an ActiveRecord issue here. Now that I read
your query more closely, the query needs to be rewritten so that it can
take advantage of indexes.
Zach