Having just started with Rails, I’m probably going to ask an obvious
question. I’ve looked through the archives and googled every
combination of words relating to the subject at hand that I can think
of, and I’ve kind of got an answer, but I’d like to be absolutely sure
that the behaviour I’m seeing is the correct one.
How exactly does Rails/ActiveRecord cache metadata it fetches from the
database on table structure, column information, etc.? I’m under the
impression that it’s on a per-request basis – once it has the
metadata, ActiveRecord won’t fetch the same metadata again for the
rest of the request.
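For what it’s worth, the caching behaviour I’m describing could be pictured as simple memoization on the model class: the first call does the expensive lookup, later calls in the same process reuse the result. This is only a toy sketch of the idea, not the real ActiveRecord internals — the class and method names are made up:

```ruby
# Toy illustration of per-process metadata memoization (NOT real
# ActiveRecord internals; names here are invented for the sketch).
class FakeModel
  # Stand-in for the real metadata query against the database.
  def self.fetch_columns_from_db
    @queries = (@queries || 0) + 1   # count how often we "hit" the database
    %w[id title body]
  end

  # Memoized accessor: the lookup only runs the first time.
  def self.columns
    @columns ||= fetch_columns_from_db
  end

  def self.query_count
    @queries || 0
  end
end

FakeModel.columns      # first call triggers the "database" lookup
FakeModel.columns      # second call reuses the memoized result
FakeModel.query_count  # => 1
```

If the cache really is per-request, then presumably something resets `@columns` between requests, which would explain seeing the metadata queries repeat.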
Is this the only behaviour that ActiveRecord has out of the box, or is
there a way to store this data in a way that doesn’t require
constantly hitting the database for it? For many of my page requests,
I seem to be spending more time looking up column information than
fetching the actual data.
My setup looks like this:
- Rails 1.0
- PostgreSQL 8.1.3
- mod_ruby 1.2.4
- apache 1.3.34
I have switched over my app to use the production settings by setting
the appropriate environment variable in environment.rb, and I’ve
confirmed that it’s connecting to my production database by looking at
the production.log and the PostgreSQL logs, but I have noticed
absolutely no change in the number of queries compared to the
development environment. Shouldn’t I be noticing a drop-off in the
number of queries that involve looking up table structures and such?
(I realize it’s going to depend on my database structure, but it seems
odd that there’s been zero change from the development configuration.
To me, at least, and I will plainly admit I’m a noob to Rails, so my
view on the situation is obviously skewed.)
Is there any way to actually cache this metadata so I can cut down on
the number of queries that are being performed? Now that I’m in the
production environment, it doesn’t make sense (again, to me) that this
metadata is constantly being fetched on every request, when it could
really be fetched once and written down in some binary file that
ActiveRecord can go to rather than tying up the database. According to
PostgreSQL’s EXPLAIN ANALYZE output, the metadata queries are actually
taking longer to perform than the request queries themselves, and I’m
curious to know whether a cached version of the same data in a file
would be quicker than what I’m currently experiencing.
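The file-cache idea I have in mind would look something like the sketch below: marshal the column metadata to disk once, then load it back instead of re-querying. Again, `Column` here is a made-up stand-in struct, not ActiveRecord’s real column class — this is just the shape of the idea:

```ruby
require 'tempfile'

# Stand-in for ActiveRecord's column metadata objects (invented for
# this sketch; the real objects would be what you'd actually dump).
Column = Struct.new(:name, :sql_type, :default)

columns = [Column.new("id", "integer", nil),
           Column.new("title", "character varying(255)", nil)]

cache_file = Tempfile.new("columns_cache").path

# Write the metadata out once (say, at application start)...
File.open(cache_file, "wb") { |f| Marshal.dump(columns, f) }

# ...then on later requests read it back instead of hitting the database.
restored = File.open(cache_file, "rb") { |f| Marshal.load(f) }
```

Of course a cache like this would go stale whenever the schema changes, so it would need to be regenerated after every migration.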
Also, as a sidebar question dealing with performance, Rails seems to
like generating queries like this:
SELECT * FROM whatever WHERE (whatever.id = 9) LIMIT 1
Without the LIMIT clause, there’s an average time savings of about 10%
to 15% per query. There’s really no need for the LIMIT clause here,
since Rails should assume it’s only going to get at most a single
record returned when it’s querying against the primary key for the
table. 10% to 15% might not seem like much when we’re dealing in
queries that take less than a tenth of a second, but they can really
add up, and every little bit helps. (It’s strange that PostgreSQL’s
query optimizer doesn’t realize this either, so maybe I’m looking in
the wrong place for optimization…)
Wow, long first post. I’ll wrap it up by saying I’m enjoying working
with Rails and Ruby so far, and that y’all should go easy on a noob.