Rails action using an insane amount of MySQL/CPU

Hey guys,

I have a very large rails app running on 4 instances of mongrel on a P4
server with 1GB RAM. Not the absolute best setup, but, the server has
been optimized and the application has been running extremely fast for
the past few months.

I’m having one serious problem, however… There’s a specific action that
drives MySQL to 99% CPU on the server and, in most cases, doesn’t even
load; I end up with a “520 Proxy Error, Reason: Error reading from
remote server” error from mongrel.

The action in question is really like any other: it grabs images tagged
with a specific tag and paginates them:

@image_pages, @images = paginate :images,
  :include    => "tags",
  :conditions => ["tags.name = ?", params[:tag]],
  :order_by   => "images.created_at DESC",
  :per_page   => 12

I can’t figure out what’s going on, it seems to run fine on my
localhost, but simply kills the production server. The rails log looks
normal, the query is very large after the associations, but I have
similar queries that run just fine.

How could I go about figuring out what the problem is?

Any help would be extremely appreciated!


How many images/tags are in the database? paginate is not very
efficient when it comes to large datasets, as it grabs them all (at
least that’s my memory).

I’d take a look at the dev log and the SQL being generated and see
what it is that’s going on. Perhaps pass those queries into mysql
prefixed with "EXPLAIN " to see if it’s using your indexes or not…

-philip

On 6/7/07, Danny C. [email protected] wrote:


How could I go about figuring out what the problem is?
The problem is that paginate uses :limit and :offset which can’t work
well with eager joins (your :include => ‘tags’) since all the matching
rows (perhaps thousands or millions) have to be pulled into Ruby,
parsed, then limited and offset. To put it mildly, this does not
scale.

Remove the :include => "tags" to regain nearly all your performance,
at the minor cost of 12 additional queries per page to pull each
image’s tags.

Generally speaking, to troubleshoot database issues, look at the slow
queries in your production.log and use EXPLAIN in MySQL to
see why they’re performing poorly. Luckily, in this case, it’s just a
matter of returning way too much data.

jeremy
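To make the scaling difference Jeremy describes concrete, here is a toy illustration in plain Ruby (the 4,000-row figure and both method names are made up for the example, not Rails internals): pushing LIMIT/OFFSET into SQL instantiates one page of rows, while the eager-join path materializes every matching row before slicing.

```ruby
MATCHING_ROWS = (1..4_000).to_a  # stand-in for rows matching tags.name = ?

# When :limit/:offset can be pushed into SQL, MySQL returns one page
# and only :per_page rows are ever instantiated in Ruby.
def sql_paged(page, per_page)
  offset = (page - 1) * per_page
  rows = MATCHING_ROWS[offset, per_page] || []  # "LIMIT per_page OFFSET offset"
  [rows, rows.size]    # [page of rows, rows materialized in Ruby]
end

# With the eager :include, every matching joined row is pulled into
# Ruby first, and the page is sliced out afterwards.
def eager_paged(page, per_page)
  all = MATCHING_ROWS.dup                          # whole result set crosses the wire
  page_rows = all[(page - 1) * per_page, per_page] || []
  [page_rows, all.size]  # same page, but all 4,000 rows materialized
end
```

Both return the same 12 images for page 1, but the second materializes all 4,000 rows on every request, which is why dropping the :include restores performance.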

Danny,

1/
Try out the query analyzer plugin :
http://agilewebdevelopment.com/plugins/query_analyzer

In no time you’ll know if you are missing some obvious indexes, the
most common cause of DB slowness.

… and read this :
Indexing for DB performance
http://weblog.jamisbuck.org/2006/10/23/indexing-for-db-performance

Alain R.

http://blog.ravet.com

I would also recommend using the paginator gem instead of the built-in
pagination of Rails. It is much more efficient. Since I swapped them
out, I’ve had no problems.

gem install paginator

documentation: http://paginator.rubyforge.org/
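For the curious, the gem’s core idea is that the pager hands your block an offset and a count so the block can run its own LIMIT/OFFSET query. Here is a self-contained pure-Ruby sketch of that callback style (TinyPager is a made-up name for illustration, not the gem’s actual class; see the documentation above for the real API):

```ruby
# Minimal sketch of offset-callback pagination. Pure Ruby so it runs
# standalone; the real paginator gem's interface differs in detail.
class TinyPager
  def initialize(total, per_page, &fetcher)
    @total, @per_page, @fetcher = total, per_page, fetcher
  end

  # Hand the block an offset and count; the block fetches just that page.
  def page(number)
    offset = (number - 1) * @per_page
    @fetcher.call(offset, @per_page)
  end

  def page_count
    (@total + @per_page - 1) / @per_page  # ceiling division
  end
end

DATA = (1..50).to_a  # stand-in for rows in the images table
pager = TinyPager.new(DATA.size, 12) do |offset, per_page|
  DATA[offset, per_page] || []  # stands in for a LIMIT/OFFSET find
end
```

In a Rails action the block would run a find with :limit and :offset instead of slicing an array, so only one page of rows is ever loaded.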


Amazing! Each post got me a bit closer to understanding the problem.

It was definitely missing indexes; there are about 4,000 tags in the
table, so it was doing a full table scan to find matching results.

I added 2 indexes on the images_tags table, on image_id and tag_id,
which seems to have drastically sped things up. However, there’s still
room for improvement.

The query analyzer plugin is awesome, since it breaks down all your
queries in the production log.

Thanks for all the help!
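For readers following along, the indexes Danny describes would look roughly like this as a Rails migration (table names are assumed from the thread; the tags.name index is an extra suggestion, since the :conditions clause filters on that column):

```ruby
class AddTaggingIndexes < ActiveRecord::Migration
  def self.up
    # the join-table indexes Danny added
    add_index :images_tags, :image_id
    add_index :images_tags, :tag_id
    # extra suggestion: the query filters on tags.name
    add_index :tags, :name
  end

  def self.down
    remove_index :images_tags, :image_id
    remove_index :images_tags, :tag_id
    remove_index :tags, :name
  end
end
```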

That would work; just be sure to update the links for next/prev with
the new offset and limit.

Chris H. wrote:

I would also recommend using the paginator gem instead of the built-in
pagination of Rails. […]

Have you had any experience using this in conjunction with Ajax?

I assume you could just change the “link_to next/prev page” to a
link_to_remote, and reload a div with the actual items.

Also, if you store all your images in the database, that could be the
main cause of the slowness: sending back that much data can take a
while, even if the actual query is quick. I don’t know if it would be
possible, but consider using Amazon S3 if your number of images and
number of requests will keep increasing. Then just keep a reference to
the filenames in the database, which should improve efficiency further.
