Forum: Ruby on Rails
How do I find what's using so much memory?

Announcement (2017-05-07): www.ruby-forum.com is now read-only since I unfortunately do not have the time to support and maintain the forum any more. Please see rubyonrails.org/community and ruby-lang.org/en/community for other Rails- and Ruby-related community platforms.
Pat M. (Guest)
on 2007-07-06 08:19
(Received via mailing list)
I've got an app that is eating up lots of memory.  When it first
starts up, it uses ~125 megs for a mongrel process, which isn't
terrible I suppose.  However after being used for a while it balloons
to 250-350 megs.  It would appear there's a big memory leak somewhere,
but I have no clue where to even begin looking.  Any ideas?

Pat
snacktime (Guest)
on 2007-07-06 09:34
(Received via mailing list)
On 7/5/07, Pat M. <removed_email_address@domain.invalid> wrote:
>
> I've got an app that is eating up lots of memory.  When it first
> starts up, it uses ~125 megs for a mongrel process, which isn't
> terrible I suppose.

That's huge. Are you sure that's RAM and not virtual memory?

> However after being used for a while it balloons
> to 250-350 megs.  It would appear there's a big memory leak somewhere,
> but I have no clue where to even begin looking.  Any ideas?
>

Pin down when it started and then find out what changed at that time.
That's where I would start.  Normally I have at least an idea of where
something like that is coming from.  Do you have large data sets you
are processing in some way, or retrieving through activerecord?

Chris
Florian G. (Guest)
on 2007-07-06 09:48
Pat M. wrote:
> I've got an app that is eating up lots of memory.  When it first
> starts up, it uses ~125 megs for a mongrel process, which isn't
> terrible I suppose.  However after being used for a while it balloons
> to 250-350 megs.  It would appear there's a big memory leak somewhere,
> but I have no clue where to even begin looking.  Any ideas?
>
> Pat

Do you use RMagick? If so, try calling GC.start to run garbage
collection regularly.

Just to rule it out...
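In a Rails 1.2 app the usual place for such a call would be an after_filter in ApplicationController. The sketch below is plain Ruby (no RMagick involved) and only demonstrates that an explicit GC.start really does run a collection pass:

```ruby
# Hypothetical sketch: trigger a garbage-collection pass by hand.
# In a Rails controller this would look something like:
#   after_filter { GC.start }   # in ApplicationController
passes_before = GC.count   # number of GC runs so far in this process
GC.start                   # force a full collection now
passes_after  = GC.count
puts passes_after > passes_before   # => true: a collection pass actually ran
```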

Greetings
Skade
Pat M. (Guest)
on 2007-07-06 12:15
(Received via mailing list)
On 7/5/07, snacktime <removed_email_address@domain.invalid> wrote:
>
> On 7/5/07, Pat M. <removed_email_address@domain.invalid> wrote:
> >
> > I've got an app that is eating up lots of memory.  When it first
> > starts up, it uses ~125 megs for a mongrel process, which isn't
> > terrible I suppose.
>
> That's huge. Are you sure that's RAM and not virtual memory?

I'm checking it out on my MBP, watching the process with the Activity
Monitor tool.

When I first start it up, it uses 32 megs of Real Memory, 29 of
Private, and 77 of Virtual.


> > However after being used for a while it balloons
> > to 250-350 megs.  It would appear there's a big memory leak somewhere,
> > but I have no clue where to even begin looking.  Any ideas?
> >
>
> Pin down when it started and then find out what changed at that time.
> That's where I would start.  Normally I have at least an idea of where
> something like that is coming from.  Do you have large data sets you
> are processing in some way, or retrieving through activerecord?

Here's one line where I found something weird:

Company.top(10).collect {|c| c.complete_videos.size}

class Company < ActiveRecord::Base
  def self.top(n)
    find_by_sql ["SELECT companies.*, COUNT(videos.company_id) AS num_videos
                  FROM companies
                  LEFT JOIN videos ON companies.id = videos.company_id
                  WHERE videos.status = 'complete' AND videos.deleted_at IS NULL
                  GROUP BY companies.id, companies.name, companies.nickname,
                           companies.plan_id, companies.deleted_at
                  ORDER BY num_videos DESC
                  LIMIT ?", n]
  end

  def complete_videos(options = {})
    @complete_videos ||= Video.find(:all, { :conditions =>
      "company_id=#{id} AND status='complete'" }.merge(options))
  end
end

When I run that line in console, I get the proper results, and there's
no memory spike.
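A plain-Ruby sketch of why `complete_videos.size` is expensive here: it materializes every matching row just to count it. The data below is made up; in ActiveRecord 1.2 terms the analogue of the second approach would be `Video.count(:conditions => ...)` instead of `find(:all, ...).size`.

```ruby
# Illustrative rows standing in for the videos table (hypothetical shape).
rows = Array.new(5_689) { |i| { :company_id => i % 10, :status => 'complete' } }

# find(:all).size style: build an array of every matching "record",
# keep it referenced, then ask for its length.
loaded = rows.select { |r| r[:company_id] == 2 }
loaded_size = loaded.size

# count style: tally matches without retaining an intermediate collection.
counted = rows.inject(0) { |n, r| r[:company_id] == 2 ? n + 1 : n }

puts loaded_size == counted   # => true, but only the first pins 569 objects
```

Same answer either way; the difference is that the count-style pass never keeps the matched objects alive.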

One thing that's really weird though is that if I run
2.times { Company.top(10).collect {|c| c.complete_videos.size} }

then it jumps to 48/45/93 and never comes back down.

My only guess is that the companies and videos are kept in a closure
and not released.

In the rails app itself, the offending code is:
<% @top_companies.each do |c| %>
  <tr>
    <td><%= link_to c.name, company_url(c) %></td>
    <td><%= c.complete_videos.size %></td>
  </tr>
<% end %>

I'm not doing anything like 2.times there.

One thing I have checked out is
http://scottstuff.net/blog/articles/2006/08/17/mem...
When I collect the top companies' video sizes, the number of Video
objects increases by 5689, but I never see it come back down.
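The object counting a MemoryProfiler-style tool does can be reproduced directly with ObjectSpace. A minimal sketch; the `Dummy` class is made up and stands in for the app's Video class:

```ruby
# Count live instances of a class the way memory profilers do.
class Dummy; end

GC.start
before = ObjectSpace.each_object(Dummy).count
kept = Array.new(100) { Dummy.new }   # references held, so they stay live
GC.start
after = ObjectSpace.each_object(Dummy).count
puts after - before   # => 100 while `kept` is still referenced
```

If the count never drops after the references should be gone, something (an instance variable, a closure, a cache) is still holding them.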

I just noticed that if I run
Company.find(2).complete_videos.size

twice in quick succession, the memory jumps up to 49/46/94.

So maybe it has to do with how quickly a connection is being made to
the database?

I'm using Rails 1.2.3, Ruby 1.8.6, and PostgreSQL 8.2.3.

Pat
Pat M. (Guest)
on 2007-07-06 12:25
(Received via mailing list)
On 7/6/07, Pat M. <removed_email_address@domain.invalid> wrote:
> So maybe it has to do with how quickly a connection is being made to
> the database?

I want to elaborate on this a tad.  When I run
Company.find(2).complete_videos.size, the MemoryProfiler class reports
3744 new Video objects (which is what complete_videos.size returns).
I never see the number be reduced.  However, the memory usage doesn't
change at all.

When I run 2.times { Company.find(2).complete_videos.size }, it
reports another 3744 Videos for a total of 7488.  Memory usage
increases slightly to 35/32/81.

When I run 2.times { Company.find(2).complete_videos.size } one more
time, there's no change in Video objects, but the usage jumps up to
50/47/95.
Pat M. (Guest)
on 2007-07-06 13:08
(Received via mailing list)
On 7/6/07, Pat M. <removed_email_address@domain.invalid> wrote:
> So maybe it has to do with how quickly a connection is being made to
> the database?
>
> I'm using Rails 1.2.3, Ruby 1.8.6, and PostgreSQL 8.2.3.

I just dumped all the data into MySQL and tried it; same issue.  So
it's not a MySQL vs PostgreSQL thing.

Pat
Pat M. (Guest)
on 2007-07-07 01:50
(Received via mailing list)
On 7/5/07, snacktime <removed_email_address@domain.invalid> wrote:
> Do you have large data sets you are processing in some way, or
> retrieving through activerecord?

You specifically asked about this, so perhaps you know something I
don't...but anyway, some of the data sets are in the 10k range, and
yes they're coming through activerecord.

I don't understand why the memory isn't being freed up though.

Am I just misunderstanding how Ruby's memory allocation works?  If a
mongrel needs 300 megs, then it'll hold onto that forever?  In that
case do I just need to allow very high memory limits?  That doesn't
seem right to me...

Pat
Ezra Z. (Guest)
on 2007-07-07 01:56
(Received via mailing list)
On Jul 6, 2007, at 2:49 PM, Pat M. wrote:

>
> Am I just misunderstanding how Ruby's memory allocation works?  If a
> mongrel needs 300 megs, then it'll hold onto that forever?  In that
> case do I just need to allow very high memory limits?  That doesn't
> seem right to me...
>
> Pat


  You are going to see massive memory usage like this any time you load
more than 500-1000 ActiveRecord objects at once. There is quite a
bit of memory overhead per AR object instantiation, so Ruby allocates
a bunch of RAM to hold the records. Once the dataset is no longer
needed, Ruby doesn't like to give RAM back to the OS; instead it tends
to keep the memory around for later use by the interpreter.

  Do you actually need 10k AR objects at once? You are going to need a
lot of RAM to handle result sets like that. If there is any way to do
more of the calculation in the db itself and pull smaller sets at a
time, you will keep your memory usage lower. But 10k ARs is going to
eat RAM hard.
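This behavior is visible directly in MRI's GC statistics. A sketch using GC.stat, which arrived in Rubies much later than the 1.8.6 in this thread, so it is illustrative rather than something that could have been run at the time:

```ruby
# Allocate a large transient dataset, then drop it and collect.
big = Array.new(300_000) { "x" * 40 }
GC.start
pages_while_live = GC.stat[:heap_allocated_pages]

big = nil     # drop the only reference
GC.start      # slots are reclaimed for reuse by the interpreter...
pages_after = GC.stat[:heap_allocated_pages]

# ...but the heap pages largely stay with the process rather than going
# back to the OS, which is why RSS stays high after a spike.
puts pages_while_live, pages_after
```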

Cheers-

-- Ezra Z.
-- Lead Rails Evangelist
-- removed_email_address@domain.invalid
-- Engine Y., Serious Rails Hosting
-- (866) 518-YARD (9273)
Pat M. (Guest)
on 2007-07-07 02:05
(Received via mailing list)
On 7/6/07, Ezra Z. <removed_email_address@domain.invalid> wrote:
> > don't...but anyway, some of the data sets are in the 10k range, and
>
>
>         You are going to see massive memory usage like this any time you load
> more than 500-1000 ActiveRecord objects at once. There is quite a
> bit of memory overhead per AR object instantiation, so Ruby allocates
> a bunch of RAM to hold the records. Once the dataset is no longer
> needed, Ruby doesn't like to give RAM back to the OS; instead it tends
> to keep the memory around for later use by the interpreter.

Okay, that's what I suspected.  I just wasn't sure about that.


>         Do you actually need 10k AR objects at once? You are going to need a
> lot of RAM to handle result sets like that. If there is any way to do
> more of the calculation in the db itself and pull smaller sets at a
> time, you will keep your memory usage lower. But 10k ARs is going to
> eat RAM hard.

You're right, we definitely don't.  There's plenty of nasty code that
I need to get cleaned up :)

Thanks for the explanation.

Pat
Ezra Z. (Guest)
on 2007-07-07 02:21
(Received via mailing list)
On Jul 6, 2007, at 3:04 PM, Pat M. wrote:

>
>
> Thanks for the explanation.


  You may find this explains the GC a bit more:

http://whytheluckystiff.net/articles/theFullyUptur...

Cheers-

-- Ezra Z.
-- Lead Rails Evangelist
-- removed_email_address@domain.invalid
-- Engine Y., Serious Rails Hosting
-- (866) 518-YARD (9273)
Pat M. (Guest)
on 2007-07-07 03:36
(Received via mailing list)
On 7/6/07, Ezra Z. <removed_email_address@domain.invalid> wrote:
> >> needed ruby doesn't like to give ram back to the OS, instead it tends
> >> ram hard.
> >
> > You're right, we definitely don't.  There's plenty of nasty code that
> > I need to get cleaned up :)
> >
> > Thanks for the explanation.
>
>
>         You may find this explains the GC a bit more:
>
> http://whytheluckystiff.net/articles/theFullyUptur...

Thanks.  One thing I don't see addressed is what happens to the
memory when the GC runs.  My current understanding is that it's just
cleared out so that Ruby can store later objects in it, rather than
being returned to the OS.  So if I have a big query that bumps the
usage up to 250 megs, after GC runs the ruby process may only need
125, but it's still holding onto the full 250.  Is that accurate?

Pat
Massimo S. (Guest)
on 2007-08-20 01:14
Pat M. wrote:
> On 7/6/07, Ezra Z. <removed_email_address@domain.invalid> wrote:
>> >> needed ruby doesn't like to give ram back to the OS, instead it tends
>> >> ram hard.
>> >
>> > You're right, we definitely don't.  There's plenty of nasty code that
>> > I need to get cleaned up :)
>> >
>> > Thanks for the explanation.
>>
>>
>>         You may find this explains the GC a bit more:
>>
>> http://whytheluckystiff.net/articles/theFullyUptur...
>
> Thanks.  One thing I don't see addressed is what happens to the
> memory when the GC runs.  My current understanding is that it's just
> cleared out so that Ruby can store later objects in it, rather than
> being returned to the OS.  So if I have a big query that bumps the
> usage up to 250 megs, after GC runs the ruby process may only need
> 125, but it's still holding onto the full 250.  Is that accurate?
>
> Pat

Pat,

I'm experiencing the same problem in my Rails application.
I occasionally need to retrieve about 10K records, and my application bumps
to 400MB... and never releases the memory again.
Did you find a way to solve this problem, or do I have to rewrite my code?
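For what it's worth, Ezra's "pull smaller sets at a time" advice from earlier in the thread can be sketched in plain Ruby. With Rails 1.2 the ActiveRecord side would be repeated find(:all, :limit => ..., :offset => ...) calls (find_each did not exist yet); the helper name below is made up:

```ruby
# Hypothetical batching helper: visit a large dataset in fixed-size slices
# so only one slice's worth of objects is live at any moment.
def each_batch(total, batch_size)
  offset = 0
  while offset < total
    # ActiveRecord analogue: find(:all, :limit => batch_size, :offset => offset)
    batch = (offset...[offset + batch_size, total].min).to_a
    yield batch
    offset += batch_size
  end
end

sizes = []
each_batch(10_000, 3_000) { |b| sizes << b.size }
puts sizes.inspect   # => [3000, 3000, 3000, 1000]
```

Each slice becomes garbage as soon as the block finishes with it, so peak memory tracks the batch size rather than the full 10K result set.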

Thanks
Massimo
This topic is locked and can not be replied to.