Forum: Ruby - Memory implications for persistent Proc objects

Farrel L. (Guest)
on 2008-11-09 00:14
(Received via mailing list)
What are the memory implications of keeping a Proc/closure around,
especially one defined at global scope? Is Ruby clever enough not to
keep objects alive in the Proc's closure when they are never referenced
inside it? I'm worried about the following scenario:

big_object = BigObjectNeedsLotsOfRAM.new
my_proc = Proc.new { ... } # big_object not referenced in my_proc
big_object = nil
my_proc.call

Will the memory allocated to big_object remain in use because it was
alive when my_proc was created, even though it is never referenced or
used in my_proc?

Farrel
Brian C. (Guest)
on 2008-11-09 10:11
Farrel L. wrote:
> big_object = BigObjectNeedsLotsOfRAM.new
> my_proc = Proc.new { ... } # big_object not referenced in my_proc
> big_object = nil
> my_proc.call
>
> Will the memory allocated to big_object remain in use because it was
> alive when my_proc was created despite it not being referenced or used
> in my_proc?

Clearly not, because big_object has been assigned nil. That is:
big_object may exist in the environment of my_proc, but it's the *same*
big_object variable that you assigned to outside, so once it holds nil
nothing in the Proc's environment keeps the original object alive.
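A quick way to see this (just a sketch; Binding#local_variable_get needs
Ruby 2.1+, on older Rubies you could eval the variable name against the
binding instead):

big_object = "x" * 10_000_000       # stand-in for BigObjectNeedsLotsOfRAM
my_proc = Proc.new { 1 + 1 }        # big_object not referenced in the block

# The Proc carries the surrounding binding, so the local is still visible:
my_proc.binding.local_variable_get(:big_object)  # => the 10 MB string

big_object = nil
# Same variable, seen through the same binding -- the string is now
# unreachable and can be collected:
my_proc.binding.local_variable_get(:big_object)  # => nil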

Now, if you had not assigned to big_object, but just let it drop out of
scope, then I wouldn't be so sure. However, you said explicitly that
you're only interested in what happens at the top level, and you can't
make things drop out of scope at that level.
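For the drop-out-of-scope case, a minimal sketch (the make_proc name is
just for illustration, and again Binding#local_variable_get needs a newer
Ruby): on MRI a Proc keeps the whole binding it was created in alive, so
an unreferenced local from the enclosing method is still reachable through
it for as long as the Proc lives:

def make_proc
  big_object = "x" * 10_000_000     # stand-in for the large object
  Proc.new { 1 + 1 }                # big_object not referenced in the block
end

p = make_proc
# big_object has dropped out of scope, but the Proc still holds the binding
# it was created in, so on MRI the string cannot be garbage collected:
p.binding.local_variable_get(:big_object)  # => the 10 MB string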