Forum: Ruby on Rails About using WWW::Mechanize plugin

Announcement (2017-05-07): The forum is now read-only, since I unfortunately no longer have the time to support and maintain it. Please see and for other Rails- and Ruby-related community platforms.
parkurm (Guest)
on 2008-11-15 21:42
(Received via mailing list)
Hi all,

    Does anyone have experience using Mechanize to scrape webpages? I
encountered a problem using Mechanize to parse an enormous number of
pages. Each time I use the agent.get function to fetch a page, the
agent keeps the page in its log, so over time the agent object grows
bigger and bigger and eventually consumes all my memory (4 GB). Is
there any solution to this problem?
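The growth described above comes from the agent retaining every fetched page in its history. A minimal pure-Ruby sketch of that behavior, using an illustrative `BoundedHistory` class (not Mechanize's actual class) to show why capping the history bounds memory:

```ruby
# A stand-in for Mechanize's page history, illustrating why memory grows:
# every fetched page is retained unless the history size is capped.
# (BoundedHistory is an illustrative name, not part of the Mechanize API.)
class BoundedHistory
  def initialize(max_size = nil)
    @max_size = max_size   # nil means unbounded, mimicking the default
    @pages = []
  end

  def push(page)
    @pages << page
    # Discard the oldest pages once the cap is exceeded
    @pages.shift while @max_size && @pages.size > @max_size
    page
  end

  def size
    @pages.size
  end
end

unbounded = BoundedHistory.new
bounded   = BoundedHistory.new(1)

1000.times do |i|
  page = "page-#{i}" * 100   # stands in for a parsed HTML page
  unbounded.push(page)
  bounded.push(page)
end

puts unbounded.size  # retains all 1000 pages
puts bounded.size    # retains only the most recent page
```

In WWW::Mechanize itself, the equivalent knob is, as far as I recall, `agent.max_history = 1`, which tells the agent to keep only the most recent page instead of its full browsing history.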