About using the WWW::Mechanize library

Hi all,

Does anyone have experience using Mechanize to scrape webpages? I've
run into a problem using Mechanize to parse an enormous number of
pages: each time I call agent.get to fetch a page, the agent keeps a
log of the fetched page, so as time goes on the agent object grows
bigger and bigger and eventually consumes all of my memory (4 GB). Is
there any solution to this problem?
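
For context, here is a simplified sketch of the pattern (the URL list
is just a placeholder; I'm using the pre-1.0 WWW::Mechanize namespace
from the subject line, in newer versions of the gem the class is just
Mechanize):

  require 'rubygems'
  require 'mechanize'

  agent = WWW::Mechanize.new

  # Placeholder for the real list of pages to scrape.
  example_urls = (1..100_000).map { |i| "http://example.com/page/#{i}" }

  example_urls.each do |url|
    page = agent.get(url)
    # ... extract data from `page` here ...
    # Each fetched page is kept in agent.history, so the agent's
    # memory footprint grows with every iteration.
  end

  # I believe the history can be capped like this, but I haven't
  # verified that it stops the memory growth:
  # agent.history.max_size = 1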
