Hi, I'm making a simple app which takes a user's search term and scrapes several sites' RSS feeds for relevant results. The scrape can take up to 10 seconds. I want to make this a background process so I can update the user with the scrape's status ("currently scraping site x") and present results as they are scraped. I'm looking at BackgrounDRb and have read a guide in Advanced Rails Recipes, but that's for an older version of BackgrounDRb and the API has changed. So I'm a little confused, and I'm also new to processes and threads. If my worker is responsible for scraping each site and providing updates, do I have one worker which is shared by all the users currently online? I guess I could then use threads to process multiple search requests when the site is busy. Or do I create a new worker for each user? Would that also allow for concurrent processing?
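As an aside, the "one worker shared by all users" idea the question describes can be sketched in plain Ruby with a thread pulling jobs off a queue. This is only an illustration of the pattern, not BackgrounDRb's API; all class and method names here (`ScrapeWorker`, `enqueue`, the fake scrape) are made up, and the real scrape would fetch and parse each RSS feed:

```ruby
# Illustrative sketch, not BackgrounDRb API: a single worker thread serves
# every user by pulling jobs off a shared thread-safe Queue. Each job
# carries one user's search term plus its own status and results.
class ScrapeWorker
  Job = Struct.new(:term, :status, :results)

  def initialize(sites)
    @sites = sites
    @queue = Queue.new
    @thread = Thread.new { loop { process(@queue.pop) } }
  end

  # Called from the web layer: enqueue the search and return immediately,
  # so the request doesn't block for the ~10 second scrape.
  def enqueue(term)
    job = Job.new(term, "queued", [])
    @queue << job
    job
  end

  private

  def process(job)
    @sites.each do |site|
      job.status = "currently scraping #{site}"
      # A real implementation would fetch and parse the site's RSS feed here.
      job.results << "#{site}: results for '#{job.term}'"
    end
    job.status = "done"
  end
end
```

The app can poll `job.status` to show progress. One worker is enough until scrapes start queuing up behind each other; for concurrency you can start several worker threads reading the same `Queue`, rather than one worker per user.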
on 2009-05-13 18:01
on 2009-05-13 18:09
I would use Workling + Starling, or the run_later plugin. You can find all of them on GitHub.
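The core idea behind the run_later plugin is simply "run this block outside the request cycle". A toy stand-in (this is not the plugin's actual implementation, which integrates with Rails' dispatcher; it's just the concept reduced to a thread) looks like this:

```ruby
# Toy stand-in for the run_later plugin's core idea: execute the block
# on another thread so the web request can return to the user immediately.
def run_later(&block)
  Thread.new(&block)
end

# Usage, e.g. from a controller action (Scraper is a hypothetical class):
#   run_later { Scraper.new(params[:term]).scrape_all }
```

The real plugin is a better choice in production because a bare `Thread.new` gives you no error handling and dies with the server process, but the mental model is the same.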
on 2009-05-13 18:40
Can they perform tasks dynamically, i.e. a task initiated by a user's mouse click? I thought I saw a Railscast that said they were only good for prescheduled tasks.
on 2009-05-13 19:00
On Wed, May 13, 2009 at 4:40 PM, Adam A. <firstname.lastname@example.org> wrote: > can they perform tasks dynamically, i.e. a task initiated by a user's mouse click? I've used them to push a process into the background for a long-running task, for example delivering a newsletter to thousands of users, which can take some time.
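The newsletter case mentioned here has the same shape as the original scraping question: iterate a large collection in the background and report progress as you go. A plain-Ruby sketch of that shape, where `deliver` is a stub standing in for the real mailer call (e.g. ActionMailer) and the progress callback plays the role of the "currently scraping site x" updates:

```ruby
# Stub mailer: a real app would call ActionMailer (or similar) here.
def deliver(email)
  # no-op in this sketch
end

# Long-running background job that reports progress via a callback after
# each batch, the same pattern as reporting scrape status per site.
def deliver_newsletter(recipients, &progress)
  Thread.new do
    recipients.each_slice(100).with_index do |batch, i|
      batch.each { |email| deliver(email) }
      # Report how many have been sent so far, capped at the total.
      progress.call([(i + 1) * 100, recipients.size].min) if progress
    end
  end
end
```

The controller would stash each progress value somewhere pollable (the database, memcached, etc.) so the browser can display it; whether the batch step is "100 emails" or "one RSS feed" doesn't change the structure.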