My first serious Ruby on Rails project will automatically display information on the ETFs and mutual funds with the lowest price/book and price/cash-flow ratios, along with other essentials for value investors: the expense ratio, the annual portfolio turnover rate, and the size (as a percentage) of the largest holding in the portfolio.

This project will be analogous to the stock screening system of Doppler Value Investing (http://www.dopplervalueinvesting.com/screen), which uses a combination of Python and Drupal. Every night, the Python script automatically scrapes data from the Smartmoney web site, calculates the Dopeler price/book ratio for each stock, saves the information in a *.csv file, and copies this *.csv file to the Drupal web site so that it's publicly accessible.

My Doppler Value Investing web site already works, so I'm not about to redo it; I'll be using Ruby on Rails for a brand-new project instead. Based on what I did for Doppler Value Investing, I'd be inclined to have a separate web site (in Ruby on Rails) and a Ruby script that writes its output to *.csv and *.html files and copies these output files to the Ruby on Rails web site. However, I get the feeling that it would be better to integrate the web site and the number-crunching script.

Given all this, if my web site dedicated to finding the most undervalued ETFs and mutual funds were your project instead of mine, how would you go about it?
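For what it's worth, the nightly number-crunching step could look something like the following sketch in plain Ruby. The `Fund` struct, the field names, and the sample values are all hypothetical placeholders, not SmartMoney's actual schema; the point is just ranking funds by the screener's key ratio:

```ruby
# Hypothetical sketch: rank scraped fund data by price/book ratio.
# Field names and the Fund struct are assumptions for illustration.
Fund = Struct.new(:ticker, :price_book, :price_cash_flow,
                  :expense_ratio, :turnover, :largest_holding_pct)

def most_undervalued(funds, limit: 10)
  # Sort ascending: the lowest price/book ratios are the most undervalued.
  funds.sort_by(&:price_book).first(limit)
end

# Made-up sample data for demonstration only.
funds = [
  Fund.new("VTV",  2.1,  9.8, 0.04, 10.0, 3.1),
  Fund.new("DEEP", 1.2,  5.4, 0.80, 55.0, 2.0),
  Fund.new("VOO",  4.0, 14.2, 0.03,  2.0, 7.0),
]

most_undervalued(funds, limit: 2).map(&:ticker)  # => ["DEEP", "VTV"]
```

The same ranking could of course be run on price/cash-flow instead, or on a blend of the two.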
on 2013-02-17 01:49
on 2013-02-17 03:54
On 02/16/2013 05:48 PM, Jason Hsu, Android developer wrote:
> Python script automatically scrapes data from the Smartmoney web site,
> it would be better to integrate the web site and the number-crunching
> script.
>
> Given all this, if my web site dedicated to finding the most
> undervalued ETFs and mutual funds were your project instead of mine,
> how would you go about it?

My inclination would be to have a separate task (a cron job, maybe) that scrapes the data and puts it into the database. I would then have the Rails web site just calculate any needed factors from the data in the database and sort and display the info. YMMV.
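A nightly crontab entry for that separate task might look something like this (the app path, rake task name, and log file are hypothetical placeholders):

```
# Run the nightly scrape/number-crunch at 2:00 AM.
# /path/to/rails_app and funds:refresh are assumptions, not real names.
0 2 * * * cd /path/to/rails_app && RAILS_ENV=production bundle exec rake funds:refresh >> log/cron.log 2>&1
```

If you'd rather manage the schedule from inside the app, the `whenever` gem generates crontab entries like this from a Ruby DSL.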
on 2013-02-17 07:15
On Saturday, February 16, 2013 8:52:54 PM UTC-6, Norm wrote:
> My inclination would be to have a separate task (cron job maybe) that
> scraped the data and put it into the database. I would then have the
> rails web site just calculate any needed factors from the data in the
> database and sort and display the info. YMMV

Thanks, Norm. Yes, I use a cron job to run the web-scraping/number-crunching Python scripts on the Doppler Value Investing web site, and I anticipate using a cron job to run a Ruby script every night to do the web scraping and number crunching on my ETF/mutual fund analysis site.

Thanks also for suggesting a database that the web-scraping/number-crunching script populates and the Rails web site reads. This will be my first experience working with a web site's database.
on 2013-02-17 16:35
I would do just what I've done for automated tasks in several other Rails apps I've written: make a rake task for it that can be triggered via a cron job. The rake task can simply call a model method to do whatever needs to be done. This keeps all code relevant to the app in the application (and, more specifically, all code relevant to that model actually in the model), lets you use ActiveRecord instead of interacting with the database directly via SQL, and gives you easy access to any other functionality you build into your app. It also makes it easier to let a user trigger a partial update via the web site, should that functionality ever be desired.
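The rake-task-delegating-to-a-model-method pattern can be sketched roughly as follows. The task and model names are assumptions; in a real Rails app this would live in `lib/tasks/`, the DSL would be available automatically, and the model would be an ActiveRecord class, but the sketch below is written to run standalone:

```ruby
# Hypothetical sketch of lib/tasks/funds.rake: a rake task that delegates
# the nightly refresh to a model method. All names are assumptions.
require "rake"

# In a Rails app the rake DSL is available automatically inside .rake
# files; here we pull it in explicitly so the sketch runs standalone.
extend Rake::DSL

namespace :funds do
  desc "Scrape fund data and recompute valuation ratios"
  task :refresh do            # in Rails: task refresh: :environment
    FundScreen.refresh_all    # the model method holds all the real logic
  end
end

# Stand-in for the model; in Rails this would be an ActiveRecord model
# that scrapes, computes price/book and price/cash-flow, and saves rows.
class FundScreen
  def self.refresh_all
    @last_run = Time.now
  end

  def self.last_run
    @last_run
  end
end
```

The cron job then only needs to run `rake funds:refresh`; the task itself stays a one-liner, and all the interesting code is testable as ordinary model code.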