I have a requirement to fetch a data feed from our central bank and set
a variety of currency exchange rates from that feed. My question is: how
does one approach behaviour-driven development with autonomous,
automated processing? What I started with is this:
Feature: Automatically Retrieve and Store Foreign Currency Exchange Rates
In order to accurately set foreign currency exchange rates daily
The automated system
Should automatically retrieve and store central bank exchange rates
To Reduce Costs and Protect Revenue
Scenario: Retrieve Exchange Rates from the Bank of Canada RSS feed
Given an RSS feed
“http://www.bankofcanada.ca/rss/fx/noon/fx-noon-all.xml”
When I access the RSS feed
Then I should see “Bank of Canada: Noon Foreign Exchange Rates”
And I should see today’s date in “yyyy-mm-dd” format
Now, while I have a pretty good idea how I am going to accomplish this,
via cron and a standalone Ruby script, I am sort of perplexed about how
I should construct the feature step definitions and how I would test
this. I figure I just have to take as given that cron works because
that is not my code. But how best to test that the rest works?
The automated script will contain little more than a list of the
required external libraries and a call to a method in a class I have
named ForexCASource; it will loop through the returned array of hashes,
map each one to the corresponding CurrencyExchangeRate attributes of a
new row, call #save!, and repeat until finished.
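As a sketch of what I mean, using only the standard library (the
"<pair> <rate>" title format below is an assumption, not the actual
feed layout; ForexCASource and CurrencyExchangeRate are the names from
above):

```ruby
require "rexml/document"

# Sketch of the parsing half of the script. The title format is an
# assumption; adapt the parsing to whatever the feed actually contains.
class ForexCASource
  def initialize(xml)
    @doc = REXML::Document.new(xml)
  end

  # Returns an array of hashes whose keys are meant to line up with
  # CurrencyExchangeRate attribute names.
  def rates
    @doc.elements.to_a("//item/title").map do |title|
      pair, rate = title.text.split
      { :currency_pair => pair, :rate => rate.to_f }
    end
  end
end

# The cron-invoked script would then be little more than:
#   ForexCASource.new(File.read(ARGV[0])).rates.each do |attrs|
#     CurrencyExchangeRate.new(attrs).save!
#   end
```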
I would probably approach it this way.
Make a local copy of the RSS feed you’re expecting to import
Write a feature which specifies the command you want to run. This
command would be the same command you used with cron. I would make it
take a path or URI as an argument.
Have the feature ensure that the right number of records were created
based on your local copy of the feed
Drop down to RSpec and re-use the local copy of the feed to ensure
that all of the little details (if you have little details) are
properly imported
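The feature in the second and third steps above might be sketched like
this (the command name, fixture path, and record count are all
placeholders):

```gherkin
Feature: Import exchange rates from a feed
  Scenario: Import from a local copy of the feed
    # All names and counts here are placeholders
    Given a local copy of the feed at "features/fixtures/fx-noon-all.xml"
    When I run "import_rates features/fixtures/fx-noon-all.xml"
    Then 52 currency exchange rates should be stored
```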
When that is done I would probably write a feature that actually hits
the Bank of Canada's site, and I would set this up to run on
continuous integration only, maybe using Cucumber tags or a directory
structure and a new rake task to help isolate the CI-only features from
the rest.
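One way to wire up the tag approach, as a Rakefile sketch (the @ci tag
name is my invention, and `~@ci` is the tag-exclusion syntax Cucumber
uses):

```ruby
# Rakefile fragment (sketch). Assumes live-network scenarios carry a
# @ci tag in the .feature files.
require "cucumber/rake/task"

Cucumber::Rake::Task.new(:features) do |t|
  t.cucumber_opts = "--tags ~@ci"   # everyday runs skip the live feed
end

Cucumber::Rake::Task.new(:ci) do |t|
  t.cucumber_opts = "--tags @ci"    # CI actually hits the Bank of Canada site
end
```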
Now when you run features locally you can be very confident everything
works, and if CI fails one day because the Bank of Canada changed
their format, you know you need to pull down a new copy of the feed
and update your local features and specs.
That general approach has worked well in the past (and I say past
because before Cucumber existed I still wrote features/scenarios with
StoryRunner, and before that my team had tooled its own ad hoc story
runner).
+1 I have done this in the past and it worked well. If you haven't
already, check out the fakeweb library for stubbing out calls to
Net::HTTP.
I would probably approach it this way.
Make a local copy of the RSS feed you’re expecting to import
Yup, done that. Mainly because I now have fixed data to test against but
in the exact format that the feed supplies.
Write a feature which specifies the command you want to run. This
command would be the same command you used with cron. I would make it
take a path or URI as an argument.
Ahh. I had not thought of that.
Have the feature ensure that the right number of records were created
based on your local copy of the feed
K.
Drop down to RSpec and re-use the local copy of the feed to ensure
that all of the little details (if you have little details) are
properly imported
When that is done …
Will do.
That general approach has worked well in the past (and I say past
because before Cucumber existed I still wrote features/scenarios with
StoryRunner, and before that my team had tooled its own ad hoc story
runner).
HTH
It certainly does. Thanks.