I have been contracted to work on an intranet using Rails to generate
content. Part of this work is to model and generate queries on the
legacy SQL Server 2000 database that contains about 80 tables. The
database itself is in 3NF form and follows an in-house naming
convention that I cannot change.
My question is this: would I need to model all the tables within this
database to be able to create the queries that I need under Rails? Or
would it be better to create stored procedures (not the Rails way)
within the database and model one or more views instead?
Thanks for the link - I used the modules there, but I still have an
issue registering the primary keys and the table names. For some
reason the set_table_name and set_primary_key calls are being ignored on
the tables that I am using to test, and Magic Models didn’t seem to pick
up the primary keys or the foreign keys - looks like I will need to dig
harder into the documentation.
Thanks for the response - looks like I am going to have to bite the
bullet and get on with it. From a Rails perspective, these databases are
a mess; in fact, from a SQL Server perspective they are a mess, and the
company knows it - what they won’t do is commit any resources to
cleaning them up.
From some initial testing, the framework seems to be ignoring the
set_primary_key setting that I have provided for the test table, but
maybe I have set the query up wrong or something - will go back and
check.
I would just create models for each table (script/generate model
table_name), then get really familiar with ‘set_table_name’ and
‘set_primary_key’. If they are using compound primary keys, then
you’ll need Dr Nic’s ‘composite_primary_keys’ gem.
I’d also recommend creating the associations manually and not using
the ‘magic models’ plugin.
I’ve done it with our in-house SQL Server DBs and it works fine…