Unique RoR Design Issue

Hi Folks,

I am faced with a unique RoR design issue and thought of getting some
advice on how to go about this requirement.

I need to create a web application to:

  1. Login / logout / manage users
  2. Register queries with boundaries for reporting.
  3. Workflow engine to ensure BP
  4. Scheduler to execute jobs on the legacy DB
  5. Get transactions from the legacy DB and store them in the application DB.
  6. Once approved, send email reports with the transactions.
  7. Store transaction data from the legacy DB in the application DB as
     backup storage.
  8. Application users will be able to search the registered queries and
     look at them.
  9. Transaction data does not need to be shown in the web GUI, though.

I am stuck on these two steps:

  1. Read from the legacy table(s).
  2. Store transactions from the legacy table(s) into an application table
     for later use.

What is the best way to read from legacy Oracle tables which are
partitioned and massive? What issues should I expect, given that the
legacy table structure is unique and does not follow RoR conventions?

Can I use a Ruby gem to fire a plain SQL SELECT against the legacy DB and
then use a RoR migration-style technique to store the results in an
application table? Is that even possible? Or should I create views in my
application DB first and just do plain reads from them?

I am not sure how to go about reading from the legacy DB first and then
storing the search results in the application's own data.
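
For concreteness, something along these lines is what I have in mind. This
is just a rough sketch: the adapter, connection settings, table, column and
model names (LegacyTransaction, AppTransaction) are all made up.

# A read-only model pointed at the legacy Oracle schema over its own
# connection (here via the oracle_enhanced adapter).
class LegacyTransaction < ActiveRecord::Base
  establish_connection(
    :adapter  => 'oracle_enhanced',
    :host     => 'legacy-db.example.com',
    :username => 'readonly_user',
    :password => 'secret',
    :database => 'LEGACYSID'
  )
  set_table_name  'TXN_MASTER'   # legacy name, nothing Rails-like about it
  set_primary_key 'TXN_ID'       # no conventional "id" column
end

# Fire a plain SELECT against the legacy DB...
rows = LegacyTransaction.connection.select_all(
  "SELECT txn_id, txn_date, amount FROM txn_master WHERE txn_date >= SYSDATE - 7"
)

# ...and store each result row in the application's own table.
rows.each do |row|
  AppTransaction.create!(
    :legacy_txn_id => row['txn_id'],
    :txn_date      => row['txn_date'],
    :amount        => row['amount']
  )
end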

Any inputs will be appreciated.
/BlueBull

Hi Again,

Just wanted to add:

The legacy tables are huge and are used by umpteen other apps. Hence, I
want to search once and store the data in an application table for
processing as and when required.
Is it viable to create models in my app and copy the search data, row by
row, into the app table? Or is it better to create dynamic views in the
Oracle DB every time a query is initiated and use that data for reporting
purposes?
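
By "dynamic views" I mean something roughly like this, issued from the app
against the legacy schema. Sketch only: the view name, columns and criteria
are invented, LegacyTransaction is the legacy-connected model from my first
mail, and it assumes the legacy schema user is even allowed to create views.

# Hypothetical: create one Oracle view per registered query, then report
# off it instead of copying rows locally.
view_sql = <<-SQL
  CREATE OR REPLACE VIEW app_query_42 AS
    SELECT txn_id, txn_date, amount
      FROM txn_master
     WHERE region = 'EMEA'
       AND txn_date >= DATE '2010-01-01'
SQL

LegacyTransaction.connection.execute(view_sql)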

Cheers
/BlueBull

On 16 April 2010, at 13:42, Blue Bull wrote:

The legacy tables are huge and are used by umpteen other apps. Hence, I
want to search once and store the data in an application table for
processing as and when required.
Is it viable to create models in my app and copy the search data, row by
row, into the app table? Or is it better to create dynamic views in the
Oracle DB every time a query is initiated and use that data for reporting
purposes?

When I had the same issue (except I also had an MS SQL Server in the mix
as well as my RoR app's PostgreSQL database) I did a mix of things:

  1. Views on the Oracle server for data that could stay remote
  2. A local copy of the static Oracle data, done through a replication
     daemon that I wrote
  3. Queries against the local data, for speed
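
Very roughly, the shape of it was something like this. A simplified sketch
only, not my actual code: the class, table and column names are invented,
"legacy_oracle" stands for an extra entry in database.yml, and the real
daemon did batching, change detection and error handling rather than a
naive full refresh.

# Abstract base class whose connection points at the remote Oracle server.
class LegacyBase < ActiveRecord::Base
  self.abstract_class = true
  establish_connection :legacy_oracle
end

# 1. Mapped onto a view created on the Oracle side.
class RemoteCustomer < LegacyBase
  set_table_name  'V_APP_CUSTOMERS'
  set_primary_key 'CUSTOMER_ID'
end

# 2. Local copy of the static data, living in the app's own database.
class Customer < ActiveRecord::Base
end

# The replication daemon boiled down to periodically refreshing the local
# copies of the static tables.
def refresh_customers
  Customer.transaction do
    Customer.delete_all
    RemoteCustomer.all.each do |remote|
      Customer.create!(
        :legacy_id => remote.customer_id,
        :name      => remote.name,
        :region    => remote.region
      )
    end
  end
end

# 3. Searches and reports then only hit the local Customer table.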

Mikel L.

http://rubyx.com/
http://lindsaar.net/

Hi Mikel,

Sounds like a viable approach which could be applied to my case too.

  1. There is no need for me to store remote txn data on the legacy Oracle
    schema, as that would be redundant and unnecessary duplication. I just
    want to be able to read 10 columns from the legacy table and store
    them in an application table to be used at a later point in time.

So, using the Ruby OCI library to fetch data from the legacy tables with
the search criteria, I can write the results into files initially for
reports and also store them in the secondary DB.
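
Concretely, I was thinking of something like this for the fetch-and-file
part. Rough sketch only: the connection string, query and file name are
placeholders, and in the real app the SQL would come from the registered
query rather than being hard-coded.

require 'oci8'   # ruby-oci8 gem
require 'csv'

conn = OCI8.new('readonly_user', 'secret', 'legacy-db.example.com/LEGACYSID')

sql = "SELECT txn_id, txn_date, amount FROM txn_master " \
      "WHERE txn_date >= SYSDATE - 7"

CSV.open('txn_report.csv', 'w') do |csv|
  csv << %w[txn_id txn_date amount]
  conn.exec(sql) do |row|   # yields each result row as an array
    csv << row
  end
end

conn.logoff
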
For storing into the secondary DB:

  1. How do I write that data into the local application table? (Rough
     sketch below, after this list.)
    a) Should I use migrations and my models, since I have the models?
    b) Or just use plain SQL to insert the rows into my tables?
    c) Or use something like your daemon? How and what did you do with
       your daemon? Does RoR provide an efficient technique for pulling
       data in from legacy tables?

  2. Totally agree.
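
To make (a) and (b) concrete, this is the kind of thing I mean. Sketch
only: the migration just sets up the local table's schema (it does not
move any data), the table, column and model names are made up, and 'row'
stands for one result hash fetched from the legacy DB.

# db/migrate/xxxxxxxx_create_local_transactions.rb
class CreateLocalTransactions < ActiveRecord::Migration
  def self.up
    create_table :local_transactions do |t|
      t.string  :legacy_txn_id
      t.date    :txn_date
      t.decimal :amount, :precision => 12, :scale => 2
      t.timestamps
    end
  end

  def self.down
    drop_table :local_transactions
  end
end

# (a) copy a fetched row in through the model:
LocalTransaction.create!(
  :legacy_txn_id => row['txn_id'],
  :txn_date      => row['txn_date'],
  :amount        => row['amount']
)

# (b) or insert it with plain SQL through the connection:
conn = ActiveRecord::Base.connection
conn.execute(
  "INSERT INTO local_transactions (legacy_txn_id, txn_date, amount) " \
  "VALUES (#{conn.quote(row['txn_id'])}, #{conn.quote(row['txn_date'])}, " \
  "#{conn.quote(row['amount'])})"
)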

As such, there is other application-related data which I will use locally
for faster processing. The only overhead is getting the data from the
legacy system ONCE and storing it for local processing.

Any suggestions?
/Sudhir
