I don't know if it's of help, but I had a similar situation, where
I had to transfer data in CSV (actually newline- and tab-delimited
text) into a database model. The approach I took, which has
served me well, was to create a model just for the imported data,
with a method for each field. The structure of the data was such
that it was best for me to use split functions rather than try to
parse the data.
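To make that concrete, here is a rough sketch of "a method for each field" done with plain splits (the class name, field positions and tab separator are all my own assumptions for illustration):

```ruby
# Hypothetical wrapper for one tab-delimited line; each field gets
# its own accessor method rather than parsing the line up front.
class ImportedRow
  def initialize(raw_line)
    @fields = raw_line.split("\t")
  end

  # One method per field; the positions here are invented.
  def customer_name; @fields[0]; end
  def order_ref;     @fields[1]; end
  def quantity;      @fields[2].to_i; end
end
```

Because each field sits behind a method, a change in the column order only touches this one class, not the rest of the code.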
Now I can take a row from the incoming file and call new on the CSV
model. This splits the data and holds it in a partial form, e.g. I
initially have to split apart multiple records that are separated by
\n\n etc. I can then decide to hold the data partially, in whatever
form best suits the structure I am dealing with, and pull out the
fields as I need them.
The data contains Customer, Order and LineItem information. So
internally in the imported data model, I hold the customer fields as
an array, the Order fields as an array, and the LineItems as an array
of class LineItem (a class nested within the imported order model).
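Sketched out, that structure could look something like this (all the names, field positions and separators are guesses on my part, not the real format):

```ruby
# Hypothetical import model: one instance per raw record, holding the
# customer and order fields as arrays and the line items as objects.
class ImportedOrder
  # Nested class for one line-item row within the record.
  class LineItem
    attr_reader :sku, :quantity, :price

    def initialize(raw)
      @sku, @quantity, @price = raw.split("\t")
    end
  end

  attr_reader :customer, :order, :line_items

  def initialize(raw_record)
    # First section is the customer, second the order, the rest line items.
    customer_part, order_part, *item_parts = raw_record.split("\n")
    @customer   = customer_part.split("\t")
    @order      = order_part.split("\t")
    @line_items = item_parts.map { |part| LineItem.new(part) }
  end
end
```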
I have also included in the model the methods order_fields,
customer_fields and line_item_fields. These return a hash of the
fields which can be used to create a new instance of the
corresponding database model.
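One simple way to build such hashes is to zip a list of column names against the split values; the column names below are invented purely for illustration:

```ruby
# Pair the expected column names with the split values to build an
# attribute hash suitable for e.g. Order.create(order_fields(values)).
ORDER_COLUMNS = [:reference, :ordered_on, :total].freeze

def order_fields(values)
  ORDER_COLUMNS.zip(values).to_h
end
```

So `order_fields(["PO123", "2024-01-01", "19.99"])` gives `{reference: "PO123", ordered_on: "2024-01-01", total: "19.99"}`, ready to hand to the database model.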
By using a model for the CSV data, it is easy to test and
maintain, and easier to adapt if the CSV format changes. It also means
the controller part of the code is kept clean, something like this:
@data_records.each do |line|
  # create the order and line_items
  order = Order.create(line.order_fields)
  line.line_items.each { |item| order.line_items.create(item.fields) }
end
I have stripped out some extra stuff that tests for different types of
order, counts updated records, provides logging, etc. I have also
changed the naming to make it more general, so I hope the above is
correct. Anyway, the point I am trying to make is that if you
implement the right methods at the model level, it should be fairly
simple to create the database records from that model.