Migration: upload a large set of data via a separate SQL file

Hi all,

I need to upload a large set of data (the CSV file is about 70 MB) into
some lookup tables. Using the standard migration API would be tedious. I
know that there’s a way to export data into a YAML file and populate
the tables from that file. However, I think this is still heavy.

Is there a way to load the data from a separate SQL file in a migration?

Thanks

You can use the approach described here for loading a straight SQL
file from migrations:
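(The original link isn't preserved in this archive; the general idea is simply
to read the SQL file from disk and hand its statements to the connection.
A minimal sketch, assuming a lookup_data.sql file next to the migration and
one statement per line; the path and the split on semicolons are my own
assumptions, not the linked article's code:)

  # Sketch only: execute every statement from a plain SQL file inside
  # a migration. File location and split on trailing semicolons are
  # assumptions for illustration.
  class LoadLookupData < ActiveRecord::Migration
    def self.up
      sql_path = File.join(File.dirname(__FILE__), 'lookup_data.sql')
      File.read(sql_path).split(/;\s*$/).each do |statement|
        execute(statement) unless statement.strip.empty?
      end
    end

    def self.down
      # Remove the loaded rows here if you want the migration reversible.
    end
  end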

Val

On May 11, 2007, at 7:04 PM, Chuong Huynh wrote:

Thanks
This is directly from one of my migrations:

  # Get all that research data loaded!
  def self.up
    say_with_time("Create Tables and Indexes...") { create_tables_and_indexes }
    say "Reset all models"; reset_all_models
    say_with_time("Import all data...") { import_data }
  end

The import_data method used FasterCSV to load a 1.6 MB CSV file into
three tables with ActiveRecord models. The thing to remember is that
you can do anything in your migrations. If the data were already
in an SQL file, use the Revolution Health approach. If your data is
in CSV, use FasterCSV. The migration API itself only deals directly
with schema changes.
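
(For illustration, a rough sketch of what a FasterCSV-based import_data
can look like; the Country model, the column names, and the
db/data/countries.csv path are assumptions, not Rob's actual code:)

  # Illustration only: load a CSV with a header row into an ActiveRecord
  # model. Model name, columns, and file path are made-up examples.
  require 'fastercsv'

  def self.import_data
    path = File.join(RAILS_ROOT, 'db', 'data', 'countries.csv')
    FasterCSV.foreach(path, :headers => true) do |row|
      Country.create!(:code => row['code'], :name => row['name'])
    end
  end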

Do what makes sense in your situation.

-Rob

Rob B. http://agileconsultingllc.com
[email protected]