Forum: Rails deployment · Handling Large Data Imports

Announcement (2017-05-07): This forum is now read-only, since I unfortunately do not have the time to support and maintain it any more. Other Rails- and Ruby-related community platforms are available.
James H. (Guest)
on 2007-06-06 22:16
(Received via mailing list)

Have any of you guys dealt with importing large amounts of data on
deploy?  I have two distinct situations I need to deal with and I was
hoping for some advice.

1.  I have a fairly large CSV file of postal codes that seems to
import very well with the MySQL import utility (mysqlimport) on
development machines but very poorly when used remotely with
Capistrano (i.e. during the initial migration).  How are you handling
this kind of data import on initial migration?

2.  I'm going to have to move over a rather large amount of data from
an older version of the site I'm updating (old site is written in ASP
using a MS SQL database -- it's a mess).  I know I can write scripts
to massage data, but something I'm unsure of is how to actually *move*
the data from the old servers to the new servers -- neither of which
I'll have physical access to.  Recommendations?
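For reference, the kind of mysqlimport invocation described in point 1 can be sketched as a small helper that builds the command string; the database, user, and file names below are placeholders, not values from the thread:

```ruby
# Sketch of a mysqlimport call for a CSV of postal codes.
# Credentials and paths are hypothetical placeholders.
def mysqlimport_command(database, csv_path, user: "deploy")
  # mysqlimport derives the table name from the file name
  # (postal_codes.csv -> postal_codes), so the file must be
  # named after the target table.
  "mysqlimport --local --fields-terminated-by=',' " \
    "-u #{user} -p #{database} #{csv_path}"
end

puts mysqlimport_command("app_production", "/tmp/postal_codes.csv")
```

Running this from a one-off rake task on the server, rather than inside the Capistrano deploy itself, keeps a slow import from stalling the deployment.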


James H
Jesse P. (Guest)
on 2007-06-06 22:17
(Received via mailing list)
Can you give some performance #s on the remote side?  Is that remote
database under load during the import?


Jesse P.
Blue Box Group, LLC

p. +1.800.613.4305 x801
e. removed_email_address@domain.invalid
senthil (Guest)
on 2007-06-07 10:43
(Received via mailing list)
Hi James,

Different situations require different techniques.

If you are looking for a Ruby-based approach, use the FasterCSV library.
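A minimal sketch of that approach, parsing a CSV and building row inserts (FasterCSV was later merged into Ruby's standard library as CSV, which is used here; the file layout and table name are made-up examples):

```ruby
require "csv"  # in Ruby 1.8 this would be: require "fastercsv"

# Hypothetical two-column file: code,city
File.write("postal_codes.csv", "code,city\n90210,Beverly Hills\n10001,New York\n")

inserts = []
CSV.foreach("postal_codes.csv", headers: true) do |row|
  # Build one INSERT per row; a real import would batch these
  # (and escape values) for speed and safety.
  inserts << "INSERT INTO postal_codes (code, city) " \
             "VALUES ('#{row['code']}', '#{row['city']}');"
end

puts inserts.first
# → INSERT INTO postal_codes (code, city) VALUES ('90210', 'Beverly Hills');
```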

Or you can simply use mysqldump on the table (or the whole DB) from the
development database, upload the dump, and load it into the production server.

I have done it a few times myself with a million+ records.

These scripts do take a significant amount of time, and since we only do
this once in a while, I wouldn't suggest Capistrano, as the script may time out.
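The dump-upload-load route above can be sketched as a small script; host, database, and user names here are placeholders, and the commands are printed rather than executed for illustration:

```ruby
# Sketch of the mysqldump route: dump locally, copy to the server,
# load outside the deploy so a slow restore cannot time it out.
# All names below are hypothetical.
dump_file = "postal_codes.sql"

steps = [
  # 1. Dump the table from the development database.
  "mysqldump -u dev_user -p app_development postal_codes > #{dump_file}",
  # 2. Copy the dump file to the production server.
  "scp #{dump_file} deploy@production.example.com:/tmp/",
  # 3. Load it into production (run on the server itself).
  "mysql -u prod_user -p app_production < /tmp/#{dump_file}",
]

steps.each { |cmd| puts cmd }  # print, don't run
```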

This topic is locked and cannot be replied to.