I have a site on a shared host at TextDrive. I have an import process
that imports a CSV file into the database using a Ruby script executed
via script/runner. TextDrive enforces a memory limit which I am hitting,
and the process gets killed; I get through about 400 records before it
dies. I guess each time I instantiate a new object for a record I am
using more memory, but I thought I was reusing the same allocated
memory. Anyway, I'm a little new to Ruby and Rails, so if you can tell
me where I'm going wrong that would be much appreciated. I have included
the code below. Thanks!
require 'csv' # script/runner does not necessarily load this for us

csv_file = CSV.open(CSV_PATH, "r")
csv_file.each do |pattern_temp|
  # Skip blank rows (no pattern code in the first column).
  unless pattern_temp[0].nil?
    pattern_record = {
      :code             => pattern_temp[0].delete("\""),
      :designer         => pattern_temp[1],
      :name             => pattern_temp[2],
      :designs          => pattern_temp[3],
      :price            => pattern_temp[4].delete("$"),
      :set              => pattern_temp[5].delete("\"\r\n"),
      :set_price        => pattern_temp[6].delete("$"),
      :themes           => pattern_temp[7],
      :bundle_with_code => pattern_temp[8],
      :matrix           => pattern_temp[9],
      :description      => pattern_temp[10],
      :active           => pattern_temp[11],
      :panel            => pattern_temp[12]
    }
    puts "Processing #{pattern_record[:code]}..."
    # Update the existing pattern if there is one, otherwise build a new one.
    pattern = Pattern.find_by_pattern_code(pattern_record[:code]) || Pattern.new
    if pattern_record[:active] == "Yes"
      add_pattern(pattern, pattern_record) # helper defined elsewhere in the script
    elsif pattern_record[:active] == "No" && !pattern.new_record?
      puts "Removing #{pattern_record[:code]}..."
      pattern.destroy
    end
  end
end
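
In case it is relevant, here is a minimal sketch of the streaming
behaviour I thought I was getting. It is only an illustration, not what
I am actually running: CSV.foreach yields one parsed row at a time, so
the whole file is never held in memory, and the periodic GC.start (the
ROWS_PER_GC interval is an arbitrary number I made up) is just there to
test whether the growth is uncollected garbage rather than retained
objects.

require 'csv'

ROWS_PER_GC = 100 # arbitrary interval, purely illustrative

row_count = 0
CSV.foreach(CSV_PATH) do |row|
  next if row[0].nil? # skip blank rows, as in the real script

  # ... build pattern_record and create/update/destroy Pattern as above ...

  row_count += 1
  # Ask Ruby to run a garbage-collection pass every ROWS_PER_GC rows;
  # if memory still climbs past this, something is holding references.
  GC.start if row_count % ROWS_PER_GC == 0
end

If memory stays flat with a version like this, that would point at
something retaining references in my original loop rather than at
Ruby's allocator.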