I'm stuck again. I'm in my 5th month of Ruby on Rails. I'm uploading a .csv file using Paperclip and FasterCSV, then processing the file after the upload is complete.
Small files work fine, but when I tested a larger file I found that one cell had 13,000 characters. Needless to say, ORA-01704 ("string literal too long") was thrown.
I process the file line by line.
- controller -

@import = Import.find(params[:id])
lines = parse_csv_file(@import.csv.path)
lines.shift # comment this line out if your CSV file doesn't contain a header row
if lines.size > 0
  @import.processed = lines.size
  lines.each do |line|
    params = Hash.new
    params[:irb] = Hash.new
    # column order follows the CSV layout
    params[:irb]["irb_number"]      = line[0]
    params[:irb]["pi_full_name"]    = line[1]
    params[:irb]["cr_quest_split"]  = line[2]
    params[:irb]["cr_and_ct_split"] = line[3]
    params[:irb]["old_master_list"] = line[4]
    params[:irb]["title"]           = line[5]
    params[:irb]["status_of_irb"]   = line[6]
    params[:irb]["expiration_date"] = line[7]
    params[:irb]["review_level"]    = line[8]
    params[:irb]["category"]        = line[9]
    irb = Irb.new(params[:irb])
    irb.save
  end
  flash[:notice] = "CSV data processing was successful."
  redirect_to :action => "show", :id => @import.id
else
  flash[:error] = "CSV data processing failed."
  render :action => "show", :id => @import.id
end

- helper -

def parse_csv_file(path_to_csv)
  lines = []
  FasterCSV.foreach(path_to_csv) do |row|
    lines << row
  end
  lines
end
The "title" column is where the huge cell is.
I've done some research and I know Oracle's CLOB datatype can hold data that size, but a string literal can only carry 4,000 characters at a time, which is why the insert blows up.
I’m just wondering how to do it?
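The direction I've been considering is a chunked write: split the long value into pieces of at most 4,000 characters, then append each piece to the CLOB column in turn (e.g. via DBMS_LOB.WRITEAPPEND, or a repeated `UPDATE ... SET title = title || :chunk` with a bind variable). A minimal sketch of the chunking part in plain Ruby (CHUNK_SIZE and clob_chunks are names I made up, not from any library):

```ruby
# Split a long string into pieces small enough for Oracle's
# 4,000-character string-literal limit.
CHUNK_SIZE = 4_000

def clob_chunks(text, size = CHUNK_SIZE)
  chunks = []
  offset = 0
  while offset < text.length
    chunks << text[offset, size] # at most `size` characters per piece
    offset += size
  end
  chunks
end

# A 13,000-character title yields 4 pieces: 4,000 + 4,000 + 4,000 + 1,000.
```

Each piece would then be appended with its own statement using a bind variable so no single literal exceeds the limit. (It may also be that an Oracle adapter which binds CLOB columns properly would let `irb.save` handle this on its own; I'm not sure.)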
Thank you for any help with this.