Avoiding duplicate records when importing a CSV with FasterCSV

Hello all,
I’m not sure if this is the best place to post but…

Here’s what I’m doing.
I'm reading a CSV file and saving its rows to a table. A user will be doing
this periodically. The next time we get a CSV, how can I avoid saving
duplicate records to the table?
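
One thing I was wondering: would a uniqueness validation on the model be
enough? Something like the sketch below, where protocol_number is only a
guess at whichever column uniquely identifies a row in my CSV:

  # app/models/irb.rb
  class Irb < ActiveRecord::Base
    # "protocol_number" is a placeholder for the column that should be unique
    validates_uniqueness_of :protocol_number
  end

My worry is that, because I call create! inside a transaction, a single
duplicate row would raise ActiveRecord::RecordInvalid and roll back the whole
import, when what I really want is to skip just the rows that are already in
the table.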

Here’s my controller:

def import_irb_file
  # the uploaded CSV file from the form
  file = params[:irb][:file]
  rowcount = 0

  Irb.transaction do
    FasterCSV.parse(file,
                    :headers => true,
                    :header_converters => lambda { |h| h.tr(" ", "").delete("^a-zA-Z0-9") },
                    :converters => :all) do |row|
      Irb.create!(row.to_hash)
      rowcount += 1
    end
  end

  # if successful, display a notice and redirect to the index page
  flash[:notice] = "Successfully added #{rowcount} project(s)."
  redirect_to :action => :index

rescue => exception
  # if an exception is thrown, the transaction rolls back and we end up here
  file_name = params[:irb][:file].original_filename
  ext = file_name.split('.').last

  if ext != 'csv'
    error = "CSV file is required"
  else
    error = ERB::Util.h(exception.to_s) # get the error and HTML-escape it
  end

  flash[:error] = "Error adding projects to IRB table (#{error}). Please try again."
  redirect_to :controller => 'irbs', :action => 'new'
end
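
The other idea I had is to check for an existing record before calling
create!, so a re-import simply skips rows that are already in the table.
Roughly like this inside the parse block (header converters left out for
brevity, and protocol_number is again just a guess at the identifying column
and at the key the header converter would produce):

  FasterCSV.parse(file,
                  :headers => true,
                  :converters => :all) do |row|
    attrs = row.to_hash
    # skip rows that were already imported (column name is a guess)
    next if Irb.exists?(:protocol_number => attrs["protocol_number"])
    Irb.create!(attrs)
    rowcount += 1
  end

Is that a reasonable way to do it, or is there a more idiomatic approach?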

Thanks for helping.

JohnM