DataMapper/CSV sync problem

Hi,
I'm trying to optimize synchronizing a DataMapper store against a daily
CSV file. I want to destroy every record that has been deleted from the
CSV file I'm reading. Currently I'm just using a boolean field called
touched: I set it to false on all records before reading the file, set it
to true when I see the record in the file, then destroy all records where
touched is still false when I'm done.

Needless to say, this doesn't scale well with 300k lines of CSV. Is there
an easier way that I'm missing?
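
For reference, the flag-based version I have now looks roughly like this
(model, field, and file names are just placeholders):

require 'csv'
require 'dm-core'

class Record
  include DataMapper::Resource

  property :id,      Serial
  property :ext_id,  String,  :required => true   # key coming from the CSV
  property :touched, Boolean, :default => false
end

def sync(csv_path)
  # 1. Clear the flag on every record.
  Record.all.update(:touched => false)

  # 2. Mark everything that is still present in today's CSV.
  CSV.foreach(csv_path, :headers => true) do |row|
    record = Record.first(:ext_id => row['id']) || Record.new(:ext_id => row['id'])
    record.touched = true
    record.save
  end

  # 3. Anything left untouched was removed from the CSV, so destroy it.
  Record.all(:touched => false).destroy
end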

Hi,

I solved this more or less in the same way, but I do not have to set a
flag in advance.

require 'dm-core'

class ImportRun
  include DataMapper::Resource

  property :id, Serial

  has n, :imported_items

  def import
    each_csv_attributes do |attributes|
      attributes[:import_run] = self
      # Update the existing record if we have seen this id before,
      # otherwise create a new one.
      item = ImportedItem.get(attributes.fetch('id'))
      item ||= ImportedItem.new
      item.attributes = attributes
      item.save
    end
    # Everything still pointing at an older run is no longer in the CSV.
    ImportedItem.all(:import_run.not => self).destroy
  end
end

class ImportedItem
  include DataMapper::Resource

  property :id, Serial

  belongs_to :import_run
end
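
The each_csv_attributes helper isn't shown above; it's just a small method
that yields one attribute hash per CSV row. A minimal sketch, assuming
Ruby's standard CSV library and a headered file with an 'id' column (the
file name is only an example):

require 'csv'

class ImportRun
  # Yields one hash per row; the CSV headers become the hash keys,
  # so the file is expected to have an 'id' column.
  def each_csv_attributes
    CSV.foreach('daily_items.csv', :headers => true) do |row|
      yield row.to_hash
    end
  end
end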

This way I can also save some metrics in the ImportRun (see the sketch
below for what I mean). Hopefully this helps you.
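
For example, a few extra properties on ImportRun can record when the run
happened and how many rows it processed (the names here are just an
illustration):

class ImportRun
  # Hypothetical metric columns -- track whatever you care about.
  property :started_at,     DateTime
  property :finished_at,    DateTime
  property :imported_count, Integer, :default => 0
end

import can then set started_at before the loop, bump imported_count per
row, and set finished_at at the end.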

Regards,

Markus

I think I’m going to use this. Thanks!