I have a record called a PMR. Each PMR has a number of Signatures.
Each Signature has a number of Lines.
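In ActiveRecord terms the structure is roughly this (simplified; the real models have more columns and code than shown here):

  class PMR < ActiveRecord::Base
    has_many :signatures
  end

  class Signature < ActiveRecord::Base
    belongs_to :pmr
    has_many :lines
  end

  class Line < ActiveRecord::Base
    belongs_to :signature
  end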
Currently I have a routine that is fed an entire PMR as a file; it
parses it into the proper structure and stores it.
The problem is that the PMR grows over time. I am receiving it from
another application, and the easiest way to get it is to just get the
whole thing.
I can re-parse the whole thing and create the internal Ruby
structures; that process is pretty fast. But then what? Should I
fetch the old records and compare them record by record, updating
only what changed? Or should I just delete the old and save the new?
If you’re using MySQL, look into the ON DUPLICATE KEY UPDATE syntax,
which lets you do a mass insert without clobbering existing rows,
updating them instead however you see fit. To use it with ActiveRecord
you will need to install ActiveRecord::Extensions by Zach D…
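Roughly, the idea looks like this when pushed through ActiveRecord as
raw SQL (the table and column names below are just placeholders, not
taken from your app):

  # MySQL only: insert the row, or update it in place when the key already exists.
  sql = <<-SQL
    INSERT INTO lines (id, signature_id, text)
    VALUES (42, 7, 'some line text')
    ON DUPLICATE KEY UPDATE signature_id = VALUES(signature_id),
                            text         = VALUES(text)
  SQL
  ActiveRecord::Base.connection.execute(sql)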
Thanks. I’m using PostgreSQL. Your suggestion got me searching,
though. I could push the “work” down into the database. But then I
thought about it longer, and it would only postpone the question:
delete or update.
PostgreSQL has Rules which one person used to implement ON DUPLICATE
KEY UPDATE.
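A sketch of what such a Rule might look like, run from a migration
(the lines table and its columns are just an example schema, not my
real one):

  class AddUpsertRuleToLines < ActiveRecord::Migration
    def self.up
      execute <<-SQL
        CREATE RULE lines_upsert AS
          ON INSERT TO lines
          WHERE EXISTS (SELECT 1 FROM lines WHERE id = NEW.id)
          DO INSTEAD
            UPDATE lines SET text = NEW.text WHERE id = NEW.id
      SQL
    end

    def self.down
      execute "DROP RULE lines_upsert ON lines"
    end
  end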
Well, only you can answer that question. I’ve used both approaches,
but the advantage of ON DUPLICATE KEY UPDATE is an order-of-magnitude
increase in speed for the update situation. If you can get away with
truncating the table and just re-inserting, that is generally the
easier thing to do. I would still look at ar-extensions just for the
mass INSERT, which offers a huge performance improvement when you are
importing thousands of rows.
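For the truncate-and-reinsert path, the two together look roughly like
this (the Line model, its columns, and parsed_lines are stand-ins for
whatever your parser produces; import is the bulk-insert class method
ar-extensions adds):

  require 'ar-extensions'

  columns = [:signature_id, :position, :text]
  rows    = parsed_lines.map { |l| [l.signature_id, l.position, l.text] }

  Line.transaction do
    Line.delete_all              # throw away the old copy
    Line.import(columns, rows)   # one multi-row INSERT for the fresh copy
  end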