I have a record called a PMR. Each PMR has a number of Signatures, and each Signature has a number of Lines. Currently I have a routine that is fed an entire PMR as a file, parses it into the proper structure, and stores it. The problem is that the PMR grows over time. I am receiving it from another application, and the easiest way to get it is to just fetch the whole thing. I can re-parse the whole thing and create the internal Ruby structures; that process is pretty fast. But then what? Should I fetch the old records and compare record by record, updating only what changed? Or should I delete the old and save the new? Any suggestions? Thanks.
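For reference, a minimal plain-Ruby sketch of the nesting described above; the field names and the text format (signature headers marked with "== ") are invented for illustration, not taken from the actual PMR feed:

```ruby
# A PMR holds Signatures; each Signature holds Lines (hypothetical fields).
Pmr       = Struct.new(:number, :signatures)
Signature = Struct.new(:name, :lines)

# Toy parser: assumes lines starting with "== " open a new signature and
# every other line belongs to the most recent signature.
def parse_pmr(number, text)
  pmr = Pmr.new(number, [])
  text.each_line do |line|
    line = line.chomp
    if line.start_with?("== ")
      pmr.signatures << Signature.new(line[3..-1], [])
    elsif pmr.signatures.last
      pmr.signatures.last.lines << line
    end
  end
  pmr
end
```

The delete-vs-update question is then about how to reconcile a freshly parsed structure like this with what is already in the database.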
on 2007-05-22 05:17
on 2007-05-22 05:26
If you're using MySQL, look into the ON DUPLICATE KEY UPDATE syntax, which will allow you to do a massive insert without clobbering existing rows, updating them instead however you see fit. To use it with ActiveRecord you will need to install ActiveRecord::Extensions by Zach Dennis.
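To make the idea concrete, here is a toy Ruby builder for the kind of statement that syntax produces. The table and column names are made up, and the quoting is illustrative only; real code must escape values properly (ar-extensions handles all of this for you):

```ruby
# Builds a multi-row MySQL INSERT that updates existing rows on key conflict.
# Hypothetical names; naive quoting for illustration only.
def upsert_sql(table, columns, rows)
  tuples  = rows.map { |r| "(" + r.map { |v| "'#{v}'" }.join(", ") + ")" }
  updates = columns.map { |c| "#{c} = VALUES(#{c})" }.join(", ")
  "INSERT INTO #{table} (#{columns.join(', ')}) VALUES " +
    tuples.join(", ") +
    " ON DUPLICATE KEY UPDATE " + updates
end
```

A single statement like this replaces the fetch-compare-save round trip for every row.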
on 2007-05-22 05:36
Thanks. I'm using PostgreSQL. Your suggestion got me searching, though. I could push the "work" down into the database, but the more I thought about it, the more it seemed to only postpone the question: delete or update? PostgreSQL has Rules, which one person used to implement ON DUPLICATE KEY UPDATE.
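For anyone curious, here is a hedged sketch of what that Rules approach might look like. The table and column names (signatures, pmr_id, seq, body) are assumptions, not taken from this thread; the rule rewrites a conflicting INSERT into an UPDATE, emulating an upsert:

```ruby
# DDL for a PostgreSQL rule that turns INSERTs on an existing (pmr_id, seq)
# pair into UPDATEs instead.  Names are hypothetical.
UPSERT_RULE = <<-SQL
  CREATE RULE signatures_upsert AS
    ON INSERT TO signatures
    WHERE EXISTS (SELECT 1 FROM signatures
                   WHERE pmr_id = NEW.pmr_id AND seq = NEW.seq)
    DO INSTEAD
      UPDATE signatures
         SET body = NEW.body
       WHERE pmr_id = NEW.pmr_id AND seq = NEW.seq
SQL
```

You would install it once, e.g. via ActiveRecord::Base.connection.execute(UPSERT_RULE) or a migration, after which plain INSERTs behave as upserts for that table.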
on 2007-05-22 05:58
Well, only you can answer that question. I've used both approaches, but the advantage of ON DUPLICATE KEY UPDATE is an order-of-magnitude increase in speed in the update situation. If you can get away with truncating the table and just re-inserting, though, that is generally the easier thing to do. I would still look at ar-extensions just for the mass INSERT, which offers a huge performance improvement when you are importing thousands of rows.
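The mass-INSERT win comes from replacing N single-row INSERTs with a handful of multi-row ones. A toy builder showing the shape of such batched statements (invented names, illustrative quoting; ar-extensions generates and escapes these for you):

```ruby
# Splits rows into batches and builds one multi-row INSERT per batch.
# Hypothetical table/columns; naive quoting for illustration only.
def batched_inserts(table, columns, rows, batch_size)
  rows.each_slice(batch_size).map do |slice|
    values = slice.map { |r| "(" + r.map { |v| "'#{v}'" }.join(", ") + ")" }.join(", ")
    "INSERT INTO #{table} (#{columns.join(', ')}) VALUES #{values}"
  end
end
```

With thousands of Lines per PMR, sending a few statements like these instead of one INSERT per row is where the big speedup comes from.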
on 2007-05-22 06:08
OK. It is probably better to do it in the DB somehow: the less data that passes from the DB to Rails and back, the better. So I'll dig into the Rules stuff and see if I can figure it out. Thanks again.