I need to store rather large data in the database using blobs. I’ve been
working with Rails and have designed a proof-of-concept setup that works
pretty well, except there’s a considerable performance hit with very
large files, because ActiveRecord loads the entire blob into memory
before the data can be passed to send_data.
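For reference, my proof-of-concept action looks roughly like this (model
and column names simplified; the point is that StoredFile.find
materializes the whole blob as a Ruby string before send_data ever runs):

  class FilesController < ApplicationController
    def download
      stored = StoredFile.find(params[:id])  # entire blob loaded into memory here
      send_data stored.data,
                :filename => stored.filename,
                :type     => stored.content_type
    end
  end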
I know at least one person is going to feel obliged to jump in here and
tell me that I’m an idiot for trying to store large files in the
database. Please don’t; there are valid reasons for doing this here,
including:
- It’s a hard requirement. An existing J2EE system is able to stream
data from the database while allocating only a buffer; I’m trying to
show that Rails can handle it as well.
- There will be multiple instances of the application running on
multiple boxes, all connecting to the same database, so the database is
the only place we can store data that will be accessible to all
instances.
- The database is a big honking box with a fiber-connected SAN, which
will perform much better than the single IDE drives in the application
server boxes, even if we could replicate the files to every instance.
I know that doing this will probably involve some database-specific
code, and I don’t mind rolling up my sleeves and writing code, but I’m
still relatively new to Rails, so if anyone could offer any advice or
input about the best way to approach this or what objects and methods I
need to override, I’d really appreciate it.
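To make the question concrete, here’s the sort of thing I’m picturing:
drop below ActiveRecord to the raw driver and push the blob out a buffer
at a time. This sketch assumes the data lives in a PostgreSQL large
object whose OID is kept in a made-up data_oid column, and I honestly
don’t know whether the Proc form of render is the right place to hook in:

  class FilesController < ApplicationController
    CHUNK_SIZE = 64 * 1024  # stream 64 KB at a time

    def download
      stored = StoredFile.find(params[:id])  # only metadata lives in this table
      conn = ActiveRecord::Base.connection.raw_connection  # underlying PGconn

      headers['Content-Type'] = stored.content_type
      headers['Content-Disposition'] =
        "attachment; filename=\"#{stored.filename}\""

      # The Proc form of render writes straight to the response stream,
      # so only CHUNK_SIZE bytes are held in memory at any moment.
      render :text => Proc.new { |response, output|
        conn.exec('BEGIN')  # large-object calls must run inside a transaction
        fd = conn.lo_open(stored.data_oid)
        while chunk = conn.lo_read(fd, CHUNK_SIZE)
          break if chunk.empty?
          output.write(chunk)
        end
        conn.lo_close(fd)
        conn.exec('COMMIT')
      }
    end
  end

If that’s the wrong hook, or if there’s a blessed way to get at the raw
connection, that’s exactly the kind of thing I’m hoping someone can
point out.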
Thanks much,
Jeff
BTW: My proof-of-concept system is PostgreSQL; the production system (if
and when) will be either Oracle or PostgreSQL. Right now I’m just
interested in PostgreSQL.
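If the data has to stay in a plain bytea column rather than a large
object, the only chunked approach I’ve come up with is paging through
the column with substring() so the server only ships one slice per
query. Untested sketch, names made up; older drivers hand bytea back
escaped, hence the unescape:

  # Yield a bytea column one slice at a time; substring() on bytea is
  # 1-based, and an empty slice means we're past the end of the data.
  def each_chunk(id, chunk_size = 65536)
    conn = ActiveRecord::Base.connection
    offset = 1
    loop do
      row = conn.select_one(
        "SELECT substring(data FROM #{offset} FOR #{chunk_size}) AS chunk " +
        "FROM stored_files WHERE id = #{id.to_i}")
      break unless row && row['chunk']
      chunk = PGconn.unescape_bytea(row['chunk'])  # driver returns bytea escaped
      break if chunk.empty?
      yield chunk
      offset += chunk_size
    end
  end

The same render-with-a-Proc trick above could then call each_chunk and
write each slice to the output.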