Strategy for dealing with large db tables

I’m looking at building a Rails application which will have some pretty
large tables with upwards of 500 million rows. To keep things snappy
I’m currently looking into how a large table can be split into more
manageable chunks. I see that as of MySQL 5.1 there is a native
partitioning feature, which is one possibility, but I don’t like the
way the column that determines the partitioning has to be part of the
primary key on the table.
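
For reference, this is roughly what a range-partitioned table would
look like from a migration (just a sketch - the readings table and its
columns are made up); note how recorded_at gets pulled into the primary
key because it appears in the partitioning expression, which is the
part I’m not keen on:

class CreatePartitionedReadings < ActiveRecord::Migration
  def self.up
    # MySQL 5.1 requires every column used in the partitioning
    # expression to be part of every unique key on the table,
    # hence the composite primary key below.
    execute <<-SQL
      CREATE TABLE readings (
        id          BIGINT NOT NULL AUTO_INCREMENT,
        value       FLOAT,
        recorded_at DATETIME NOT NULL,
        PRIMARY KEY (id, recorded_at)
      )
      PARTITION BY RANGE (TO_DAYS(recorded_at)) (
        PARTITION p2007_11 VALUES LESS THAN (TO_DAYS('2007-12-01')),
        PARTITION p2007_12 VALUES LESS THAN (TO_DAYS('2008-01-01')),
        PARTITION pmax     VALUES LESS THAN MAXVALUE
      )
    SQL
  end

  def self.down
    drop_table :readings
  end
end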

What I’d really like to do is split the table that an AR model writes
to based upon the values being written, but as far as I’m aware there
is no built-in way to do this. Does anyone have any suggestions as to
how I might implement this, or any alternative strategies?
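
To make it concrete, something along these lines is what I’m imagining
(a rough sketch only - the Reading model, the recorded_at column and
the per-month readings_YYYY_MM tables are all hypothetical, and it
assumes the Rails 2-era set_table_name API):

class Reading < ActiveRecord::Base
  # Return a model class bound to the physical table for the given
  # timestamp, e.g. readings_2007_11 for November 2007. Classes are
  # created on demand and registered as named constants so they can
  # be reused on subsequent writes.
  def self.partition_for(time)
    suffix     = time.strftime('%Y_%m')
    class_name = "Reading#{suffix.delete('_')}"
    unless Object.const_defined?(class_name)
      klass = Class.new(Reading)
      klass.set_table_name("readings_#{suffix}")
      Object.const_set(class_name, klass)
    end
    Object.const_get(class_name)
  end
end

# Writes get routed by the value being written:
now = Time.now
Reading.partition_for(now).create!(:value => 42.0, :recorded_at => now)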

Thanks

Arfon

Arfon - you might check this out:

It’s somewhat dated and deals with PostgreSQL, but it’s relevant:

" We currently have about 100GB of data and will soon grow to a multi-
terabyte system. We have tables of up to 1 billion rows and have been
able to get ~1 million row queries to run in about 5 min. "

Rick
