I have a fairly complex MySQL query that spans ~500,000 records and multiple tables. It takes almost 30 seconds to run and generates ~5,000 rows. The query has already been optimised.
Now I want to do the following with the result:

1> Paginate. I do not want to re-run the query with a LIMIT/OFFSET clause each time. Can the object obtained in the first place somehow be stored and reused?
2> Sort on different combinations of columns, again doing all of that work in Ruby.

But how do I save the ActiveRecord object and reuse it? Basically I want to avoid the 30-second delay on every request.
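One way the two points above could work, as a minimal sketch in plain Ruby: run the expensive query once, keep the rows as an array, and do pagination and multi-column sorting on that array in memory. The record shapes and helper names here are hypothetical; in a real Rails app the array could be kept across requests with something like `Rails.cache.write("report_rows", rows)` / `Rails.cache.read("report_rows")`.

```ruby
PER_PAGE = 100

# Return one page of an already-loaded array -- no LIMIT/OFFSET round trip.
def paginate(rows, page, per_page = PER_PAGE)
  rows[(page - 1) * per_page, per_page] || []
end

# Sort on any combination of columns in Ruby instead of re-running the SQL.
def sort_rows(rows, *columns)
  rows.sort_by { |row| columns.map { |c| row[c] } }
end

# Stand-in for the ~5000 rows the real query would return.
rows = [
  { name: "b", total: 2 },
  { name: "a", total: 3 },
  { name: "a", total: 1 },
]

sorted = sort_rows(rows, :name, :total)
page1  = paginate(sorted, 1, 2)
```

Note that sorting 5,000 rows in Ruby is cheap; the only real cost is holding the result set in memory (or serializing it into a cache) between requests.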
Optimise it some more: use EXPLAIN SELECT … and make sure you understand what is going on. Maybe you can restructure your data so that the query isn't so complicated.
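As a small sketch of that advice, assuming the raw SQL is available as a string (the table and column names below are hypothetical): prefixing the statement with EXPLAIN and running it from a Rails console via `ActiveRecord::Base.connection.execute(explain_sql(sql))` shows which indexes MySQL actually uses; a `type` of `ALL` in the plan indicates a full table scan.

```ruby
# Build the EXPLAIN variant of a query; run it against the DB to see the plan.
def explain_sql(sql)
  "EXPLAIN #{sql}"
end

sql = "SELECT o.id, c.name FROM orders o JOIN customers c ON c.id = o.customer_id"
plan_query = explain_sql(sql)
```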
Also, that sounds like a very expensive query to leave in the hands of indiscriminate users. It's costly enough that there should probably be another way to do it. Any chance it's a once-per-day type job that you could offload to a background process or cron job?
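The cron idea above could be sketched like this: a scheduled task runs the expensive query once, marshals the rows to disk, and web requests load the file instead of hitting MySQL. The file path and the fake `run_expensive_query` below are hypothetical stand-ins for the real query.

```ruby
require "tmpdir"

CACHE_FILE = File.join(Dir.tmpdir, "report_rows.dump")

# Stand-in for the real 30-second query.
def run_expensive_query
  [{ id: 1, total: 10 }, { id: 2, total: 20 }]
end

# Run from cron, e.g.: 0 2 * * * cd /app && rake report:refresh
def refresh_cache
  rows = run_expensive_query
  File.binwrite(CACHE_FILE, Marshal.dump(rows))
end

# Called by the web layer -- fast, no SQL at all.
def cached_rows
  Marshal.load(File.binread(CACHE_FILE))
end

refresh_cache
rows = cached_rows
```

The same pattern works with Rails.cache or a memcached store instead of a file; the key point is that only the scheduled job pays the 30 seconds.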
Another possibility would be warehousing the data (e.g. with ActiveWarehouse, if that project is still active) and supporting queries only against the warehoused data.