How to avoid records that are not in the database without reindexing Solr

I have a model Ticket which has 4 records. I deleted the first 2 records in the database, then ran this search:

search = Sunspot.new_search(Ticket) do
  paginate(:page => 1, :per_page => 2)
end

list = search.execute
list.results

list.results returns [] for page 1 and the remaining 2 records for page 2.

Ideally it should be a single page with 2 records. Is it possible to avoid the records that are not in the database without re-indexing Solr?
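
For what it is worth, here is a minimal way to see what is happening (a sketch; it assumes Sunspot's hit objects respond to #primary_key, as they do in the versions I have used): Solr still returns the two deleted documents on page 1, but they no longer load from the database, so results comes back empty.

search = Sunspot.new_search(Ticket) do
  paginate(:page => 1, :per_page => 2)
end
search.execute

search.hits.map(&:primary_key)  # ids Solr still has indexed, including the deleted rows
search.results                  # [] because those rows can no longer be loaded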

You could filter them from the array with #compact*, I think, but
ideally, you would want to re-index. That’s the problem with any search
technique that doesn’t hit the live data directly. Yes, Solr is much
faster and can do way more tricks, but it’s not “live”. How long does
your index process take? Could it be done after a delete without holding
up the whole system for a ridiculous duration?

Walter

*Actually, compact alone won’t do it, because your results are not in an
array yet. You could cast the results to an array, or you could pass the
current “live” IDs into the search query. No idea how to do that in
Solr, but hypothetically, it would be something like this (MySQL) query:

SELECT * FROM tickets AS t1 WHERE [your search here] AND t1.id
IN (SELECT id FROM tickets WHERE 1);
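
In Sunspot terms, a rough sketch of both ideas follows. It rests on two assumptions from the versions I have used: that Hit#result returns nil for a record that no longer exists, and that restricting on :id only works when the id is indexed as a field (e.g. integer :id inside Ticket's searchable block).

# 1) Filter in Ruby: apply the #compact idea to the hits.
search = Sunspot.new_search(Ticket) do
  paginate(:page => 1, :per_page => 2)
end
search.execute
live_results = search.hits.map(&:result).compact

# 2) Pass the current "live" IDs into the search query, the Sunspot
#    counterpart of the MySQL sketch above. This assumes `integer :id`
#    is declared in the searchable block.
live_ids = Ticket.pluck(:id)
search = Sunspot.new_search(Ticket) do
  with(:id, live_ids)
  paginate(:page => 1, :per_page => 2)
end
search.execute

Note that option 1 still leaves the pagination counts off, because Solr keeps counting the deleted documents, which is why cleaning them out of the index (or re-indexing) is still the real fix.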

Why not delete the records from Solr? Sunspot::Rails can handle this for you.

Fred
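
A sketch of that route, assuming the rows were removed with delete/delete_all or raw SQL (which bypass the callbacks Sunspot::Rails uses to keep the index in sync). The example ids, the paging limit and the id arithmetic below are illustrative, not anything Sunspot requires:

# If you still know which ids you deleted, removing them is enough
# (ids 1 and 2 here are just an example):
Sunspot.remove_by_id!(Ticket, 1, 2)

# Otherwise, diff the indexed ids against the database. This assumes the
# index is small enough to page through in one request.
searched    = Sunspot.search(Ticket) { paginate(:page => 1, :per_page => 10_000) }
indexed_ids = searched.hits.map { |hit| hit.primary_key.to_i }
stale_ids   = indexed_ids - Ticket.pluck(:id)
Sunspot.remove_by_id!(Ticket, *stale_ids) unless stale_ids.empty?

Going forward, destroying records with destroy rather than delete lets Sunspot::Rails remove the matching Solr document for you.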