Which is faster for repetition: Ruby or SQL?

Hey all,

I was wondering if there was a general consensus for whether it’s faster
to do repetitive queries solely through Ruby (as on an array), or to
make multiple calls to a SQL database of some sort.

For example, if I have a dictionary, and I know I need to compare an
entire paragraph’s worth of words to the dictionary, should I query each
word against the database? Or load the entire dictionary into a Ruby
object and index against that?

Thanks for your thoughts!
-Jason

On Tue Aug 26 03:44:17 2008, Jason C. wrote:

Hey all,

I was wondering if there was a general consensus for whether it’s faster
to do repetitive queries solely through Ruby (as on an array), or to
make multiple calls to a SQL database of some sort.

Almost certainly Ruby, but be careful about RAM usage if you load the
whole thing into an array. You might want to run some benchmarks[1] on
both methods.

[1] http://www.ruby-doc.org/stdlib/libdoc/benchmark/rdoc/

On Mon, Aug 25, 2008 at 1:44 PM, Jason C. [email protected]
wrote:

Thanks for your thoughts!
-Jason

Depends on your database, how you connect to it (net/local), and the
size/volatility of your paragraph/dictionary. I’d say split the words
out of the string and build a single SQL query in Ruby (probably the
brunt of it will go in the ‘where’ clause).

I would wager it would be far faster than either of your proposed
options. With this approach, the bottleneck would probably be the
connection. Something to think about, anyway.

Todd

Cool. Thanks for your help, all. I’ll give both methods a try for my
application, and see what works best!

Jason C. wrote:

I was wondering if there was a general consensus for whether it’s faster
to do repetitive queries solely through Ruby (as on an array), or to
make multiple calls to a SQL database of some sort.

For example, if I have a dictionary, and I know I need to compare an
entire paragraph’s worth of words to the dictionary, should I query each
word against the database? Or load the entire dictionary into a Ruby
object and index against that?

As Todd says, it will depend on the database, the table structure and
query, and the connection and interface. It will also depend on the
relative importance of start-up time (reading a large dictionary into
Ruby will take some time), memory usage and complexity.
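For the in-memory route, a Set keeps lookups cheap after that one-time load. A minimal sketch (the dictionary contents and file name here are invented):

```ruby
require 'set'

# Pay the start-up cost once; in practice you might do
#   dict = Set.new(File.readlines("dict.txt", chomp: true))
dict = Set.new(%w[apple banana cherry])

# Then each membership check is a constant-time hash lookup.
paragraph = "Apple pie and banana bread"
words = paragraph.downcase.scan(/[a-z']+/)
known = words.select { |w| dict.include?(w) }
# => ["apple", "banana"]
```

Note that an Array with `include?` would scan linearly on every check, so for a large dictionary a Set (or Hash) is the structure to reach for.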

I use SQLite3 as the backend for a fairly complex desktop application.
The time taken for the SQL backend to execute a query is generally
trivial compared to the time to convert the rows into Ruby objects. Ruby
objects (in 1.8, less so in 1.9) have significant method-call overhead
which adds up if very many calls need to be made to complete a single
request. SQL engines are specialised and optimised for making queries;
more so if you help by defining the correct INDEXes on TABLEs.

Overall, if performance is an issue, you must make use of benchmark or
similar:

require 'benchmark'
TIMES = 10_000

# start-up

puts Benchmark.measure { TIMES.times { load_ruby_dict } }
puts Benchmark.measure { TIMES.times { connect_to_sql_dict } }

# execute

puts Benchmark.measure { TIMES.times { find_using_ruby } }
puts Benchmark.measure { TIMES.times { find_using_sql } }

http://www.ruby-doc.org/stdlib/libdoc/benchmark/rdoc/index.html

alex
