Hi,
I was just doing some simple tests using DBI, and I’ve found that the
performance when using MySQL as the database is a lot slower than using
SQL Server.
Here is my setup:
- machine: Pentium 4, 2.80GHz, 1GB RAM
- OS: Windows XP Service Pack 2
- ruby: Ruby Installer for Windows, Ruby version 1.8.5p12, Installer
version 185-22
- DBI: 0.1.1
- MySQL: mysql-essential-5.0.27
- SQL Server: SQL Server Express 2005
Both database systems were installed with default options; in each, I
created the database "test" with the following table:
CREATE TABLE t (id INT NOT NULL, s VARCHAR(50))
The simple test program I used was as follows (for SQL Server):
require 'dbi'

t0 = Time.new.to_f
db = DBI.connect('DBI:ODBC:Driver={SQL Native Client};Server=(local)\SQLEXPRESS;Database=test;Trusted_Connection=yes')
0.upto(10000) do |id|
  s = "String #{id}"
  sql = "INSERT INTO t VALUES (#{id}, '#{s}')"
  db.do(sql)
end
db.disconnect
t1 = Time.new.to_f
puts (t1 - t0)
The only difference when testing with MySQL was the line connecting to
the database:

db = DBI.connect('DBI:Mysql:test', 'test_user', 'test_pwd')
The results I got (in seconds) were as follows:
(a) SQL Server : 10.xxxxx
(b) MySQL : 197.xxxxx
So I'm curious: why is there such a big difference between the two? Is
it because SQL Server installs by default with some sort of optimised
options whereas MySQL does not? Or does it come down to the drivers
being used (ODBC vs. Mysql)?
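One variable worth eliminating: in the code above, every INSERT is its own transaction, and MySQL's InnoDB engine flushes its log to disk on each commit by default, so 10,000 statements can mean 10,000 flushes. A sketch of the same loop with AutoCommit off and a single commit at the end, using DBI's AutoCommit handle attribute and a prepared statement (the insert_all helper name is just for illustration, not part of DBI):

```ruby
# Sketch: commit once instead of once per statement, and parse the
# INSERT once with a prepared statement. Assumes DBI's AutoCommit
# attribute and prepare/execute behave as documented; insert_all is a
# made-up helper name.
def insert_all(db, n)
  db['AutoCommit'] = false                        # stop committing per INSERT
  sth = db.prepare('INSERT INTO t VALUES (?, ?)') # parse once, bind per row
  0.upto(n) { |id| sth.execute(id, "String #{id}") }
  sth.finish
  db.commit                                       # one commit for all rows
end

# Usage (with the MySQL connection from above):
#   db = DBI.connect('DBI:Mysql:test', 'test_user', 'test_pwd')
#   insert_all(db, 10000)
```

If the gap shrinks dramatically with this change, the difference is mostly per-statement commit cost rather than the driver itself.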
(And just for fun, I did a similar test using C# with SQL Server, and
the result I got with that was 9.xxxx).
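Another experiment that might be interesting, at least on MySQL (which accepts multi-row INSERT syntax), is sending the rows in batches so each round trip carries many values. A sketch, where batch_insert_sql is a made-up helper and the batch size of 500 is an arbitrary choice:

```ruby
# Sketch: build multi-row statements of the form
#   INSERT INTO t VALUES (0, 'String 0'), (1, 'String 1'), ...
# so one statement (and one round trip) carries many rows.
# batch_insert_sql is a made-up helper; 500 is an arbitrary batch size.
def batch_insert_sql(rows)
  values = rows.map { |id, s| "(#{id}, '#{s}')" }.join(', ')
  "INSERT INTO t VALUES #{values}"
end

rows = (0..10000).map { |id| [id, "String #{id}"] }
i = 0
while i < rows.length
  sql = batch_insert_sql(rows[i, 500])
  # db.do(sql)   # one statement per 500 rows instead of per row
  i += 500
end
```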