ActiveRecord hang with MySQL and large data

I can consistently reproduce a hang in the following place.

--> #0 /usr/lib/ruby/gems/1.8/gems/ruby-debug-base-0.9.3/lib/ruby-debug-base.rb:74 in 'interrupt_last'
    #1 /usr/lib/ruby/gems/1.8/gems/ruby-debug-0.9.3/bin/rdebug:100
    #2 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/vendor/mysql.rb:1094 in 'read'
    #3 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/vendor/mysql.rb:514 in 'read'
    #4 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/vendor/mysql.rb:396 in 'read_query_result'
    #5 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/vendor/mysql.rb:194 in 'real_query'
    #6 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/vendor/mysql.rb:322 in 'query'
    #7 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/mysql_adapter.rb:243 in 'execute'
    #8 /usr/lib/ruby/1.8/benchmark.rb:307 in 'realtime'
    #9 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/abstract_adapter.rb:110 in 'log'
    #10 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/mysql_adapter.rb:243 in 'execute'
    #11 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/mysql_adapter.rb:275 in 'rollback_db_transaction'
    #12 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/connection_adapters/abstract/database_statements.rb:64 in 'transaction'
    #13 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:95 in '[]'
    #14 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:121 in 'transaction'
    #15 /usr/lib/ruby/gems/1.8/gems/activerecord-1.15.3/lib/active_record/transactions.rb:133 in 'save!'
    #16 (eval):6
    #17 /usr/lib/ruby/gems/1.8/gems/rails-1.2.3/lib/commands/runner.rb:45 in 'read'
    #18 /usr/local/lib/site_ruby/1.8/rubygems/custom_require.rb:27 in 'require'
    #19 /tmp/test/script/runner:3

Here is a way to test it.

cd /tmp
rails test
cd test
script/generate model example data:binary
sed -i 's/:binary/:binary, :limit=>128.megabyte/' db/migrate/001_create_examples.rb
echo "create database test_development;" | mysql -u root
rake db:migrate
echo -e 'data = ""\n25.megabyte.times { data << " " }\nexample = Example.new\nexample.data = data\nputs "save"\nexample.save!' > break
rdebug /tmp/test/script/runner break
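
For readability, the echo -e one-liner above writes the following script to the file named break (same code, just expanded):

data = ""
25.megabyte.times { data << " " }   # build 25 MB of spaces, one byte at a time
example = Example.new
example.data = data
puts "save"
example.save!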

Type "c" at the rdb prompt to start the script. If you don't have
rdebug, run "gem install ruby-debug".
Give the script another 10-15 seconds after it prints "save".
Hit Ctrl+C.
Type "w" at the rdb prompt to see a stack trace. It should be similar
to the trace above.

To clean up:

echo "drop database test_development;" | mysql -u root
cd ..
rm -rf test

I don't understand why it hangs. I also don't understand why it is
trying to roll back the transaction. As evidenced by the migration, we
set the limit to 128 megabytes, and we are only trying to save 25
megabytes of data here. Why should it hang?
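
For reference, after the sed edit the migration's column definition should look roughly like this (the LONGBLOB mapping is my assumption, based on how the MySQL adapter translates large :binary limits):

create_table :examples do |t|
  t.column :data, :binary, :limit => 128.megabyte  # should come out as LONGBLOB on MySQL
end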

Here is some more information.
$ uname -a
Linux gicodex2 2.6.20-16-generic #2 SMP Thu Jun 7 20:19:32 UTC 2007
i686 GNU/Linux
$ cat /etc/issue.net
Ubuntu 7.04
$ rails --version
Rails 1.2.3
$ mysql --version
mysql Ver 14.12 Distrib 5.0.38, for pc-linux-gnu (i486) using
readline 5.2

I would appreciate any information on how to prevent hangs on large
blocks of data like this.

Thank you,
Rusty B.

On Jul 23, 6:41 pm, Rusty B. [email protected] wrote:

echo -e 'data = ""\n25.megabyte.times { data << " " }\nexample = Example.new\nexample.data = data\nputs "save"\nexample.save!' > break
rdebug /tmp/test/script/runner break

Type "c" at the rdb prompt to start the script. If you don't have
rdebug, run "gem install ruby-debug".
Give the script another 10-15 seconds after it prints "save".
Hit Ctrl+C.
Type "w" at the rdb prompt to see a stack trace. It should be similar
to the trace above.

Has anyone else been able to reproduce this?

~Rusty

On Jul 24, 1:22 pm, Rusty B. [email protected] wrote:

Has anyone else been able to reproduce this?

It hangs on the same line when using edge Rails (revision 7235). :(

It does not hang with PostgreSQL as a backend.

I have filed a bug here:
http://dev.rubyonrails.org/ticket/9087

~Rusty

This might be a different issue; however, I have experienced problems
with MS SQL in the past when trying to add too much data at once (50
megs plus). It seemed to only handle a certain amount of data at once.
The 25 megs you are sending could be too much for MySQL. Look into the
MySQL docs for that information. It would also explain why Postgres
works and MySQL doesn't.

On Jul 24, 11:38 pm, Camo [email protected] wrote:

This might be a different issue; however, I have experienced problems
with MS SQL in the past when trying to add too much data at once (50
megs plus). It seemed to only handle a certain amount of data at once.
The 25 megs you are sending could be too much for MySQL. Look into the
MySQL docs for that information. It would also explain why Postgres
works and MySQL doesn't.

After some further testing, it breaks at around 8MB of data. Playing
with sizes around 8388500 bytes (7.999MB), it sometimes generates an
ActiveRecord::StatementInvalid exception that even crashes rdebug on
occasion.

The real exception is hidden behind 8MB of SQL data:
/tmp/test/vendor/rails/activerecord/lib/active_record/connection_adapters/abstract_adapter.rb:135:in `log': Mysql::Error:
#08S01Got a packet bigger than 'max_allowed_packet' bytes

My max_allowed_packet size is set to 16MB. Looking at the SQL data
though, the spaces are sent hex-encoded as '2020202020'..., thus
doubling the effective size. Why doesn't ActiveRecord split the data
into multiple packets? I highly doubt that setting max_allowed_packet
to a huge value is a good idea.
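
To make the arithmetic concrete (a rough irb sketch, not the adapter's actual code path), hex encoding emits two characters per byte, so 25 MB of spaces turns into roughly 50 MB of SQL text, well past a 16MB max_allowed_packet:

data = " " * (25 * 1024 * 1024)   # 25 MB of spaces
hex  = data.unpack("H*").first    # "202020..." -- two hex characters per byte
hex.size                          # => 52428800, about double the original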

Even if ActiveRecord is designed to be so limited, it shouldn’t hang.

~Rusty

On Jul 25, 9:17 pm, Rusty B. [email protected] wrote:

ActiveRecord still shouldn’t hang when a packet is too large.

In addition, increasing the server limit seems to have no effect.
Does anyone know where the client settings for the mysql driver can be
set?

The fix is all too simple. Install the native MySQL driver. This
fixes the hang and the restriction on packet size.

Here is a link to an old bug.
http://dev.rubyonrails.org/ticket/1189

I don't know the name of the native gem, so I just installed the
package for my OS.
apt-get install libmysql-ruby
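
If you would rather go the gem route, I believe (untested here, so treat the name as a guess) the native C bindings are packaged as the mysql gem and build against the MySQL client headers:

gem install mysql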

~Rusty

On Jul 25, 2:11 am, Rusty B. [email protected] wrote:

My max_allowed_packet size is set to 16MB. Looking at the SQL data
though, the spaces are sent hex-encoded as '2020202020'..., thus
doubling the effective size. Why doesn't ActiveRecord split the data
into multiple packets? I highly doubt that setting max_allowed_packet
to a huge value is a good idea.

Even if ActiveRecord is designed to be so limited, it shouldn’t hang.

I guess it is MySQL that has the single-packet-per-query limitation:
http://dev.mysql.com/doc/refman/5.0/en/packet-too-large.html

It says the upper limit on packet size is 1GB. Since the hex encoding
doubles the data, that puts the maximum usable BLOB size at about
512MB. I guess the 4GB size of the LONGBLOB is simply for show...
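
For anyone who wants to inspect or bump the server-side limit, the same echo/mysql style as in the repro steps works (64MB is just an example value, and as I note below it did not actually help in my case):

echo "SHOW VARIABLES LIKE 'max_allowed_packet';" | mysql -u root
echo "SET GLOBAL max_allowed_packet = 67108864;" | mysql -u root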

ActiveRecord still shouldn’t hang when a packet is too large.

In addition, increasing the server limit seems to have no effect.
Does anyone know where the client settings for the mysql driver can be
set?

~Rusty