eskim wrote:
> I decided to do it by making temporary files and then sending them to
> the clients using the function 'render_file'.
> This seems reasonable and takes a little time, but I can stand it, and
> hopefully the clients can too.
> Do you guys agree on this?
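For reference, the temp-file approach you describe would look roughly
like this; I'm using send_file here instead of render_file just for
illustration, and the controller, model, and column names are all made
up:

class ReportsController < ApplicationController
  def export
    path = File.join(RAILS_ROOT, 'tmp',
                     "export_#{Time.now.to_i}_#{Process.pid}.csv")

    # Build the temporary file first...
    File.open(path, 'w') do |f|
      Order.find(:all, :conditions => ['created_at > ?', 1.week.ago]).each do |order|
        f.puts [order.id, order.total, order.created_at].join(',')
      end
    end

    # ...then hand it back to the client. Cleaning up old files is left
    # to a cron job or sweeper of your choosing.
    send_file path, :filename => 'export.csv', :type => 'text/csv'
  end
end

With result sets the size you're describing, the find(:all) in there is
exactly the part that will bite you; more on that below.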
So you know, there is a side effect in the mysql/ruby bindings that
causes massive memory utilization when you iterate over thousands and
thousands of MySQL results. It happens when 'fetch_hash' is called,
because that is what turns all of your C strings into Ruby Strings.
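The pattern I'm talking about, with the plain mysql bindings (table,
column, and connection details are hypothetical):

require 'mysql'

my  = Mysql.real_connect('localhost', 'user', 'pass', 'app_production')
res = my.query('SELECT * FROM big_table')

# Every fetch_hash call builds a fresh Hash full of newly allocated Ruby
# Strings (one per column), so walking hundreds of thousands of rows
# creates an enormous number of short-lived Ruby objects.
while row = res.fetch_hash
  # work with row['id'], row['payload'], ...
end

res.free
my.close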
I would recommend that you consider NOT processing this many results
inside one of your dispatchers, and instead use a separate process that
you can start/stop/kill outside of your Rails code. Otherwise your
dispatchers will grow huge (unless you don't mind them getting killed).
A rough sketch of what I mean follows.
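For example, something along these lines, run through script/runner
from cron or by hand; the class, model, and batch size are placeholders:

# lib/export_job.rb -- run with:
#   ruby script/runner -e production 'ExportJob.run'
class ExportJob
  BATCH = 1_000

  def self.run
    offset = 0
    loop do
      rows = Order.find(:all, :order => 'id',
                        :limit => BATCH, :offset => offset)
      break if rows.empty?
      rows.each { |order| process(order) }
      offset += BATCH
    end
  end

  def self.process(order)
    # the real per-record work goes here
  end
end

Because it is its own process, you can monitor it, nice it, or kill it
without ever touching the dispatchers serving web requests.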
Also, if you constantly process large sets of different data (say you
have 10,000,000 rows and you repeatedly work through different sets of
200,000 of them at a time), you are NOT going to maintain consistent
memory utilization just because each pass handles the same 200,000
records. Instead, memory utilization will climb with each iteration,
although it does eventually taper off. I posted about this on ruby-core
earlier this year; here are some of my results:
http://blade.nagaokaut.ac.jp/cgi-bin/scat.rb/ruby/ruby-core/7463
Given the amount of memory you have been allotted, this may not be a
problem for you.
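If you want to watch this happen on your own box, a quick-and-dirty
check is to log the resident set size after each batch. Reading /proc
is Linux-specific, and the model and batch size are again placeholders:

def rss_kb
  # VmRSS from /proc/<pid>/status, in kB (Linux only)
  File.read("/proc/#{Process.pid}/status")[/VmRSS:\s+(\d+)/, 1].to_i
end

10.times do |i|
  rows = Order.find(:all, :limit => 200_000, :offset => i * 200_000)
  rows.each { |r| r.attributes }   # stand-in for the real work
  rows = nil
  GC.start                         # just to make the numbers less noisy
  puts "iteration #{i}: #{rss_kb} kB resident"
end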
Be sure to benchmark and profile when you are working with this much
data. Ruby and Rails both make things so easy for developers that it is
easy to fall into the trap of writing the same code whether you are
dealing with 10 records or 100,000. The difference is that 100,000
records have a much larger impact on resource utilization, and you need
to be aware of how your code behaves when that much data is being
processed. It will save you headaches down the road if your system ever
runs out of physical memory and starts leaning heavily on swap, etc.
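The standard Benchmark module is enough to get started, and ruby-prof
or the bundled profile library will tell you where the time actually
goes. Something like this (the query and row counts are made up):

require 'mysql'
require 'benchmark'

my = Mysql.real_connect('localhost', 'user', 'pass', 'app_production')

Benchmark.bm(24) do |x|
  x.report('200k rows via fetch_hash') do
    res = my.query('SELECT * FROM big_table LIMIT 200000')
    while row = res.fetch_hash; end   # just the fetch, no real work
    res.free
  end
end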
With large data sets you will also NOT get away with logic bugs or
poorly chosen algorithms in the code that manipulates that data, the
way you might with a small set.
You may already be aware of all this; if so, keep on trucking…
Zach