Speed of generating a PDF


I have been using PDF::Writer to generate PDF reports for our
system, and it has worked fine, but speed is becoming an issue for us.
A report can have around 1000 records with about 20-30 columns
rendered into a table, so the time it takes to generate those
reports is very high, and sometimes the MySQL connection times out.

Can anyone suggest another way of generating the PDF, or a way of
reducing the time it takes to generate the PDF with PDF::Writer?


It sounds like this is the sort of thing that could be handled in the
background, or during times when server usage is low. How does MySQL
relate to the generation of the PDFs? Why can't you just read a ton of
records into memory and work from there?

On 08/02/2008, [email protected] wrote:

Can anyone suggest another way of generating the PDF, or a way of
reducing the time it takes to generate the PDF with PDF::Writer?

Maybe try setting split_rows to true (the default is false).
This should lower the number of calculations used to render the table.


Well, the data is coming from a database table, but for each record
returned we have to do some Ruby processing to derive additional
information for that record.
So not everything comes straight from a database table; there is
dynamic content attached to it as well.

On Feb 8, 12:58 pm, Dave S. [email protected] wrote:

I have the same issue - generating 300+ PDFs that are 1-6 pages long -
and it gets really slow. What I've read says that table generation is
the slowest part of PDF::Writer, so I tested eliminating tables and
reducing the number of tables in each document, and I got much faster
generation times (it was taking 30+ seconds per PDF with lots of
tables, and now it takes just a few seconds with only one table).

Hope that helps

On Feb 8, 12:45 pm, [email protected] wrote:


In my view I am using just one table, and I tried setting
split_rows=true, but that didn't help.
Also, I didn't understand much of the first point. Can you elaborate
on it a little more?



Two recommendations. The first becomes obvious if you think of the
PDF as a “view” rather than a document/report. Since you’re dealing
with a view, consider paging through the data rather than reading all
1K+ records at once. This might be “chattier” than you’d like but
reducing the memory footprint will be a help to the other users of
your system and it’ll greatly reduce the chances of losing your MySQL
connection. In a similar situation, I set a query size of 100
records, then calculated the number of “pages” that I needed to read
from the DB and iterated through the records until I was done.
Suddenly all the memory errors went away. :)
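The batching described above can be sketched in plain Ruby. Here `fetch_page` is a hypothetical stand-in for however you actually query the database (e.g. a SELECT with LIMIT/OFFSET), and the in-memory array just simulates the table:

```ruby
# Page through a large result set in fixed-size batches instead of
# loading everything at once. `fetch_page` is a hypothetical stand-in
# for a real query such as: SELECT ... LIMIT size OFFSET offset
PAGE_SIZE = 100
total_records = 1050                      # e.g. from SELECT COUNT(*)

# Simulated data source standing in for the database table.
all_records = (1..total_records).to_a
fetch_page = lambda { |offset, size| all_records[offset, size] || [] }

# Calculate how many "pages" of records we need to read.
pages = (total_records.to_f / PAGE_SIZE).ceil

processed = 0
pages.times do |page|
  batch = fetch_page.call(page * PAGE_SIZE, PAGE_SIZE)
  batch.each { |record| processed += 1 }  # render each record into the PDF here
end

puts pages      # => 11
puts processed  # => 1050
```

Each iteration holds at most 100 records in memory, so the footprint stays small no matter how large the table grows.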

The second recommendation has to do with PDF::Writer itself. As far
as I have been able to determine, the greatest part of the slowdown in
PDF::Writer is table generation. PDF::Writer uses some very
time-consuming calculations to determine whether a set of related rows
should be "split" or not. I was able to greatly increase the rendering
speed by passing :split_rows => true to the table generator. This turns
off the expensive calculation and greatly speeds up rendering.
I think this change cut the rendering time of my table
(~1400 rows) by two orders of magnitude. In our specific situation we
were creating one table for every record, so we determined the number
of records we could render on each page and then handled the paging
ourselves. It was a little extra code, but the speed-up was
well worth it.
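One way to handle the paging yourself, as described above, is to pre-compute how many rows fit on a page and slice the data into per-page chunks; each chunk then becomes one small table rendered with :split_rows => true, followed by a page break. A rough sketch (the rows-per-page figure and the row count are made-up values you would measure for your own fonts and row heights):

```ruby
ROWS_PER_PAGE = 40  # assumed capacity per page; measure for your layout
rows = Array.new(1400) { |i| { "id" => i + 1 } }  # stand-in for real data

# Slice the full data set into per-page chunks. Each chunk would be
# rendered as its own small table (with split_rows enabled so
# PDF::Writer skips its expensive row-splitting calculation), then a
# page break would be emitted before the next chunk.
chunks = rows.each_slice(ROWS_PER_PAGE).to_a

puts chunks.size        # => 35
puts chunks.first.size  # => 40
```

Keeping each table small this way sidesteps the per-table cost growing with the full 1400-row data set.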


On Feb 8, 12:48 pm, [email protected] wrote: