PostgreSQL

Hi @ all

Does anyone have experience with PostgreSQL and RoR? Are there any
disadvantages or problems with PostgreSQL? I have a web application
that manages (reads/writes) a lot of records (about 200,000), and I'm
looking for the ideal database. At the moment I have two options:
MySQL or PostgreSQL. Now I would like to work out which is the better
solution…

Thx, ribit

I use Postgres for all of my Rails apps and I love it. In particular,
for searching, tsearch2 gives you tremendous flexibility, like building
indexes on fields from multiple tables and returning the results ranked
by relevance, etc…

People say it’s slower than MySQL, but I’ve never had any noticeable
performance problems (with the right indexes and regular vacuuming you
can boost Postgres’s performance a lot).

One of my apps searches a tsearch index of 75,000 records and returns
the first page of the result set, with 4 joins, in under 1 second, and
I know of a few areas where I could improve this further.
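
For anyone curious what that looks like in practice, here is a minimal
sketch (the table and column names are made up). It uses the full-text
syntax that later PostgreSQL versions ship in core; the older tsearch2
contrib module works the same way with slightly different function
names:

  -- keep a tsvector column and index it (hypothetical "articles" table)
  ALTER TABLE articles ADD COLUMN search_vector tsvector;
  UPDATE articles SET search_vector =
    to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, ''));
  CREATE INDEX articles_search_idx ON articles USING gin (search_vector);

  -- ranked search, first page only
  SELECT a.id, a.title, ts_rank(a.search_vector, q) AS rank
  FROM articles a, to_tsquery('english', 'rails & postgres') AS q
  WHERE a.search_vector @@ q
  ORDER BY rank DESC
  LIMIT 20;

In a real app you would also keep search_vector current with a trigger
(or recompute it from the model) whenever the row changes.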

www.jefdean.com
[email protected]
917 414 7801

Thomas Muster wrote:

Hi @ all

Does anyone have experience with PostgreSQL and RoR? Are there any
disadvantages or problems with PostgreSQL? I have a web application
that manages (reads/writes) a lot of records (about 200,000), and I'm
looking for the ideal database. At the moment I have two options:
MySQL or PostgreSQL. Now I would like to work out which is the better
solution…

Thx, ribit

I’ve shipped several commercial Rails apps on top of Postgres. It’s
perfectly fine. Keep an eye on the size of the data files and make sure
you run VACUUM at least once a day (more often if you do a lot of
UPDATEs or DELETEs). I generally prefer Postgres for OLTP apps and MySQL
for OLAP. In your case, the data size is so small you probably won’t
notice a difference.
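
If you want to keep an eye on the size from inside psql rather than on
the filesystem, recent Postgres versions have built-in size functions;
a quick sketch (the table name is hypothetical):

  -- overall on-disk size of the current database
  SELECT pg_size_pretty(pg_database_size(current_database()));

  -- size of a single table including its indexes
  SELECT pg_size_pretty(pg_total_relation_size('articles'));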

That’s right, the data size is small (200,000 records). But another
application runs concurrently and UPDATEs records, so I have quite a
few transactions (about 10,000 per day). Do you think PostgreSQL is the
better solution for me with these transactions? And can I automate the
VACUUM? I don’t have time to start it by hand every day.

Yeah, later versions of Postgres ship with autovacuum built in and
enabled, which works well for most applications. Postgres uses an
MVCC architecture, which basically means that an UPDATE is really an
insert plus a mark-as-deleted, and a vacuum is needed periodically to
clean out all the “dead” rows. You will want to tune the autovacuum
settings to make sure that you are cleaning up often enough and that
the number of “dead” rows doesn’t get too large.
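
As a rough idea of what watching the dead rows can look like: the
statistics views expose them on current releases (column names vary a
little between versions), and a heavily UPDATEd table can be given
tighter per-table autovacuum settings than the global default. A
sketch, with a hypothetical table name:

  -- how many dead rows each table is carrying, and when autovacuum last ran
  SELECT relname, n_dead_tup, last_autovacuum
  FROM pg_stat_user_tables
  ORDER BY n_dead_tup DESC;

  -- vacuum a hot table more aggressively than the default
  ALTER TABLE articles SET (autovacuum_vacuum_scale_factor = 0.05);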

In general (I don’t want to start a flame war), MySQL tends to be
faster for simple queries with few connections, while Postgres tends
to do better with complicated joins and many connections.

FWIW, I love Postgres and have had nothing but ease building apps on top
of it.

10,000 transactions a day is a pretty light load. You can invoke VACUUM
from a nightly cron job and you’ll be fine. As I suggested, just watch
the disk consumption of your Postgres data files. Proper vacuuming will
keep it in bounds.
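
For reference, the nightly job can be as simple as having cron run psql
(or the vacuumdb wrapper) against the database; the SQL it needs to
execute is just:

  -- whole database: reclaim dead rows and refresh planner statistics
  VACUUM ANALYZE;

  -- or only the heavily updated table, more often (name is hypothetical)
  VACUUM VERBOSE ANALYZE articles;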

I find that MySQL performance is fine when queries dominate the
workload. If you are doing a lot of INSERTs and UPDATEs, especially with
tables that have a lot of indexes, I would prefer Postgres.