Database best practices?

I started with Rails a few weeks ago and I've been very impressed with
the whole framework. My first project after the cookbook was a small
application connecting to Postgres. It was originally a port of an
Access application, so I was delighted with the new facilities for
constraint checks, triggers, etc.

As I started to write the front end, though, I noticed myself rewriting
the same checks in my models so that Rails can validate data before it
hits the database. Otherwise the database throws an error instead of
Rails reporting it through that nice ‘errors’ hash. I also began
contemplating models that use before_save and similar callbacks (instead
of triggers) to log changes, etc.
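
Roughly what I have in mind (just a sketch; the AuditLog model and its
columns are names I'm inventing to show the idea):

class Order < ActiveRecord::Base
  # log updates from the model instead of a database trigger;
  # AuditLog and its columns are made-up names, purely illustrative
  def before_save
    unless new_record?
      AuditLog.create(:record_type => self.class.name,
                      :record_id   => id,
                      :logged_at   => Time.now)
    end
    true  # make sure this callback never halts the save by accident
  end
end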

So to test things out I'm refactoring again using an SQLite database
(just to keep it simple for now) and keeping all of the code in Rails.
I use a migration to set up and populate the tables, and then put all
my checks in the models.
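
For instance, a check that used to live in Postgres as a constraint
becomes a plain validation (a sketch; the Product model and its columns
are invented for illustration):

class Product < ActiveRecord::Base
  # roughly what NOT NULL plus CHECK (price > 0) enforced in Postgres,
  # expressed here so failures land in the errors hash instead of
  # raising a database exception
  validates_presence_of     :name, :price
  validates_numericality_of :price

  def validate
    errors.add(:price, "must be greater than zero") if price && price <= 0
  end
end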

So far it's working swimmingly (it's a very small app), but I was
wondering what the pros and cons of this setup are.

Performance isn't an issue with my current project, but what can I
expect in the future? My first crack at the problem involved checking
data twice anyway, so this isn't any slower, but maybe I'm going about
this the wrong way? More importantly, what about data integrity?

Check out the validation helpers at:

http://rails.rubyonrails.com/classes/ActiveRecord/Validations/ClassMethods.html

“As I started to write the front end, though, I noticed myself rewriting
the same checks in my models so that Rails can validate data before it
hits the database. Otherwise the database throws an error instead of
Rails reporting it through that nice ‘errors’ hash.”

Just in case you are making the same mistake I was: defining the correct
defaults in the database is very important, because Rails will use them.
For example, if you use validates_presence_of :value but in the database
you set a default of 0 on the value column, Rails will initialize :value
with 0 and never tell you when you leave the field blank in the form.
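
A sketch of the trap (Item and the :value column are just example names):

# the migration gives the column a default:
#   t.column :value, :integer, :default => 0

class Item < ActiveRecord::Base
  validates_presence_of :value
end

item = Item.new   # Rails picks up the column default, so item.value == 0
item.valid?       # => true; no "can't be blank" error is ever reported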

Okada.

2005/12/15, Anthony [email protected]:

I use 3 levels of error checking:

. simple stuff is checked with JavaScript before it even gets sent to
the app server
. application server does most error checking
. database does the last data-validation check, assuming the db supports
CHECK constraints (and possibly triggers); see the sketch after this list
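
For that last level, one way to keep a CHECK constraint even when the
schema is managed from migrations (a sketch, assuming PostgreSQL and a
made-up orders table):

class AddOrderChecks < ActiveRecord::Migration
  def self.up
    # raw SQL, since the migration DSL has no helper for CHECK constraints
    execute "ALTER TABLE orders ADD CONSTRAINT quantity_positive CHECK (quantity > 0)"
  end

  def self.down
    execute "ALTER TABLE orders DROP CONSTRAINT quantity_positive"
  end
end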

disadvantages of having only one layer of validation?

. not user-friendly if simple stuff not validated before user sends it
to server
. easier for dirty data to get into database if only one layer of checks

advantages of having only application-level data validation?

. simplicity
. easier to maintain application, at least until the data gets dirty,
then things get complicated
. don’t have to keep 3 separate layers of validation rules in sync

For simple apps, I would just do data validation on the app server.
For enterprise apps that value data integrity, I would do data
validation at each tier if the tier supports it.

Yes, I know about the validations and default values, and I use them
extensively.

My models use ‘validates_format_of’, ‘validates_length_of’, etc., along
with custom filters, and my migration sets the default values:

create_table(:static_pages, :force => true) do |t|
  t.column :pagename, :string, :limit => 31
  t.column :content,  :string, :default => "asdf"
end
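
The model side looks something like this (the format pattern here is
just an illustration, not my real filter):

class StaticPage < ActiveRecord::Base
  validates_presence_of :pagename
  validates_length_of   :pagename, :maximum => 31   # matches the :limit in the migration
  validates_format_of   :pagename, :with => /\A[a-z0-9_-]+\z/  # the real pattern is app-specific
end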

This way I don’t have to do any database setup at all. But my question
is, what are the disadvantages of this style?