A validation question (trying to make a censorship function)

I’m trying to make a Rails forum application, and of course I want to
censor any bad words people may post in my text models. That said, I
wondered if, in Rails 1.2.5, there is any validation method that
could do this easily.

I’m thinking of using “validates_format_of :text, :with => #some
regex, :message => ‘bad word’”, but I don’t remember if Ruby has a
negation regex (that is, one that fails the match when it finds the
pattern defined in the regexp).
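(As it happens, Ruby regexps do support negative lookahead, so a
“must not contain” pattern can be written directly; a minimal sketch,
with a made-up banned phrase:)

```ruby
# (?!...) is a negative lookahead: the pattern at \A only matches if
# the rest of the string does NOT contain "bad word" anywhere.
# /m lets . match newlines too, so multi-line text is covered.
pattern = /\A(?!.*bad word)/m

pattern.match('perfectly clean')      # matches (no banned phrase)
pattern.match('this has a bad word')  # nil (banned phrase found)
```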

Also, related to this: is it possible to make a shared module for
models? Several models are going to share the same censor
validation, so I wondered if I could have them all include the same
module or object (preferably a module, since it’s only a function,
and maybe a constant array).
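(For what it’s worth, the shared-module idea could look something like
this; a sketch only, with illustrative names. The including model is
assumed to provide ActiveRecord’s errors collection and an attribute
reader for the field:)

```ruby
# Hypothetical shared validation module; models would `include` it and
# call validate_no_bad_words(:some_field) from their validate method.
module CensorValidation
  BAD_WORDS = [/bad word/i]  # made-up list for illustration

  def validate_no_bad_words(field)
    value = send(field).to_s  # to_s guards against a nil field
    if BAD_WORDS.any? { |pattern| value =~ pattern }
      errors.add(field, 'contains disallowed words')
    end
  end
end
```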

On Dec 30, 2007 3:42 PM, Taro [email protected] wrote:

I’m thinking of using “validates_format_of :text, :with => #some
regex, :message => ‘bad word’”, but I don’t remember if ruby has a
negation regex (that is, when it finds a pattern defined in regexp, it
returns false).

Something like the following in your model:

protected

def validate
  # to_s guards against a nil field
  errors.add(:some_field, 'contains disallowed words') if
    some_field.to_s.match(/bad word/)
end

Break out the condition above into a separate private method and
iterate over a list of bad words or something…
(I’m not sure how expensive the regex operation is, maybe it’s not a
great idea to iterate over a long list. OTOH, maybe you don’t expect
validations to run that often?)
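(Broken out as Martin suggests, the check might look like this; a
sketch in plain Ruby, with a made-up helper name and word list — in
the model it would be a private method:)

```ruby
# One place for the word list, one predicate to check a field against it.
BAD_WORDS = [/bad word/i, /worse word/i]

def contains_bad_words?(text)
  BAD_WORDS.any? { |pattern| text.to_s =~ pattern }
end

contains_bad_words?('this has a BAD WORD')  # => true
contains_bad_words?('perfectly fine')       # => false
```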

  • Martin

Hey,
there’s a ton of different ways to implement this.

Here’s one idea that (imho) results in reasonably easy-to-read
validation declarations, keeps the logic in one place, and doesn’t
require you to delve into creating mixins or custom validation
methods:

lib/bad_words.rb

class BadWords
  DISALLOWED = [
    /bunny/i,
    /carrot/i,
    /lettuce/i
  ]

  def self.include?(text)
    DISALLOWED.any? { |disallowed| disallowed.match(text) }
  end
end

app/models/post.rb

class Post < ActiveRecord::Base
  validates_exclusion_of :message, :in => BadWords,
    :message => 'contains bad words'
end

Most examples for validates_exclusion_of (and validates_inclusion_of)
only show the usage where you supply an array for the :in parameter.

However, you can actually supply any object for :in, as long as the
object responds to include?(), because that’s the method that
validates_exclusion_of calls when performing the validation.

That’s what I’ve done in the above example. The BadWords class
responds to include?() by running through its array of disallowed
regular expressions and returning true if a match is found.
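(For illustration, the class answers include?() like this — BadWords
repeated from above so the snippet stands alone:)

```ruby
# Same BadWords class as in lib/bad_words.rb above.
class BadWords
  DISALLOWED = [/bunny/i, /carrot/i, /lettuce/i]

  def self.include?(text)
    DISALLOWED.any? { |disallowed| disallowed.match(text) }
  end
end

BadWords.include?('I grow carrots')   # => true  (matches /carrot/i)
BadWords.include?('nothing banned')   # => false
```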

HTH,
Trevor

On 12/30/07, Taro [email protected] wrote:

censor any bad words people may post in my text models. That said, I
wondered if, in Rails 1.2.5, there were any validations method that
could do this easily.

I’m thinking of using “validates_format_of :text, :with => #some
regex, :message => ‘bad word’”, but I don’t remember if ruby has a
negation regex (that is, when it finds a pattern defined in regexp, it
returns false).

Trevor S.
http://somethinglearned.com

Thanks a lot, Trevor and Martin. I like Trevor’s solution for its
versatility, so I’ll use that.

By the way, I noticed you can call objects defined from the lib
directory. Does this apply to ALL ruby programs in rails, or only the
models?

Hey,
rails will autoload from the lib directory. So as long as you follow
Rails’ file_name.rb to FileName naming convention, you can refer to
lib classes/modules directly from anywhere in your Rails app without
first having to require the file.
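(The naming convention in plain Ruby terms — a sketch; Rails’ own
conversion lives in its Inflector, this toy version just shows the
mapping:)

```ruby
# Rails maps file names to constant names: the underscored file name,
# minus .rb, camelizes to the constant it expects the file to define.
def expected_constant(file_name)
  file_name.sub(/\.rb\z/, '').split('_').map { |part| part.capitalize }.join
end

expected_constant('bad_words.rb')  # => "BadWords"
```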

Trevor

On 12/31/07, Taro [email protected] wrote:

By the way, I noticed you can call objects defined from the lib
directory. Does this apply to ALL ruby programs in rails, or only the
models?

Trevor S.
http://somethinglearned.com