Forum: Ruby Comparing Active Record VS Datamapper...?

Announcement (2017-05-07): www.ruby-forum.com is now read-only since I unfortunately do not have the time to support and maintain the forum any more. Please see rubyonrails.org/community and ruby-lang.org/en/community for other Rails- and Ruby-related community platforms.
Softmind Technology (softmind)
on 2008-02-02 06:54
Hello,

Comparing Active Record VS DataMapper...?

Can anyone provide more details on this?

Any links to blogs comparing the above two?

I would also welcome your suggestions on this.

Thanks
Ilan Berci (iberci)
on 2008-02-03 00:56
Softmind Technology wrote:
> Hello,
>
> Comparing Active Record VS DataMapper...?
>
> Can any one provide more details on this..
>
> Any Links of Blogs comparing the above two.
>
> I shall also welcome your suggestions on this.
>
> Thanks

I was curious about this myself. I have used AR for years now, and I was interested in DM's promise of no migrations and optional lazy loading of fields. I have no problem expressing my fields in the models (I actually like it), and I have no problem being committed to MySQL. If DM can update your schema automagically when it sees an incompatibility/update in your model, then I won't sleep for a week due to the magic of it all.

I used to think that migrations were the best thing since sliced bread, but because AR attempts to cater to every major vendor, I usually have to go in and hand-tune my migrations anyway (bigint and foreign key constraints, which I realize many feel are unnecessary when working with RoR). I also used to follow Og, but after some playing around, I came to the conclusion that updating the schema with it was far more difficult than it was with AR.

Merb right now is a big carrot for me, and it has direct support (through gem downloads) for Haml AND DataMapper, so I think something truly great (or at least very exciting) is in the mix! Because I am a coward, however, I still use AR at work, as I am very comfortable with it and I finally got used to a lot of its black magic through method_missing craziness.

Merb/DataMapper is a cinch to download and set up, so you should see it for yourself..
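The "black magic through method_missing" that Ilan mentions can be sketched in plain Ruby. The snippet below is a hypothetical, gem-free illustration of how dynamic finders like find_by_name can resolve at runtime; it is not ActiveRecord's actual implementation, just the general trick.

```ruby
# A toy illustration of how ActiveRecord-style dynamic finders
# (find_by_name, find_by_email, ...) can be built on method_missing.
# Records live in a plain in-memory array; this is NOT ActiveRecord code.
class ToyModel
  RECORDS = []  # stand-in for a database table

  def self.create(attrs)
    RECORDS << attrs
    attrs
  end

  def self.method_missing(name, *args)
    if name.to_s =~ /\Afind_by_(\w+)\z/
      attr = $1.to_sym
      RECORDS.find { |r| r[attr] == args.first }
    else
      super
    end
  end

  def self.respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("find_by_") || super
  end
end

ToyModel.create(:name => "Ilan", :email => "coder68@example.com")
puts ToyModel.find_by_name("Ilan")[:email]  # => coder68@example.com
```

Every undefined `find_by_*` call falls through to method_missing, which parses the attribute name out of the method name, which is roughly the convenience (and the "magic") being described.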
ilan
Steve Ross (cwd)
on 2008-02-03 01:50
(Received via mailing list)
On 2/2/08 3:56 PM, "Ilan Berci" <coder68@yahoo.com> wrote:

>>
> I used to think that migrations were the best thing since sliced bread
> coward however, I still use AR at work as I am very comfortable with it
> and I finally got used to a lot of it's black magic through
> method_missing craziness.
>
> Merb/DataMapper is a cinch to dload and set up so you should see it for
> yourself..
> ilan
>
>
I'm no DM expert, but from my perspective the benefits are:

- Really modular: add-ons are installed as gems (AR is kind of moving this way)
- Thread safe. 'Nuff said.
- Works in Rails, Merb, and stand-alone Ruby programs

Now, the magic migrations (a la Og) have been a subject of some debate because they toast the table when they happen. As I understand it, adding a country field to an address table would drop the table and recreate it. This can be a particularly annoying side effect when dealing with tables that have related tables, or worse, habtm.

There are claims that DM is faster than AR, but I have yet to demonstrate that on any of my sample data. They seem pretty much on a par.

Download it and stick it in one of your play apps to see what happens.
James Britt (Guest)
on 2008-02-03 05:39
(Received via mailing list)
s.ross wrote:

> Now, the magic migrations (a la Og) have been a subject of some debate because
> they toast the table when they happen. As I understand it, adding a country
> field to an address table would drop the table and recreate it. This can be
> a particularly annoying side effect when dealing with tables that have
> related tables, or worse, habtm.

Do you know this to be true of Og, or just of DataMapper?

I've been quite happy using Og for the last few years, though I may not
have yet done anything especially tricky that would bork it.


--
James Britt

www.ruby-doc.org             - Ruby Help & Documentation
www.risingtidesoftware.com   - Wicked Cool Coding
www.rubystuff.com            - The Ruby Store for Ruby Stuff
www.jamesbritt.com           - Playing with Better Toys
Softmind Technology (softmind)
on 2008-02-03 08:16
>
> Do you know this to be true of Og, or just of DataMapper?
>
> I've been quite happy using Og for the last few years, though I may not
> have yet done anything especially tricky that would bork it.
---------------------------------------------------------------

Can someone please tell me what this thing called "Og" is?

I have never heard of it before.

Thanks
Jeremy McAnally (Guest)
on 2008-02-03 08:23
(Received via mailing list)
It's another ORM layer like DataMapper/ActiveRecord.  It's part of the
Nitro project, which you can find at http://www.nitroproject.org/

--Jeremy

On Feb 3, 2008 2:16 AM, Softmind Technology
<softmindtechnology@gmail.com> wrote:
> I have never heard of this before....?
>
> Thanks
>
> --
> Posted via http://www.ruby-forum.com/.
>
>



--
http://www.jeremymcanally.com/

My books:
Ruby in Practice
http://www.manning.com/mcanally/

My free Ruby e-book
http://www.humblelittlerubybook.com/

My blogs:
http://www.mrneighborly.com/
http://www.rubyinpractice.com/
Steve Ross (cwd)
on 2008-02-03 08:46
(Received via mailing list)
On Feb 2, 2008, at 8:39 PM, James Britt wrote:

>> related tables, or worse, habtm.
>
> Do you know this to be true of Og, or just of DataMapper?
>
> I've been quite happy using Og for the last few years, though I may
> not have yet done anything especially tricky that would bork it.

Sorry. Big clarification. DM does the thing with destroying the data,
and the discussion occurred on the DM mailing list under the title
"Crazy Migrations." The similarity to Og is in the notion of defining
properties in your model class and having them appear as columns in
your database.

From my perspective, writing DDL is the most certain way to get what you want. But it's a mental context switch that not everyone wants to make regularly. AR migrations move from database-specific DDL to a more database-agnostic DSL, which is cool. Cooler is the promise of up- and down-migrations. But it turns out there are some migrations that can't really be reversed, such as removing columns. You can add the columns back in, but the data can't easily be reconstructed. However, AR migrations let you mix and match database-specific stuff and also do initialization or whatever other munging you like. DM and Og have been without migrations as far as I know. Some have used these tools and not missed the migrations. I would find the lack of migrations a deal-breaker at this point.
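The up/down asymmetry Steve describes can be sketched without AR at all. Below is a hypothetical, minimal "migration" over an in-memory table (an array of hashes, standing in for database rows); it shows why removing a column is not truly reversible: the down step can restore the column, but not the dropped data.

```ruby
# Minimal sketch of an up/down migration pair against an in-memory "table".
# Hypothetical code, not ActiveRecord: it only illustrates why removing a
# column can't really be reversed -- the column comes back, the data doesn't.
class RemoveCountryFromAddresses
  def self.up(rows)
    rows.each { |row| row.delete(:country) }
  end

  def self.down(rows)
    # We can re-add the column, but only with a placeholder value:
    # the original data was lost in #up.
    rows.each { |row| row[:country] = nil }
  end
end

addresses = [{ :street => "1 Main St", :country => "US" }]
RemoveCountryFromAddresses.up(addresses)
RemoveCountryFromAddresses.down(addresses)
addresses.first[:country]  # => nil; the original "US" is gone
```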

Note: There are migrations being worked on in DM-land.

--steve
S. Potter (mbbx6spp)
on 2008-02-09 19:13
> Steve wrote:
> From my perspective, writing DDL is the most certain way to get what
> you want. But it's a mental context switch that not everyone wants to
> make regularly.
I have never had a problem getting AR migrations to produce the DDL I need.  For database-specific options you can pass the :options parameter to the relevant migration method (e.g. create_table, etc.).  Standard parameters like :limit, :null, :primary, etc. are also available and, in my experience, aren't used enough by Rails developers.  In addition, I have internally developed AR migration extensions where I can simplify things in a more convenient way just by extending the relevant AR adapters, reducing human error considerably and thus producing a more consistent DB schema:
---
# pastie: http://pastie.caboo.se/private/2jrjnodsjsxxgi4stbjmg
  class << self # yes I use this Ruby idiom, perhaps a bit too much :)
    def up
      create_lookup_table :countries
      # OR
      add_foreign_key :users, :country_id #, :table => :not_obvious_table_name
    end

    def down
      drop_lookup_table :countries
      # OR
      remove_foreign_key :users, :country_id #, :table => :not_obvious_table_name
    end
  end
---
In the above example, "create_lookup_table" creates a specific type of table that usually has :id, :code (which is unique), and :name, where :code has a unique index declared on it.  This is just a small sample of the extensions I have developed internally for specific needs, but the point is that customizing AR migrations is fairly painless for a Ruby developer, which yields a much lower incidence of human error and schema inconsistencies than writing DDL directly, and streamlines the process of creating consistent schemas with little mindset switching.
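Susan's create_lookup_table extension isn't shown in full, but the mechanism is ordinary Ruby module inclusion. Here is a hypothetical, self-contained sketch: FakeAdapter stands in for an AR connection adapter and merely records DDL calls, while the mixin builds the lookup-table convention out of the adapter's primitive methods. The helper names mirror her example; the adapter and column layout are invented for illustration.

```ruby
# Hypothetical sketch of how migration helpers like create_lookup_table
# can be layered onto an adapter by extending it with a module.
# FakeAdapter records DDL calls instead of talking to a real database;
# in real code the module would be mixed into the AR connection adapters.
module LookupTableMigrations
  def create_lookup_table(name)
    # Convention: a lookup table has :id, a unique :code, and a :name.
    create_table(name, [:id, :code, :name])
    add_index(name, :code, :unique => true)
  end

  def drop_lookup_table(name)
    drop_table(name)
  end
end

class FakeAdapter
  include LookupTableMigrations
  attr_reader :ddl

  def initialize; @ddl = []; end
  def create_table(name, columns); @ddl << [:create_table, name, columns]; end
  def add_index(name, column, opts = {}); @ddl << [:add_index, name, column, opts]; end
  def drop_table(name); @ddl << [:drop_table, name]; end
end

adapter = FakeAdapter.new
adapter.create_lookup_table(:countries)
puts adapter.ddl.inspect
```

The design point is that each helper encodes a team convention once, so every migration that uses it produces the same shape of table.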

> Steve also wrote:
> But it turns out there are some migrations that can't really be
> reversed, such as removing columns. You can add the columns back
> in but the data can't easily be reconstructed.
Yes, but this is why AR migrations allow you to declare in the down
migration that it is an irreversible migration by raising an
IrreversibleMigration exception, if you feel that is necessary for
project sanity.

Now, on the general topic of DataMapper vs. ActiveRecord.  First let us consider the original idea behind both.  Both of these Ruby libraries are based on Fowler's enterprise architecture patterns of the same names. However, the Ruby DataMapper library doesn't appear to be written in exactly the same vein as Fowler's pattern.  It promotes the use of the data-mapped objects *as* Domain Objects, which differs from Fowler's recommendation to use a Data Mapper to separate the in-memory domain object from its database representation.  ActiveRecord (the Ruby library) is consistent with Fowler's definition of Active Record (the PoEAA pattern), except that Ruby metaprogramming is able to hide so much of the database lookup code that the pattern's documented disadvantage (of not being useful for complex domain object logic) doesn't apply as much.  I feel this difference is much overlooked and needs stating.
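The distinction Susan draws can be made concrete in a few lines of plain Ruby. In Fowler's Data Mapper pattern the domain object knows nothing about persistence; a separate mapper moves it in and out of storage (a Hash here, standing in for a database). This is a hypothetical sketch of the pattern itself, not of either Ruby library.

```ruby
# Fowler-style Data Mapper in miniature: the domain object (Person)
# has no persistence code; PersonMapper owns all storage concerns.
# The "database" is just a Hash, for illustration only.
class Person
  attr_accessor :id, :name
  def initialize(name)
    @name = name
  end
end

class PersonMapper
  def initialize
    @store = {}     # stand-in for a people table
    @next_id = 0
  end

  def insert(person)
    person.id = (@next_id += 1)
    @store[person.id] = { :name => person.name }
    person
  end

  def find(id)
    row = @store[id] or return nil
    person = Person.new(row[:name])
    person.id = id
    person
  end
end

mapper = PersonMapper.new
id = mapper.insert(Person.new("Fowler")).id
puts mapper.find(id).name  # => Fowler
```

In the Active Record pattern, by contrast, insert/find would live on Person itself, which is exactly the collapse of mapper into domain object that the Ruby DataMapper library also performs.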

I have only used AR (Ruby) on production projects and have only played with DM (Ruby) on personal projects, so I am not able to give an in-depth comparison of real-world usage or performance.  However, in theory, a library that follows the Active Record pattern is more beneficial when tables directly correlate to objects (i.e. an isomorphic schema).  When mapping table attributes into many object instances, you may find libraries using the Data Mapper pattern less complex and more expressive.  However, the primary advantage of Fowler's Data Mapper pattern requires that you separate the Data Mapper class from the Domain Object class.  Of course, this is complicated by the fact that in Ruby orthogonal features can be added using metaprogramming, which wasn't considered when Fowler wrote the PoEAA book, as he was heavily into non-metaprogramming-capable static languages at the time.

It appears that if DataMapper (the Ruby library) is able to hide enough database logic in more complex business logic scenarios, then the DM library might be the more beneficial choice when working with a legacy database schema, where you cannot create primarily isomorphic relationships between class attributes and table columns, whereas AR would be a slam dunk and simpler to use in isomorphic schema scenarios.

HTH,
Susan
---
@blog: http://snakesgemscoffee.susanpotter.net
ara howard (Guest)
on 2008-02-09 21:48
(Received via mailing list)
On Feb 9, 2008, at 11:13 AM, S. Potter wrote:

> I have never had a problem getting AR migrations to produce the DDL I
> need.



try defining a column with a default of 'now()' or anything else that uses db functions.  note that you cannot use Time.now from inside ruby because, unlike internal db functions, it will not ensure a consistent now across all fields inside a transaction.

cheers

a @ http://codeforpeople.com/
Sharon Rosner (ciconia)
on 2008-02-09 22:03
(Received via mailing list)
> try defining a column with a default of 'now()' or anything which uses  
> db functions.  note that you cannot use Time.now from inside ruby  
> because, unlike internal db functions, this will not ensure a  
> consistent now across all fields inside a transaction.

You can easily do that in Sequel:

  DB.create_table :posts do
    ...
    timestamp :stamp, :default => :now[]
  end

Sorry, couldn't resist :-)
sharon
ara howard (Guest)
on 2008-02-10 16:56
(Received via mailing list)
On Feb 9, 2008, at 2:02 PM, Sharon Rosner wrote:

>    timestamp :stamp, :default => :now[]

can you also do

   timestamp :stamp, :default => literal('now()')

or something to that effect?

i recall something like that...

a @ http://drawohara.com/
S. Potter (mbbx6spp)
on 2008-02-12 18:13
howard wrote:
>> try defining a column with a default of 'now()' or anything which uses
>> db functions.  note that you cannot use Time.now from inside ruby
>> because, unlike internal db functions, this will not ensure a
>> consistent now across all fields inside a transaction.

Actually, for Ruby programmers, as opposed to people who only know APIs in Ruby, extending the APIs for specific use cases is very painless.  To get people started on solving this (as I do not have permission to post the whole code in public yet): you would override the relevant connection adapters (e.g. ActiveRecord::ConnectionAdapters::PostgreSQLAdapter, etc.) and redefine the #column method to accept (in our case) a :default_sql option, which wouldn't escape the SQL string passed to it.  It is good practice to override all the standard connection adapters even if you do not use them, because you never know who might start using SQLite3 on the team in development or test environments, especially if you roll these AR extensions out on an IT-wide or cross-project basis.

If I were you, I would look at the sexy migrations plugin code for more pointers if you don't want to look inside the AR source.

> You can easily do that in Sequel:
>   DB.create_table :posts do
>     ...
>     timestamp :stamp, :default => :now[]
>   end
>
> Sorry, couldn't resist :-)
> sharon

The problem with Sequel, as far as I understand it, is that you have to be disciplined enough to create the Domain Object layer yourself to abstract away more complex model logic, and it appears to take a lot more code to hide the Sequel code in non-trivial scenarios.  Otherwise you get a very stripped-down, DB-oriented (rather than object-oriented) database adaptation API for your application.

If you do not create a Domain Object layer, you end up with all the logic (controller AND model logic) in the controller layer, which is only maintainable when you have very trivial model logic.  This doesn't work in richer model logic environments.

Both AR and DM can cope well in rich Domain Object scenarios, though which of the two is the better fit depends on other variables.  So comparing Sequel to AR and/or DM is like comparing apples and oranges.  If you like oranges better, that is fine, but let's not lose sight of the fact that Sequel is not really in side-by-side competition with these two libraries.

On a positive note for Sequel, I have heard that for non-complex mappings it is lighter-weight, but I have not seen any real benchmark numbers to support this.  However, intuitively it seems plausible, as it doesn't hide very much of the database, which in most of my development work is not a benefit.

If you are looking for an ORM library in Ruby that has everything, you will be disappointed.  As you can see, there are pros and cons for each library, and the choice will depend on the following (depending on how your team weighs each one):
* Maturity
* Schema (e.g. legacy non-isomorphic schema, etc.)
* Migration needs (i.e. for limited support DM might be fine; for data migration support AR or Sequel might fit the bill)
* Design (e.g. are you looking for a library that provides you with a Domain Object to hide non-trivial model logic inside? If so, DM or AR is for you; if not, Sequel might work)
* Skillset (e.g. if you have more AR knowledge, then you might be better off going that route, etc.)
* Performance (although I haven't seen any benchmarks out there on this one yet?)

There is no silver bullet, and anyone who tells you there is either has a very simple situation that doesn't test all these variables (and possibly more variables I haven't mentioned?) or is biased! :)

Thanks,
Susan
PS: I accept that most of my experience is with AR and that it is what I know best, but I also realize there are AR limitations.  However, I choose to solve those limitations myself for project/architecture-specific needs.
Sharon Rosner (ciconia)
on 2008-02-12 20:17
(Received via mailing list)
> work in richer model logic environments.
Not true. Sequel includes a model class implementation based on the ActiveRecord pattern, with features such as composite primary keys, relationships, caching using memcached, validations, and callbacks, among others.

sharon
ara howard (Guest)
on 2008-02-12 23:03
(Received via mailing list)
On Feb 12, 2008, at 10:13 AM, S. Potter wrote:

> wouldn't
> escape the SQL string passed to it.  It is good practice to override
> all
> standard connection adapters even if you do not use them, because you
> never know who might start using SQLite3 on the team in development or
> test environments.  Especially if you roll these AR extensions out
> on an
> IT-wide or cross-project basis.


yes of course, i've done it several times, here's the whole code:

   http://drawohara.com/post/6677354

this is *way* beyond the abilities of most programmers, however, ruby
or not, and is a definite limitation of AR, which binds the hands of
people who know sql at even a moderate level.

regards

a @ http://drawohara.com/
Sam Smoot (Guest)
on 2008-02-20 02:56
(Received via mailing list)
On Feb 12, 11:13 am, "S. Potter" <mbbx6...@tautology.net> wrote:
> Actually for Ruby programmers as opposed to people that only know APIs
> in Ruby, extending the APIs for specific use-cases is very painless.

Ouch Susan. :-)

One of the reasons I started DataMapper was that working with AR to make multi-lingual extensions, and handling UTF-8 to UTF-16LE conversion transparently (for MSSQL support), in 0.9~1.2 was far, far from trivial.

Not that smart people didn't eventually figure out how to do these
things, but to say it's painless makes me suspect you're either just a
whole lot smarter than me, very intimate with the internal workings of
AR, or you're talking about more recent efforts as AR has matured.

DM has much less magic. It's much easier to tweak (IMO). But then
again, I am intimately familiar with it, so it's hard to keep an
objective eye in this regard... so maybe I should say "I think/hope
it's much easier to tweak".

Anyway, overall I would say your evaluation of DM is generally fair, if not entirely accurate architecture-wise. DM right now is actually a pretty classical Data Mapper according to the designs of PoEAA. Some of the names are a bit off, but DM doesn't deviate significantly more or less than many of the other O/RMs implementing the pattern out there. The extensions made to the models are there mostly for the user's convenience. The exceptions are the #key method and the #database_context method. Every Data Mapper is going to have a parallel for the latter (#database_context), but I concede the former (#key) is not as common. I was just being lazy, I suppose. ;-)

On the other hand, the architecture you describe for DM is almost spot-on for where we'd like to take it in the future. I've accepted that some of the unique challenges in working with Ruby are leading me to design decisions I might not otherwise have made... Basically, I'm trying to explore ways to make DM "flatter"/faster without sacrificing design too much. One of the ways we plan to accomplish that is by making the models themselves the center of the mapping "star" for DM. It's not the sort of PORO purity I imagined before, but it's a reasonable compromise, I think, and there's really no downside if I'm pragmatic about it.

I've got to take exception to the "slam dunk"/simpler comment regarding AR, though. You've got to define the attributes somehow: in DM you do that in the model; in AR you write a migration or do it in your database. Either way, you've got the same tasks. In the end, I think DM's simplified finders win the day for me. It's just painful to go back to Thing.find(:first, :conditions => ["name = ?", 'something']) when I could be writing Thing.first(:name => 'something').
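Hash-style finders like DM's are easy to sketch in plain Ruby, which shows why they read so much better than the positional :conditions form. This toy class is hypothetical and gem-free; it only mimics the calling convention, not either library's internals.

```ruby
# Toy illustration of hash-conditions finders (DM style) over an
# in-memory collection. Not DataMapper or ActiveRecord code.
class Thing
  THINGS = []  # stand-in for a things table

  def self.create(attrs)
    THINGS << attrs
    attrs
  end

  # DM-style call: Thing.first(:name => 'something')
  def self.first(conditions = {})
    THINGS.find do |record|
      conditions.all? { |attr, value| record[attr] == value }
    end
  end
end

Thing.create(:name => "something", :size => 42)
Thing.create(:name => "other",     :size => 7)
puts Thing.first(:name => "something")[:size]  # => 42
```

Because the conditions are just a hash of attribute/value pairs, adding a second condition is another key rather than more SQL string interpolation.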

But I'm biased. :-D

I do think AR wins big in the setup department just because it provides pure-Ruby drivers for many databases, so you can be up and running ASAP. We hope to improve on our currently poor platform support in the next few weeks with the release of DataObjects 0.9.0. (We're about half-way there right now.)