Consider

N_X = 20 # size of pseudo array

class CreatePeople < ActiveRecord::Migration
  def self.up
    create_table :people do |t|
      t.string :first_name, :null => false
      # ...
      t.integer :x00
      t.integer :x01
      t.integer :x02
      # ...
      t.integer :x18
      t.integer :x19
      t.timestamps
    end
  end
end

What is the best way of generating the "t.integer :xYY" lines (where YY
runs from 00 through 19, inclusive) so that the count depends on N_X?
On 22.11.2009 15:06, Ralph S. wrote:
> [...]
> What is the best way of generating the "t.integer :xYY" lines (where
> YY runs from 00 through 19, inclusive) so that the count depends on
> N_X?
What stops you from doing

N_X = 20 # size of pseudo array

class CreatePeople < ActiveRecord::Migration
  def self.up
    create_table :people do |t|
      t.string :first_name, :null => false
      # ...
      N_X.times do |i|
        t.integer(("x%02d" % i).to_sym)
      end
      t.timestamps
    end
  end
end

? Btw, having columns with indexes in their names is considered bad
schema design: you could either normalize them into another table and
join, or give them more meaningful names than "x02" if they are used as
placeholders. My 0.02EUR.
Kind regards
robert
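Robert's ("x%02d" % i).to_sym trick can be tried outside Rails entirely;
here is a minimal plain-Ruby sketch of the column names it generates:

```ruby
N_X = 20 # size of pseudo array

# "%02d" zero-pads the integer to two digits (0 becomes "00"),
# and .to_sym turns the resulting string into a symbol.
columns = (0...N_X).map { |i| ("x%02d" % i).to_sym }

p columns.first(3) # => [:x00, :x01, :x02]
p columns.last     # => :x19
```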
RK> On 22.11.2009 15:06, Ralph S. wrote:
RK> [...]
RK> What stops you from doing
RK> [...]
RK> N_X.times do |i|
RK>   t.integer(("x%02d" % i).to_sym)
RK> end
RK> [...]
RK> ? Btw, having columns with indexes in their name is considered bad
RK> schema design because you either could normalize it into another
RK> table and join or provide more meaningful names instead of "x02" if
RK> it is used as a placeholder. My 0.02EUR.
RK> Kind regards
First of all, I’m a noob. I should have said that.
Thus your solution
(1) makes sense, and
(2) explains a lot about Ruby.
For both, I thank you.
In terms of normalizing the database, I ran some timing tests and
normalizing it slows me down a couple of orders of magnitude. I’m
willing to live with the lack of purity for the sake of performance.
On 22.11.2009 17:05, Marnen Laibow-Koser wrote:
>> normalizing it slows me down a couple of orders of magnitude. I'm
>> willing to live with the lack of purity for the sake of performance.
>
> A couple orders of magnitude? This suggests to me that your
> normalization and/or queries are poorly done. What is chewing up so
> much time, and what does the normalized schema look like? It should be
> possible to find a solution that performs reasonably and doesn't have
> 20 repeating fields.
I second that. Ralph, any information about the slowness? Is it in the
database or outside?
Kind regards
robert
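A normalized version of the schema under discussion could look like the
following migration sketch (the measurements table and its column names
are my assumptions for illustration, not anything Ralph posted):

```ruby
# Instead of 20 repeating x00..x19 columns on people, store each value
# as its own row in a child table and join it back to people.
class CreateMeasurements < ActiveRecord::Migration
  def self.up
    create_table :measurements do |t|
      t.integer :person_id, :null => false # foreign key to people
      t.integer :position,  :null => false # which of the N_X slots (0..19)
      t.integer :value
      t.timestamps
    end
    # at most one row per (person, slot)
    add_index :measurements, [:person_id, :position], :unique => true
  end

  def self.down
    drop_table :measurements
  end
end
```

With has_many :measurements on Person, all 20 values for a person come
back in a single joined query instead of 20 repeated columns.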
Ralph S. wrote:
[...]
> In terms of normalizing the database, I ran some timing tests and
> normalizing it slows me down a couple of orders of magnitude. I'm
> willing to live with the lack of purity for the sake of performance.
A couple orders of magnitude? This suggests to me that your
normalization and/or queries are poorly done. What is chewing up so
much time, and what does the normalized schema look like? It should be
possible to find a solution that performs reasonably and doesn’t have 20
repeating fields.
Best,
Marnen Laibow-Koser
http://www.marnen.org
[email protected]
Wednesday, November 25, 2009, 10:35:06 AM, you wrote:
RK> On 22.11.2009 17:05, Marnen Laibow-Koser wrote:
RK> [...]
RK> I second that. Ralph, any information about the slowness? Is it in
RK> the database or outside?
I think it is, in part, coming from the logging that is being done.
Instead of one create … one can have 21.
Ralph S. wrote:
[...]
> I think it is, in part, coming from the logging that is being done.
> Instead of one create … one can have 21.
Then, as I suspected, you are using the DB incorrectly. To create 21
records, use 1 query, not 21. You can use ar-extensions or something
similar to do this with ActiveRecord.
Rule of thumb: queries don’t belong inside loops. If you find yourself
putting a query in a loop, you have a design problem.
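To illustrate the "1 query, not 21" point: a single multi-row INSERT
carries all 21 records in one round trip. The sketch below only builds
the SQL string in plain Ruby (the measurements table and columns are
assumed names); in a real app a library such as ar-extensions would
generate and execute something like this for you:

```ruby
# 21 (position, value) pairs to be written in one round trip.
rows = (0...21).map { |i| [i, i * 10] }

# One VALUES clause per row, all inside a single INSERT statement.
values_sql = rows.map { |pos, val| "(#{pos}, #{val})" }.join(", ")
sql = "INSERT INTO measurements (position, value) VALUES #{values_sql}"

puts sql # one statement instead of 21 separate INSERTs
```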
Best,
Marnen Laibow-Koser
http://www.marnen.org
[email protected]