Faster datastructure for lookups wanted

Hi all,

maybe somebody can recommend the right data structure; any other
advice would also be a big help.

My code spends most of its execution time doing lookups from a
hashtable with about 1M keys. The keys are strings and the values are
arrays of integers, most of them only of length 1.

I do not care how long the construction of the datastructure takes,
but the lookup should be as fast as possible.

xs.each { |x|
  if found = hash[x]
    # do sth.
  end
}

Thanks a lot!
-Armin

On Fri, Sep 08, 2006 at 06:55:12AM +0900, [email protected] wrote:

> maybe somebody can recommend me the right datastructure or
> any other advice would be a big help.
>
> My code spends most of its execution time doing lookups from
> a hashtable with about 1M keys. The keys are strings and the values
> are arrays of integers. Most of the time only of length 1.
>
> I do not care how long the construction of the datastructure takes,
> but the lookup should be as fast as possible.

It hardly gets faster than a Hash in Ruby.
You can also try a trie (Patricia tree if you have long keys and care
about space), or Judy arrays (http://rjudy.sourceforge.net), but I
wouldn’t expect major performance gains (Judy::JudySL, being more
specialized than Hash, might have a chance)…
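A quick Benchmark sketch (key names and sizes made up for illustration)
can establish the plain-Hash baseline before reaching for alternatives:

```ruby
require 'benchmark'

# Build a hash shaped like the original poster's: string keys mapping
# to (mostly one-element) integer arrays. 100k keys here to keep the
# sketch quick; scale up to 1M for a realistic baseline.
h = {}
100_000.times { |i| h["key#{i}"] = [i] }

# Simulate many lookups, hitting existing keys.
lookups = Array.new(500_000) { |i| "key#{i % 100_000}" }

Benchmark.bm(6) do |bm|
  bm.report('Hash') { lookups.each { |k| h[k] } }
end
```

Whatever structure you try next can be dropped into the same harness
for a direct comparison.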

> My code spends most of its execution time doing lookups from
> a hashtable with about 1M keys. The keys are strings and the values
> are arrays of integers. Most of the time only of length 1.
>
> I do not care how long the construction of the datastructure takes,
> but the lookup should be as fast as possible.

Would putting it into a database be a possible solution? I may be
wrong, but I’m pretty sure RDBMSes should be pretty quick to parse
through and get data. SQLite, PostgreSQL, or MySQL would be options.

Also, I’m thinking there may be a better method for finding the right
elements, though I’m not exactly sure what you’re doing.

M.T.

Mauricio F. wrote:

> On Fri, Sep 08, 2006 at 06:55:12AM +0900, [email protected] wrote:
>
>> maybe somebody can recommend me the right datastructure or
>> any other advice would be a big help.
>>
>> My code spends most of its execution time doing lookups from
>> a hashtable with about 1M keys. The keys are strings and the values
>> are arrays of integers. Most of the time only of length 1.
>>
>> I do not care how long the construction of the datastructure takes,
>> but the lookup should be as fast as possible.
>
> It hardly gets faster than a Hash in Ruby.
> You can also try a trie (Patricia tree if you have long keys and care
> about space),

A Trie optimised by cutting off unambiguous traversal would
be a definite possibility.

> or Judy arrays (http://rjudy.sourceforge.net), but I wouldn't expect
> major performance gains (Judy::JudySL, being more specialized than
> Hash, might have a chance)…
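For reference, a plain (uncompressed) trie is easy to sketch in Ruby.
This is not the Patricia variant described above (which would
additionally collapse unambiguous chains of nodes), and the class name
is made up; it just shows the basic structure:

```ruby
# Minimal uncompressed trie: each node is a Hash from one character to
# the next node; the stored value sits under the :value key.
class SimpleTrie
  def initialize
    @root = {}
  end

  def []=(key, val)
    node = @root
    key.each_char { |c| node = (node[c] ||= {}) }
    node[:value] = val
  end

  def [](key)
    node = @root
    key.each_char do |c|
      node = node[c]
      return nil unless node
    end
    node[:value]
  end
end
```

Lookup cost is proportional to key length rather than to a single hash
computation over the whole string, which is why it rarely beats a Hash
unless the key set has lots of shared prefixes.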

[email protected] wrote:

> but the lookup should be as fast as possible.

Any way to segment the data? IOW are 1×10^6 keys really needed in the
same hash? A multi-tiered hash might save a lot of lookup time if the
data can be broken into groups smaller than 1×10^6.

Example: if the strings are consistent about content, you could break
the search into 26 (or 52) sections by simply hashing the first letter
of the key strings, then hashing on the remainder of each string in
the second tier.
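A rough sketch of that two-tier idea (class name hypothetical; assumes
non-empty string keys):

```ruby
# Two-tier hash: the first character of the key selects a sub-hash,
# and the remainder of the key indexes into that sub-hash.
class TieredHash
  def initialize
    @tiers = {}
  end

  def [](key)
    tier = @tiers[key[0, 1]]
    tier[key[1..-1]] if tier
  end

  def []=(key, val)
    (@tiers[key[0, 1]] ||= {})[key[1..-1]] = val
  end
end

th = TieredHash.new
th["apple"] = [1]
th["apricot"] = [2]
th["apple"]   # => [1]
```

Whether this actually beats one big Hash would need measuring: each
lookup now does two hash probes plus two string slices instead of one
probe over the full key.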

On Fri, Sep 08, 2006 at 08:28:33AM +0900, Eero S. wrote:
} Mauricio F. wrote:
} > On Fri, Sep 08, 2006 at 06:55:12AM +0900, [email protected] wrote:
} >> maybe somebody can recommend me the right datastructure or
} >> any other advice would be a big help.
} >>
} >> My code spends most of its execution time doing lookups from
} >> a hashtable with about 1M keys. The keys are strings and the values
} >> are arrays of integers. Most of the time only of length 1.
} >>
} >> I do not care how long the construction of the datastructure takes,
} >> but the lookup should be as fast as possible.
} >
} > It hardly gets faster than a Hash in Ruby.
} > You can also try a trie (Patricia tree if you have long keys and
} > care about space),
}
} A Trie optimised by cutting off unambiguous traversal would
} be a definite possibility.

There is a trie gem that implements a Patricia Trie.

http://gemjack.com/gems/trie-0.0.1/classes/Trie.html

Of course, a Patricia Trie assumes no a priori knowledge of your string
inputs. If you know something about your keys, you may be able to do
better with a hash of hashes (to however many layers is appropriate),
splitting as appropriate for your key space. For example, if you know
that your keys are IPv4 addresses that come in dotted quad notation
(e.g. 127.0.0.1), you could do better with (note: untested):

class SplittableHash
  def initialize(split)
    @split = split
    @root = {}
  end

  def [](key)
    key.split(@split).inject(@root) { |h, k| h[k] if h }
  end

  def []=(key, val)
    path = key.split(@split)
    key = path.pop
    path.inject(@root) { |h, k| h[k] ||= {} }[key] = val
  end
end

–Greg

[email protected] wrote:

> but the lookup should be as fast as possible.
>
> xs.each{|x|
>   if found = hash[x]
>     #do sth.
>   end
> }

As others have said already, a Hash is pretty much the fastest for the
general case. What do your string keys look like? Maybe it is worth
trying symbols instead of strings?
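The symbol idea is easy to measure with the stdlib Benchmark module
(key names below are made up; symbols are interned, so hashing one
does not have to rescan its characters):

```ruby
require 'benchmark'

# Build parallel hashes with string keys and with symbol keys.
n = 100_000
str_hash = {}
sym_hash = {}
n.times do |i|
  str_hash["key#{i}"] = [i]
  sym_hash[:"key#{i}"] = [i]
end

str_keys = Array.new(n) { |i| "key#{i}" }
sym_keys = Array.new(n) { |i| :"key#{i}" }

Benchmark.bm(8) do |bm|
  bm.report('String') { 10.times { str_keys.each { |k| str_hash[k] } } }
  bm.report('Symbol') { 10.times { sym_keys.each { |k| sym_hash[k] } } }
end
```

Note that this only pays off if the lookup keys can be produced as
symbols in the first place; converting each string to a symbol at
lookup time would cost more than it saves.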

If you reveal a bit more about your application, we might be able to
come up with more suggestions.

Kind regards

robert