Sessions and search engine spiders

Hello -

In a default Rails app, if cookies are turned off, Rails creates a
session file in /tmp for every single request. If my site gets
crawled by Googlebot, won’t this create thousands of sessions that
don’t need to exist?
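Incidentally, a quick way to confirm the pile-up (assuming the default
PStore session store, which writes files named ruby_sess.* to /tmp):

  # Count the session files the default PStore store has written to /tmp.
  puts Dir.glob('/tmp/ruby_sess.*').size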

I read the wiki page on session management, but there doesn’t seem to
be a middle ground of “don’t create a session unless I tell you to”,
like you have with servlets.
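The closest thing I can find is the session :off class method with an
:if condition keyed off the User-Agent header. A rough sketch (untested,
and the bot regexp below is just a guess on my part):

  class ApplicationController < ActionController::Base
    # Skip session creation entirely when the request looks like a crawler.
    session :off, :if => Proc.new { |request|
      request.env['HTTP_USER_AGENT'].to_s =~ /Googlebot|Slurp|msnbot|spider|crawler/i
    }
  end

That still relies on a user-agent blacklist rather than true lazy
session creation, though.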

Any ideas on how to accomplish this would be appreciated.

Cheers
Mike
