Forum: Ruby on Rails sessions and search engine spiders

Announcement (2017-05-07): This forum is now read-only, since I unfortunately no longer have the time to support and maintain it. Please see other Rails- and Ruby-related community platforms.
Michael E. (Guest)
on 2006-01-07 00:10
(Received via mailing list)
Hello -

In a default Rails app, if cookies are turned off, Rails creates a
session file in /tmp for every single request. If my site gets
crawled by Googlebot, won't this create thousands of sessions that
don't need to exist?

I read the wiki page on session management, but there doesn't seem to
be a middle ground of "don't create a session unless I tell you to,"
such as you have with servlets.

Any ideas on how to accomplish this would be appreciated.
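One approach from that era: Rails 1.x controllers supported `session :off`, which could take an `:if` proc evaluated per request, so sessions could be skipped when the user agent looks like a crawler. A minimal, runnable sketch of the bot-detection predicate is below; the `BOT_PATTERN` list is illustrative only, not an exhaustive set of spider user agents.

```ruby
# Illustrative pattern — real crawler user agents vary; extend as needed.
BOT_PATTERN = /Googlebot|Slurp|msnbot|crawler|spider/i

# Returns true when the given User-Agent string looks like a crawler.
def bot_request?(user_agent)
  !!(user_agent =~ BOT_PATTERN)
end

# In a Rails 1.x controller this predicate could drive session :off, e.g.:
#
#   class ApplicationController < ActionController::Base
#     session :off,
#       :if => Proc.new { |request| request.env["HTTP_USER_AGENT"] =~ BOT_PATTERN }
#   end

puts bot_request?("Mozilla/5.0 (compatible; Googlebot/2.1)")
puts bot_request?("Mozilla/5.0 (Windows NT 10.0) Firefox/50.0")
```

With sessions off for matching requests, no /tmp session file is written for crawler traffic, while normal browsers keep their cookie-backed sessions.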

This topic is locked and can not be replied to.