Web search and robots

I was just wondering whether built-in support for dynamic robots.txt
files and Google Sitemaps wouldn't be nice.
I think that dynamically generating Google Sitemap files would really
help Typo users get their pages out to the masses, and controlling
the robots.txt file could reduce load and keep unwanted content off the
search engines. For example, there is no need for Google to be constantly
crawling the RSS feeds.
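As a rough sketch of the robots.txt side, the admin section could store a list of disallowed paths and render the file from them. The paths and the method name below are hypothetical examples, not Typo's actual settings:

```ruby
# Sketch: build a robots.txt body from admin-configured paths.
# In Typo these paths would come from the database; the ones shown
# here ("/xml/", "/admin/") are just illustrative guesses.
def robots_txt(disallowed_paths)
  lines = ["User-agent: *"]
  disallowed_paths.each { |path| lines << "Disallow: #{path}" }
  lines.join("\n") + "\n"
end

puts robots_txt(["/xml/", "/admin/"])
```

A controller action could then serve this with a text/plain content type instead of a static file, so changes in the admin UI take effect immediately.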

Having some control over it in the admin section is something I would
love. How about the rest of you?

Robots.txt info:
http://www.robotstxt.org/wc/robots.html
http://www.pageresource.com/zine/robotstxt.htm
Google Sitemaps info:
https://www.google.com/webmasters/sitemaps/docs/en/navigation.html
https://www.google.com/webmasters/sitemaps/docs/en/faq.html
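The sitemap side could be generated the same way, walking the published articles and emitting the XML format described in the docs above. This is only a sketch under assumed data: the article hashes and the sitemaps.org namespace here stand in for whatever Typo's models and Google's then-current schema actually provide:

```ruby
require "cgi"
require "time"

# Sketch: render a sitemap XML document from a list of URL entries.
# In Typo, the entries would be built from Article records; the
# hash keys (:loc, :lastmod) are illustrative, not Typo's API.
def sitemap_xml(urls)
  entries = urls.map do |u|
    "  <url>\n" \
    "    <loc>#{CGI.escapeHTML(u[:loc])}</loc>\n" \
    "    <lastmod>#{u[:lastmod].strftime('%Y-%m-%d')}</lastmod>\n" \
    "  </url>"
  end
  "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" \
  "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n" +
    entries.join("\n") + "\n</urlset>\n"
end

puts sitemap_xml([
  { loc: "http://example.com/articles/hello", lastmod: Time.utc(2006, 4, 11) }
])
```

Serving this from a controller action means the sitemap is always current, with no cron job or regeneration step needed.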

Jon Gretar B.
http://www.jongretar.net/

Just for the record, I would totally second and love such a feature.

-Kevin K.


Cheers,
Kevin K.
http://blog.kubasik.net/
