How to make Google Cache permalinks?

When I create a profile on LinkedIn, the profile content is actually
stored in a database. When I load my public profile page, I see the
content, which is surely being pulled from the database. Yet this
public page is somehow cached by Google Search. For example, when I
enter my name in Google, my LinkedIn profile appears in the results.
How can I do this in Rails?

I am working for a small social startup, and we want to create public
profiles (like LinkedIn profiles) for NGOs that are returned by
Google Search.

The permalink is in the format

http://websitename.org/ngos/<>

Any advice on how it can be done?
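One common way to get readable permalinks like /ngos/red-cross-delhi is to derive a slug from the NGO's name and use it in place of the numeric id. Here is a minimal plain-Ruby sketch of slug generation (the `slugify` helper and the sample name are made up for illustration; in Rails you would typically use `String#parameterize` and override `to_param` on the model):

```ruby
# Turn a display name into a URL-safe slug, roughly what
# ActiveSupport's String#parameterize does.
def slugify(name)
  name.downcase
      .gsub(/[^a-z0-9]+/, "-")  # collapse runs of non-alphanumerics into "-"
      .gsub(/\A-+|-+\z/, "")    # trim leading/trailing dashes
end

slug = slugify("Red Cross, Delhi!")
puts slug  # => red-cross-delhi

# In a Rails model, something like this makes url_for build /ngos/<slug>:
#   def to_param
#     slug
#   end
```

With that in place, each public profile gets a stable, human-readable URL, which is also what Google will index.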

If you’re asking how to get this info into Google:
http://www.google.com/support/webmasters/bin/answer.py?answer=35291

Typically you’re going to need links on a page to enable crawling of
the content, or you’re going to need links back to your site from
other sites.

I’m not sure how LinkedIn does this.

On 7 April 2011 18:02, UA [email protected] wrote:

When I create a profile on LinkedIn, the profile content is actually
stored in a database. When I load my public profile page, I see the
content, which is surely being pulled from the database. Yet this
public page is somehow cached by Google Search. For example, when I
enter my name in Google, my LinkedIn profile appears in the results.
How can I do this in Rails?

There’s nothing special going on here.

When you view the page in your web browser, the server-side software
constructs an HTML document (using information from the database) and
sends it to your browser. That HTML document is complete and makes
sense by itself; it has no reference or connection to the database
that the information originally came from. It is just like a static
HTML file.

So when Googlebot visits the web page, it gets the same, normal HTML
document, and it caches that.
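The point above can be sketched with Ruby's stdlib ERB templating. The `profile` hash and the template are invented stand-ins for a database row and a Rails view, but the flow is the same: the server renders plain HTML, and that finished HTML is all the browser, or Googlebot, ever sees:

```ruby
require "erb"

# Stand-in for a row fetched from the database.
profile = { name: "Example NGO", mission: "Clean water for all" }

# Stand-in for a view template; Rails does the same thing with ERB views.
template = ERB.new(<<~HTML)
  <h1><%= profile[:name] %></h1>
  <p><%= profile[:mission] %></p>
HTML

# The rendered result is self-contained HTML with no trace of the database.
html = template.result(binding)
puts html
```

Googlebot requests the URL, receives this same rendered HTML, and indexes it exactly as it would a static file.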

Chris

Oh okay… and how can I encourage or motivate Google to cache these
specific pages better and faster?
I know of some ways; in case you know of more, can you please advise?

Ways I know:

  1. Sitemap: generate all of these URLs for the sitemap
  2. Set up Google Analytics
  3. Follow the WAI-ARIA http://www.w3.org/WAI/intro/aria.php standards
  4. Validate markup with http://validator.w3.org/
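For the sitemap item above, here is a minimal sketch of generating a sitemaps.org-style XML file for the NGO profile URLs. The host matches the permalink format mentioned earlier in the thread; the slugs and the weekly change frequency are assumptions for illustration:

```ruby
# Build a sitemap XML string for a list of NGO profile slugs,
# following the sitemaps.org protocol that Google consumes.
def sitemap_xml(slugs, host: "http://websitename.org")
  urls = slugs.map do |slug|
    "  <url>\n" \
    "    <loc>#{host}/ngos/#{slug}</loc>\n" \
    "    <changefreq>weekly</changefreq>\n" \
    "  </url>\n"
  end.join

  "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" \
  "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n" \
  "#{urls}" \
  "</urlset>\n"
end

# Hypothetical slugs; in practice you would pull these from the database.
puts sitemap_xml(%w[red-cross-delhi goonj])
```

You would regenerate this file when profiles change (or serve it from a controller) and submit it via Google Webmaster Tools so new profile pages are discovered without waiting for inbound links.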

Any other things you can think of to motivate Google to cache these
specific pages better?

On Apr 7, 2011, at 7:28 PM, UA wrote:

Any other things you can think of to motivate Google to cache these
specific pages better?

Good old-fashioned PR, often through old media rather than new media,
can really jump-start a Web site’s popularity. Popularity means more
links in from trusted sources like news media sites, which in turn
raises your page rank, and your rise to the top of the search results
can begin.

I have one site that I built and use that appears to be indexed by
Google nearly constantly – to the point that a question I post to the
mailing list it mirrors will turn up as one of the top results less
than half an hour later if I google the same question. But this site
has thousands of virtual pages, has been around for years, is very
focused on a single topic, and is authoritative on that topic. All of
those things build the page rank and keep Google’s curiosity piqued
such that it keeps coming back to check for new content to index.

Walter
