Google Incorporates Site Speed Into PageRank Calculation
lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of the quality of the content is something I would rather not have. Not that Google asked me."
Re:Slashdot (Score:4, Informative)
Re:so, spammers just need servers... (Score:3, Informative)
What? I always scroll down to the bottom, way past all the crap about paying, and find it waaaay down below. Try scrolling further next time, or just use Google's cached page.
Re:so, spammers just need servers... (Score:5, Informative)
Half the people I heard from said that if they scroll all the way to the bottom they can read the answers for free, and the other half say this doesn't work. This confused me for the longest time until I finally figured out the answer.
Expertsexchange allows you to scroll down to the bottom to get a free answer the first time you visit their page, then gives your browser a cookie saying that you have gotten your free answer, and won't show you any more. So if you want to ensure that you can always scroll to the bottom, you simply have to block cookies from them and you are good to go.
Measured via the toolbar (Score:2, Informative)
From a slightly older article [blogspot.com] on the same blog:
So this isn't quite as susceptible to people playing games with Googlebot as it might appear.
Re:Slashdot (Score:5, Informative)
You can associate your site with a Google account and override their heuristic.
Re:Slashdot (Score:2, Informative)
Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.
Except they (according to the summary; didn't RTFA) aren't going by page load times, they're going by server response. That means pages that are poorly written and take forever to load (or are connected to slow ad servers) won't get downranked for that; only ones with slow server response times will. The Slashdot effect will still potentially impact it, but the speed of a user's internet connection has little impact on the speed of a ping.
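The distinction is easy to see if you measure the two things separately. A rough sketch (using a toy local server that responds instantly but streams its body slowly, standing in for a fast server serving a heavy page): time-to-first-byte mostly reflects the server, while total download time also depends on page weight and connection speed.

```python
import socket
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Responds immediately, then streams the body slowly: a fast
    server serving a slow-loading page."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "10")
        self.end_headers()
        self.wfile.write(b"12345")      # first bytes arrive right away
        time.sleep(0.3)                 # ...but the rest trickles out
        self.wfile.write(b"67890")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def measure(host, port):
    """Return (time_to_first_byte, total_download_time) in seconds."""
    start = time.monotonic()
    with socket.create_connection((host, port)) as s:
        s.sendall(f"GET / HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        s.recv(1)                       # first byte of the response
        ttfb = time.monotonic() - start
        while s.recv(4096):             # drain until the server closes
            pass
    return ttfb, time.monotonic() - start

server = HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb, total = measure("127.0.0.1", server.server_port)
server.shutdown()
print(f"TTFB: {ttfb:.3f}s  total: {total:.3f}s")
```

Here TTFB is tiny while the total time includes the slow body transfer, which is exactly the gap between "slow server" and "slow page" that the ranking signal would or wouldn't see.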
Re:so, spammers just need servers... (Score:4, Informative)
But if you copy the URL and paste it on a browser, you don't get to see the answer at the bottom.
Personally what annoys me more than expertsexchange are the journal sites. For those I don't get the answer at the bottom or anywhere, even though it shows up in the Google search results.
Used to be Google policy that a site is not allowed to show different content to Google from what it shows to users - they smacked BMW Germany down for that. But now I see lots of sites getting away with that, and no, those journal sites don't get fooled by the user agent thing.
Perhaps they pay Google to be allowed to do it.
Re:Slashdot (Score:5, Informative)
Yes, but if only there was some open standard protocol he could use to tell Google how often to index which parts of his site, which Google implemented [sitemaps.org]...
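The whole protocol is just an XML file you put on your server and submit through Webmaster Tools. A minimal sketch (URL and values are made up; `changefreq` and `priority` are hints to the crawler, not commands):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-04-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```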
Re:Slashdot (Score:3, Informative)
What kind of shared host gets on you for 8GB/mo? Almost all of the shared hosting providers I've seen provide at least 50GB/mo.
Maybe you should use a better hosting provider, figure out how to add the correct code to robots.txt, or use Google's webmaster tools (which are quite easy to use).
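For the robots.txt part, keeping crawlers out of the heavy or useless parts of the site is a few lines (paths here are hypothetical; note that Googlebot ignores Crawl-delay, so Google's crawl rate has to be set in Webmaster Tools instead):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /search

# Honored by some crawlers (e.g. Bing), but NOT by Googlebot.
Crawl-delay: 10
```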
And, FYI, pretty much any web language can set headers. You don't need to have root access to do it. You don't need to modify Apache configs.
Most blogging, CMS, or forum software already handles this correctly. If you don't want to learn about handling caching or other HTTP issues, you might want to consider building on top of a framework that handles those issues for you.
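To be concrete about the headers point: any web language can emit caching headers from application code. A minimal sketch as a stdlib-only Python WSGI app (the header values are just illustrative; no Apache config or root access involved):

```python
import time
from wsgiref.handlers import format_date_time

def app(environ, start_response):
    """Minimal WSGI app that sets its own caching headers,
    straight from application code."""
    body = b"<html><body>hello</body></html>"
    start_response("200 OK", [
        ("Content-Type", "text/html"),
        ("Content-Length", str(len(body))),
        ("Cache-Control", "public, max-age=3600"),     # cacheable for an hour
        ("Last-Modified", format_date_time(time.time())),
    ])
    return [body]

# To actually serve it (also stdlib only):
#   from wsgiref.simple_server import make_server
#   make_server("127.0.0.1", 8000, app).serve_forever()
```

The same two-line idea works in PHP (`header()`), Perl, Python, whatever — which is why "I can't set headers on shared hosting" is almost never true.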
Local search (Score:3, Informative)
Unless a business is trying to serve the entire country, where they rank in general topic searches doesn't matter that much. From a user's perspective, a broad general search IS best served by the largest and fastest sites.
For small local businesses, you've got to tune for the locality, which includes a whole 'nother set of Google tools on top of the standard SEO stuff like title, content, meta tags, etc. http://www.google.com/local/add/ [google.com]
She would also be well-served by using online tech to develop repeat customers in other ways, like an e-mail newsletter, or engaging on sites like Yelp.