Google Incorporates Site Speed Into PageRank Calculation
lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of the quality of the content is something I would rather not have. Not that Google asked me."
How about bloat? (Score:3, Interesting)
If site A and site B have the same info, then how about weighing whether one has the info spread over 10 pages with 3-4 different ad servers spewing Flash and GIFs and all sorts of JavaScript trickery, and the other doesn't (or at least has less)?
Re:Slashdot (Score:5, Interesting)
One would think that only matters if the Googlebot happens to be indexing your site at that exact moment; one would additionally think they'd revisit to see whether the slowness is structural or not.
If you use Google Webmaster Central you may notice that, while Google's algorithm is smart, it's also rather overestimated in some areas, and it takes plenty of manual tweaking by Google employees to work properly.
Site speed is not calculated solely from the time the Googlebot takes to crawl the page; it's also derived from Google Toolbar installs that have the PageRank feature enabled (that feature phones home with which sites you visit and how fast each page loaded).
Whether Google can detect clusters of frequent accesses, such as a slashdotting, is entirely open to question: most Slashdot users probably don't run the Google Toolbar with PageRank enabled, but for the few who do, the site will simply appear slow in general.
Additionally, if a site targets a demographic with worse latency (low-income users, areas stuck on dial-up and so on), then, again, that site will appear to be slower, when really its visitors just have slower internet in general.
Additionally yet, the reason a site is slow is often somewhere along the route, nowhere near either the visitor's ISP or the site's server, and it doesn't affect all users either. So if you have bad luck, or your content attracts users who happen to be routed through the bad path, you'll lose PageRank.
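A toolbar-style speed beacon, as described above, could be sketched like this. Everything here is invented for illustration: the payload field names, the JSON format, and the idea of a single phone-home report; Google's actual toolbar protocol is not public.

```python
import json
import time


def toolbar_beacon(fetch_page):
    """Time a simulated page load and build a toolbar-style report.

    `fetch_page` stands in for the browser actually loading a page and
    returning its URL. The payload schema is a guess, not Google's.
    """
    start = time.monotonic()
    url = fetch_page()
    elapsed_ms = int((time.monotonic() - start) * 1000)
    return json.dumps({"url": url, "load_time_ms": elapsed_ms})


def slow_page():
    time.sleep(0.05)  # stand-in for a ~50 ms page load
    return "http://example.com/"


report = json.loads(toolbar_beacon(slow_page))
```

Note that nothing stops a hostile client from posting made-up numbers to such an endpoint, which is exactly the gaming concern raised elsewhere in this thread.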
Re:That sounds reasonable.... so far (Score:5, Interesting)
Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.
It sounds to me like a push completely against net neutrality. The websites that are served up faster get a higher rank; the websites that are throttled get a lower rank. Net neutrality isn't about how website owners manage their own bandwidth for their visitors; they've always been free to do what they want. Net neutrality is about ISPs and other backbone entities of the internet throttling traffic. If an ISP sat between Google and two webpages, it could directly influence their ranks by throttling the site it wants knocked down and prioritizing the site it wants ranked higher.
Re:Slowbotted (Score:1, Interesting)
Since Google is using the Google Toolbar to measure responsiveness, wouldn't it make more sense, and be more efficient, to reverse-engineer the protocol Google is using and submit faked response times to the Google server?
That is assuming that you have control over a bot-net that you are using to perform the DDoS attacks.
Isn't Google missing the point? (Score:5, Interesting)
For example, I did a website for a lady who sells garden and landscaping lighting locally to where I'm from. Her business is not one that needs a large web presence; she basically just wants her catalog displayed, but she does want people to find her with Google. I've done all the things like making sure the title is accurate and the headers are relevant, etc. However, it seems to me that much of it is futile. Unless she runs the type of business that focuses on inviting people to add content to her site (in other words, an internet/web business), the sad truth is that she will basically get ignored by Google.
"Google Site Speed" is not the host provider speed (Score:2, Interesting)
Google Site Speed is how well you have kept to the protocol specs to make sure the size of your website is as small as possible, so that as it travels through the pipes it does so as efficiently as possible. It is NOT a rank of how fast your host provider delivers to the end user.
Badly implemented pages will get a lower rank (and so they should, IMHO). Google is trying to make sure everyone makes clean websites.
I am sure Google also benefits by saving power/processing costs if the number of kilobytes to parse/store per web page is smaller.
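The payload-size point is easy to demonstrate: the same markup costs far fewer bytes on the wire once it is gzip-compressed, and repetitive, bloated markup compresses especially hard. The sample page below is made up for the demo.

```python
import gzip

# A deliberately bloated page: whitespace-heavy, highly repetitive markup.
page = ("<html>\n<body>\n"
        + "    <div class='item'>Garden light</div>\n" * 200
        + "</body>\n</html>\n")

raw_bytes = len(page.encode("utf-8"))                  # size as authored
wire_bytes = len(gzip.compress(page.encode("utf-8")))  # size on the wire
```

On this sample, the gzipped payload is a small fraction of the raw size, which is why serving compressed responses (and trimming markup in the first place) matters for any speed metric based on bytes transferred.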
Re:As a user of Google (Score:4, Interesting)
Google says this affects a tiny fraction of sites, and let's face it, it will be irrelevant when comparing two text-only sites. But with the growing web-app trend, yes, speed does make a difference. If you want to use an online Office replacement, then, just as with offline software, you don't want to sit there waiting for things to happen. Online games, like offline ones, can succeed or fail based on their speed.
The guys at Google aren't dumb. I doubt we'll see pages punished for loading in 100 ms rather than 50 ms. However, if you take 2 minutes compared to 50 ms, then you may well be punished if your content isn't the best, and quite frankly I'm happy with that.
Whatever happened to "remove this result" (Score:4, Interesting)
Goodbye home server! (Score:3, Interesting)
So those of us who host our own web servers from our DSL lines will be in that one percent.
Thanks, Google. You really f*ed up my day.
It took me a while to get my site to #1 based on the regional relevance of its content. Now I might have to get dedicated hosting, just so that customers who would previously have found my page right away won't go to the other website, which is out of their range but has been around for 5 years longer than mine.
Off-site ad serving delay now a big issue (Score:3, Interesting)
Does Google's measurement include delays from off-site ad servers? That's a big issue. For many sites (including Slashdot), the off-site ad servers are the big bottleneck.
Web site programmers will now have to avoid ad code that delays page loading until the ads come in. I expect to see ad code that measures the response time of the ad server and, if the ad server doesn't respond fast enough, drops the ad and reports the failure to a monitoring site.
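That kind of defensive ad code might look like the sketch below. A real page would do this client-side in JavaScript; the fetchers and the time budget here are invented stand-ins to show the pattern of giving the ad server a deadline and dropping it on a miss.

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout


def load_ad(fetch_ad, budget_s=0.2):
    """Fetch an ad, but never let a slow ad server block the page.

    Returns (ad_markup, failed): markup is None and failed is True when
    the ad server misses the time budget.
    """
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fetch_ad)
        try:
            return future.result(timeout=budget_s), False
        except FutureTimeout:
            return None, True  # caller would report this to a monitoring site


def fast_ad():
    return "<img src='ad.gif'>"


def slow_ad():
    time.sleep(1.0)  # stand-in for an overloaded ad server
    return "<img src='ad.gif'>"
```

The page renders either way; only the ad (and the ad revenue for that impression) is sacrificed when the deadline is missed.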
Then we'll see sites gaming the system. If Google is using information from their "Google Toolbar" to affect search results, we'll probably see attempts to pump fake data into the Google Toolbar server. Google is going to have to learn the lesson well known to developers of networked games - "never trust the client".
google analytics (Score:2, Interesting)
Funny, 'cause whenever I have a site loading slowly, I can usually look at the address bar and see it stuck on Google Analytics. Well, until I blocked it and greatly sped up the web, that is.