Google Incorporates Site Speed Into PageRank Calculation

lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by an endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of content quality is something I would rather not have. Not that Google asked me."
This discussion has been archived. No new comments can be posted.

  • How about bloat? (Score:3, Interesting)

    by GrumblyStuff ( 870046 ) on Sunday April 11, 2010 @12:32PM (#31808662)

    If site A and site B have the same info, then how about weighing whether one has that info spread over 10 pages, with 3-4 different ad servers spewing Flash, GIFs, and all sorts of JavaScript trickery, while the other doesn't (or at least has less of it)?

  • Re:Slashdot (Score:5, Interesting)

    by Stan Vassilev ( 939229 ) on Sunday April 11, 2010 @12:39PM (#31808748)

    One would think that only matters if the Google bot happens to be indexing your site at that exact moment, and that they'd revisit to see whether the slowness is structural or not?

    If you use Google Webmaster Central, you may notice that while Google's algorithm is smart, it's also rather overestimated in some areas and involves plenty of manual tweaking by Google employees to work properly.

    Site speed is not calculated solely from the time the Google bot takes to crawl the page; it's also calculated from Google toolbars that have the PageRank feature enabled (that feature phones home with which sites you visit and how fast each page loaded; see the sketch at the end of this comment).

    Whether Google can detect clusters of frequent accesses, such as a "slashdotting," is very much in question: most Slashdot users may not have the Google Toolbar with PageRank enabled, but to the *few* users who do, the site will simply appear slow in general.

    Additionally, if a site targets a demographic with worse latency (low-income users, areas on dial-up, and so on), then again that site will appear slower, when in fact its visitors simply have slower internet in general.

    And often the reason a site is slow lies somewhere along the route, close to neither the visitor's ISP nor the site's server, and it doesn't affect all users either. So if you have bad luck, or your content happens to attract users who are often routed along that bad path, you'll lose PageRank.
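
    For illustration, a toolbar-style client could measure and report page load times roughly like this (purely a sketch: the endpoint and payload here are made up, and the real toolbar protocol isn't public):

        // Hypothetical toolbar-style timing beacon -- illustrative only.
        const REPORT_URL = "https://example.com/speed-report"; // placeholder, not a real endpoint

        window.addEventListener("load", () => {
          // Defer one tick so loadEventEnd is populated after the load event finishes.
          setTimeout(() => {
            const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
            const loadMs = nav ? nav.loadEventEnd - nav.startTime : performance.now();
            // Fire-and-forget report of which page was visited and how long it took.
            navigator.sendBeacon(
              REPORT_URL,
              JSON.stringify({ url: location.href, loadMs: Math.round(loadMs) })
            );
          }, 0);
        });

    Any measurement collected this way inherits exactly the sampling biases described above: it only reflects the users who happen to be running the reporting client.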

  • by rockNme2349 ( 1414329 ) on Sunday April 11, 2010 @01:07PM (#31808970)

    Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.

    It sounds to me like a push completely against net neutrality. The websites that are served up faster get a higher rank; the websites that are throttled get a lower rank. Net neutrality isn't about how website owners manage their bandwidth for their visitors; they've always been free to do what they want. Net neutrality is about the ISPs and other backbone entities of the internet throttling traffic. If there were an ISP between Google and two webpages, it could directly influence their ranks by throttling the site it wants knocked down and prioritizing the site it wants ranked higher.

  • Re:Slowbotted (Score:1, Interesting)

    by Anonymous Coward on Sunday April 11, 2010 @01:26PM (#31809134)

    Since Google is using the Google Toolbar to measure responsiveness, wouldn't it make more sense, and be more efficient, to reverse engineer the protocol Google is using and submit faked response times to the Google server?

    That is, assuming you have control over a botnet that you're using to perform the DDoS attacks.

  • by fluffernutter ( 1411889 ) on Sunday April 11, 2010 @01:27PM (#31809150)
    I occasionally put websites together for small businesses, and it seems increasingly hard to get these kinds of websites known. Google seems to favor indexing websites with lots of content, and now with speedier responses, which will completely slant its rankings toward large companies with huge resources.

    For example, I did a website for a lady who sells garden and landscaping lighting local to where I'm from. Her business is not one that needs a large website; she basically just wants her catalog to display, but she does want people to find her with Google. I've done all the things like making sure the title is accurate and the headers are relevant, etc. However, it seems to me that much of it is futile. Unless she runs the type of business that focuses on inviting people to add content to her site (in other words, an internet/web business), the sad truth is that she will basically get ignored by Google.
  • by Anonymous Coward on Sunday April 11, 2010 @02:26PM (#31809658)

    Google Site Speed is how well you have kept to the protocol specs to make sure the size of your website is as small as possible, so that as it travels through the pipes, it does so as efficiently as possible. It is NOT a rank of how fast your host provider delivers to the end user.

    Badly implemented pages will get a lower rank. (...and so they should IMHO)

    Google is trying to make sure everyone makes clean websites. I am sure Google also benefits by saving power/processing costs if the number of kilobytes to parse/store per web page is smaller.
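
    As a rough way to check your own page's weight (a sketch only: it measures just the HTML document, not images, scripts, or whatever metric Google actually uses, and example.com is a placeholder):

        // Fetch a page and report the size of the HTML document in kilobytes.
        async function pageWeight(url: string): Promise<number> {
          const res = await fetch(url);
          const body = await res.arrayBuffer();
          return body.byteLength;
        }

        pageWeight("https://example.com/").then(bytes =>
          console.log(`Document weight: ${(bytes / 1024).toFixed(1)} KiB`)
        );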

  • by thetoadwarrior ( 1268702 ) on Sunday April 11, 2010 @02:30PM (#31809700) Homepage
    Good thing the quality of content is still king. However, if you have two sites that both rank at 100% and one is significantly faster, it comes up first. I can't see how that is a bad thing.

    Google says this affects a tiny fraction of sites, and let's face it, it will be irrelevant when comparing two text-only sites. But with the growing web-app trend, yes, speed does make a difference. If you want to use an online Office replacement, just as with offline software, you don't want to sit there waiting for things to happen. Online games, like offline ones, can succeed or fail based on their speed.

    The guys at Google aren't dumb. I doubt we'll see pages punished for loading in 100 ms rather than 50 ms. However, if you take 2 minutes compared to 50 ms, then you may be punished if your content isn't the best, and quite frankly I'm happy with that.
  • by beakerMeep ( 716990 ) on Sunday April 11, 2010 @02:32PM (#31809714)
    There used to be a feature to "ban" a result (like Experts Exchange), but they removed it in favor of only being able to "star" results you like. I have to say this seemed like the single best feature they had ever added to search results; it was very useful to be able to identify (for myself) who was gaming the results. But apparently Google thinks I'm better off with the safety of pretty little stars.
  • Goodbye home server! (Score:3, Interesting)

    by ipquickly ( 1562169 ) on Sunday April 11, 2010 @03:43PM (#31810422) Homepage

    So those of us who host our own web servers from our DSL lines will be in that one percent.

    Thanks, Google. You really f*ed up my day.

    It took me a while to have my site at #1 based on the regional relevance of its content.

    Now I might have to get dedicated hosting, just so that customers who would have previously found my page right away won't go to the other website, which is out of their range but has been around for 5 years longer than mine.

  • by Animats ( 122034 ) on Sunday April 11, 2010 @04:38PM (#31810954) Homepage

    Does Google's measurement include delays from off-site ad servers? That's a big issue. For many sites (including Slashdot), the off-site ad servers are the big bottleneck.

    Web site programmers will now have to avoid ad code that delays page loading until the ads come in. I expect to see ad code that measures the response time of the ad server and, if the ad server doesn't respond fast enough, drops the ad and reports the failure to a monitoring site (something like the sketch at the end of this comment).

    Then we'll see sites gaming the system. If Google is using information from their "Google Toolbar" to affect search results, we'll probably see attempts to pump fake data into the Google Toolbar server. Google is going to have to learn the lesson well known to developers of networked games - "never trust the client".
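
    A sketch of that kind of defensive ad loading (the ad URL and monitoring endpoint here are placeholders, not any real ad network's API):

        // Load an ad script asynchronously, time it, and give up if it blows the budget.
        function loadAdWithBudget(adSrc: string, container: HTMLElement, budgetMs = 2000) {
          const start = performance.now();
          const script = document.createElement("script");
          script.src = adSrc;
          script.async = true; // never block HTML parsing on the ad

          const timer = setTimeout(() => {
            script.onload = null; // stop waiting for the ad
            script.remove();      // pull its element from the page
            // Report the slow ad server to a (hypothetical) monitoring endpoint.
            navigator.sendBeacon("https://example.com/ad-timeouts",
              JSON.stringify({ adSrc, budgetMs }));
          }, budgetMs);

          script.onload = () => {
            clearTimeout(timer);
            console.log(`Ad loaded in ${Math.round(performance.now() - start)} ms`);
          };

          container.appendChild(script);
        }

    The same caveat applies to whatever the monitoring site collects, of course: data coming from the client can be faked.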

  • google analytics (Score:2, Interesting)

    by emkyooess ( 1551693 ) on Sunday April 11, 2010 @08:04PM (#31812422)

    Funny, 'cause whenever I have a site loading slowly, I can usually look at the address bar and see it stuck on Google Analytics. Well, until I blocked it and greatly sped up the web, that is.
