Google Incorporates Site Speed Into PageRank Calculation 202

lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of the quality of the content is something I would rather not have. Not that Google asked me."
This discussion has been archived. No new comments can be posted.

  • Asking (Score:4, Funny)

    by Anonymous Coward on Sunday April 11, 2010 @11:27AM (#31808614)

    Not that Google asked me.

    Well, now that they know you're an influential Slashdot contributor, I'm sure they'll sit up and take notice.

  • Slashdot (Score:5, Insightful)

    by SimonTheSoundMan ( 1012395 ) on Sunday April 11, 2010 @11:28AM (#31808616)

    So when a site gets slashdotted and blown to oblivion, Google also ranks it lower. Awesome!

    • One would think only if the Google Bot happens to be indexing your site at that exact moment; one would additionally think they'll revisit to see if it's structural or not?

      • Re:Slashdot (Score:4, Informative)

        by DNS-and-BIND ( 461968 ) on Sunday April 11, 2010 @11:38AM (#31808730) Homepage
        Google Bot is always indexing your site. I push 8-10Gb of traffic a month (yeah I know it's not a lot, thanks for informing me) and of that, 1Gb is Google. I don't know why Google constantly loads my pages even though they don't change that much, but Google does it.
        • Re:Slashdot (Score:5, Insightful)

          by loufoque ( 1400831 ) on Sunday April 11, 2010 @12:20PM (#31809076)

          Maybe if you correctly used Last-Modified and ETag headers with a 304 Not Modified response, you could avoid a significant part of your bandwidth usage.

          • Re:Slashdot (Score:5, Insightful)

            by DNS-and-BIND ( 461968 ) on Sunday April 11, 2010 @02:22PM (#31810206) Homepage
            So, a guy with an 8GB-per-month website is now required to have a server with root access and his own copy of Apache, which he will then tune like a whistle without even having to read the documentation? Come on, man. Shared host. Non-expert admin who doesn't even like to mess with his Drupal install now that it's working, for fear of breaking it in some subtle yet damaging way. Thanks for shitting all over a novice with that nasty tone, though.
            • So, a guy with an 8GB-per-month website is now required to have a server with root access and his own copy of Apache, which he will then tune like a whistle without even having to read the documentation?

              Apache (or any other half-decent http server) does it by default for static files.

              For dynamic files (PHP, Python, CGI, whatever) obviously it cannot. But then if it is your code, you can easily do it yourself: buffer your page content, CRC it, and use it as an ETag. If it matches the request's If-None-Match header, don't send the body at all; just return 304 Not Modified.
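              The technique being described (a content-derived ETag plus a 304 short-circuit) can be sketched in Python. This is a minimal, framework-free illustration, and the function names are invented for the example; the parent suggests a CRC, but any stable digest of the body works:

```python
import hashlib

def make_etag(body):
    # Digest of the rendered page body. MD5 is used here just to
    # avoid accidental collisions; a CRC would work the same way.
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match=None):
    """Return (status, headers, payload) for one dynamic request."""
    etag = make_etag(body)
    if if_none_match == etag:
        # Client already holds this exact content: send no body at all.
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag}, body
```

              The first request gets a 200 with an ETag; a revisit presenting that tag in If-None-Match gets an empty 304, which is exactly the bandwidth saving being described.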

            • by Nick Ives ( 317 )

              Perhaps the Drupal XML Sitemap [drupal.org] module would help? It's just a module to implement the sitemap.xml protocol mentioned above. It'd involve messing with your setup but has the potential to save you a lot of bandwidth.

            • Re: (Score:3, Informative)

              What kind of shared host gets on you for 8GB/mo? Almost all of the shared hosting providers I've seen allow at least 50GB/mo.

              Maybe you should use a better hosting provider, figure out how to add the correct code to robots.txt, or use Google's webmaster tools (which are quite easy to use).

              And, FYI, pretty much any web language can set headers. You don't need to have root access to do it. You don't need to modify Apache configs.

              Most blogging, CMS, or forum software already handles this correctly. If you don'

            • Re: (Score:3, Insightful)

              by vegiVamp ( 518171 )
              As if you can't do those modifications from the .htaccess or in your code.

              There's no shame in being a novice, only in hiding behind being a novice because you don't want to read the documentation on the things you whine about.
        • Re:Slashdot (Score:5, Informative)

          by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Sunday April 11, 2010 @12:30PM (#31809192)

          You can associate your site with a Google account and override their heuristic.

      • Re:Slashdot (Score:5, Interesting)

        by Stan Vassilev ( 939229 ) on Sunday April 11, 2010 @11:39AM (#31808748)

        One would think only if the Google Bot happens to be indexing your site at that exact moment; one would additionally think they'll revisit to see if it's structural or not?

        If you use Google Webmaster Central you may notice that, while Google's algorithm is smart, it's also rather overrated in some areas, and it takes plenty of manual tweaking by Google employees to work properly.

        Site speed is not calculated solely from the times the Google bot takes to crawl the page; it is also calculated from Google toolbars that have the PageRank feature enabled (that feature phones home with which sites you visit and how fast each page loaded).

        Whether Google can detect clusters of frequent accesses, such as a "slashdotting", is entirely open to question. Most Slashdot users may not run the Google toolbar with PageRank enabled, but for the *few* users that do, the site will just appear slow in general.

        Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.

        Additionally, the reason a site is slow often lies somewhere along the route, close to neither the visitor's ISP nor the site's server, and it doesn't affect all users either. So if you have bad luck, or your content attracts users who happen to be routed through the bad path, you'll lose PageRank.

        • Hmmm, indeed. Maybe they need to rework that feature so that if it passes Y!Slow and similar, it's considered as quick as it's going to get. Otherwise you'll indeed see sites that have no recourse penalised. You make an excellent point.

        • Re: (Score:2, Informative)

          by Sechr Nibw ( 1278786 )

          Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.

          Except they (according to the summary; I didn't RTFA) aren't going by page load times, they're going by server response. That means pages that are poorly written and take forever to load (or are connected to slow ad servers) won't get downranked for it; only ones with slow server times will. The Slashdot effect will still potentially matter, but the speed of a user's internet connection has little impact on the speed of a ping.

  • How about bloat? (Score:3, Interesting)

    by GrumblyStuff ( 870046 ) on Sunday April 11, 2010 @11:32AM (#31808662)

    If site A and site B have the same info, then how about weighing which one has the info spread over 10 pages with 3-4 different adservers spewing flash and gifs and all sorts of javascript trickery and which one doesn't (or has less at least)?

    • Re: (Score:3, Insightful)

      You want Google to discriminate against sites with more advertising? Good luck with that, buddy.
      • Re: (Score:3, Insightful)

        by sjames ( 1099 )

        They would probably be happy to discriminate against the sites with more non-Google advertising. As long as Google keeps their ad-servers speedy, they could even justify it.

    • Google says it's only affecting a tiny fraction of sites anyway, the quality of the content still means more, and it only affects those searching in English on Google.com.

      It's probably still an experiment and may go away but with websites becoming more than plain text I'm glad to see performance taken into account.
  • I suppose an obvious question to ask then is: from where is Google measuring site speed? From a single particular server/location (presumably in the US)? From the 'nearest Google datacentre/server farm' to the site (and if so, how do they determine this)?

    If they are measuring site speed from a single (US) location, that's gotta be hurting the page rankings for any sites hosted outside the US: even if those pages are lightning fast locally, you're always going to have that ~100 ms latency to Europe, or ~150 ms and up further afield.

    • by caffeinemessiah ( 918089 ) on Sunday April 11, 2010 @11:40AM (#31808752) Journal
      Geez, will you at least RTFS?

      Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.

      The main site serves visitors from the US, so measuring speeds from multiple locations around the US is probably the best thing to do. They're presumably measuring speed from all their datacenters; their crawlers are likely distributed across the country (and the world), so recording the average speed over multiple crawls would be a good approximation at the scale of Google and the Web.
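      Purely as an illustration of that aggregation idea (Google hasn't published how it combines crawl timings), boiling many per-crawl fetch times down to one robust number might look like:

```python
def speed_signal(fetch_times_s):
    # Median fetch time across many crawls from different locations.
    # Unlike a plain mean, the median is robust to a single one-off
    # slow fetch (e.g. a transient network problem on one crawl).
    ordered = sorted(fetch_times_s)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2
```

      With this, one 9-second outlier among otherwise fast crawls barely moves the signal, which is the property you'd want before penalising anyone.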

      • It didn't say they are only implementing this on the US site. They said searches in English from Google.com.

        Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate). And depending on how the browser's been set up, those searches generally get pushed to google.com (the main site). The result page may be redirected to the country-specific one, but if

        • Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate)

          Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.

          • Safari searches on google.com irrespective of locale. I filed a bug report about this five years ago - it was marked as a duplicate but has still not been fixed.
            • by jo_ham ( 604554 )

              Is that on OS X or Windows?

              On OS X, typing "google.com" into the url bar and hitting enter redirects you to google.co.uk.

              Typing a search into the search box at the top right sends you to a results page from google.co.uk.

              Perhaps it is based on the International settings in the preferences pane?

              • I take that back. Apparently they have fixed it, I just didn't notice. It's quite recent though, because I was still having to tweak the URL manually a few months ago when I wanted to search for things in the UK.
          • Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.

            My FireFox sends me to google.com. Which is quite annoying because it means I get the US version of Froogle if I use the "Shopping" link.

            But more annoying is the way that Google don't implement some features in every language. E.g. if I want to turn SafeSearch off, I have to switch to English because the Welsh version of the preferences page doesn't have the damned SafeSearch options...

            • My FireFox sends me to google.com.

              That's odd. Do you live in the UK?

              I did my test on Firefox 3.6.3.

              I notice that if I explicitly visit google.com, it automatically redirects me to google.co.uk. Perhaps it's related to one's ISP and location data (or guesswork)?

              • That's odd. Do you live in the UK?

                Yep. FireFox 3.5.8 running under Fedora 12 and my locale is set to "cy_GB.UTF-8".

                I notice that if I explicitly visit google.com, it automatically redirects me to google.co.uk. Perhaps it's related to one's ISP and location data (or guesswork)?

                Maybe. google.com doesn't redirect for me, but I'm on a UK ISP (EntaNet) and the geoip stuff tends to claim I'm in London (which is incorrect, but at least it's still in the UK).

    • They did say that fewer than 1% of queries are affected. That leads me to believe they have a pretty generous threshold (several seconds or more). At that point, 200 ms either way wouldn't make much difference.

      • 200 ms is the difference in round-trip time, not the difference in loading time. An RTT difference of 200 ms can easily add a couple of seconds to the total loading time, because an HTTP session involves the initial TCP connection setup, then the server transmitting something, then the client making the request, then the server sending the reply. The RTT also limits the maximum transfer speed via TCP's rate-control algorithm, so this penalises sites with a lot of content on one page that has to be fetched over many sequential round trips.
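        The parent's arithmetic can be sketched with a deliberately naive model (the request count and timings below are made-up assumptions, and TCP slow-start and pipelining are ignored):

```python
def naive_load_time(rtt_s, server_time_s, sequential_requests):
    # One RTT for the TCP handshake, then roughly one RTT plus server
    # processing time per sequential request/response exchange.
    handshake = rtt_s
    transfers = sequential_requests * (rtt_s + server_time_s)
    return handshake + transfers

# 20 sequential fetches at 30 ms RTT vs. 230 ms RTT: the extra
# 200 ms of RTT compounds into several seconds of total load time.
near = naive_load_time(0.030, 0.05, 20)
far = naive_load_time(0.230, 0.05, 20)
```

        Even this crude model shows the 200 ms RTT gap adding roughly four seconds to a page with many assets, which is the parent's point.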
    • The speed ranking could be entirely location based.

      It's been a talking point for a while among webhosts in Ireland that Google ranks sites more highly if they are based in the same locale/country as the user making requests. In other words (they claim), it's worth paying more to host your site with a local provider than getting a deal with a big overseas web-hosting company. Now, they would say that; but having seen my share of generic search results return local companies again and again, I'm inclined to think there's something to it.

      • Yeah I've noticed this in Australia as well. Makes sense though ... it's more likely that someone in Australia would find more relevant information on a .au site, especially if the subject matter is something that varies between countries. E.g. if you looked up "tax law" or "drivers license" or something...

  • In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed -- that's why we've decided to take site speed into account in our search rankings.

    Site speed is almost 100% subjective. Heck... are the tools Google is using to evaluate speed openly known to everyone with a stake in this?

    I predict trouble ahead.

    • by yakatz ( 1176317 )

      I am not sure if this is how they measure, but Google makes some tools for site owners to check speed and it would be logical that they use something like this.
      http://code.google.com/speed/page-speed/
      (Similar to YSlow)

      Also, Google Webmaster Tools has a Site Performance section (under labs) which may have something to do with this.

  • especially the corporate-run sites have become loaded with shit. flash, javascript ads, javascript code that tries to get all kinds of info from me in order to deliver it to the advertisers, banner ads, includes from numerous other sites, their javascript, this, that... a lot of bloat. some can even clog your browser if they hit at a particular moment.

    it will be good. now they will need to weigh the speed factor. i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.

    • i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.*

      * unless that 3rd party is Google, of course

    • i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.

      This is something that was pretty much fixed by XHTML. However, like most of the other good things introduced in XHTML, HTML 5 has chucked out that fix so we're back to the bad old days.

      The premise is basically: Javascript can use document.write() to insert markup at the script's location. This means that the browser _has_ to execute each piece of javascript at the point it is seen, since any of the JS could use document.write() and the browser would need to parse that in context. This gets worse when the

  • by FuckingNickName ( 1362625 ) on Sunday April 11, 2010 @11:34AM (#31808690) Journal

    ...close to and prioritising Google. Gotcha.

    Really, am I the only one to find Google a fairly poor *find* engine? I mean, for anything which might remotely come close to sounding like it's a product, you've got Wikipedia right at the top, followed by 1000 review/comparison/pricing sites. For a tech question, you have expert-sexchange and 1000 crappy forums with responses from the downright wrong to the gratuitously abusive. I barely use Google (or any search engine much) for their generic WWW search - I'm more likely to be +site: searching a specific newsgroup/support forum/journal/enthusiast site I already know has intelligence. I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.

    • ...close to and prioritising Google. Gotcha.

      That'd do nothing. Speed isn't detected via the Google bots or from the Google servers.

    • by GIL_Dude ( 850471 ) on Sunday April 11, 2010 @11:52AM (#31808868) Homepage
      You hit the nail on the head with that one. I, too, find myself using queries containing "site:msdn.microsoft.com (rest of search)" (for say Windows API information) or using "-" in the searches to suppress certain results. Like you say, otherwise you get basically "a bunch of crap" - mainly from people who have no idea what they are doing. Just today I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder. I tried searching that on Google - hell half of the hits were stupid posts of every file on a system at malware check sites, or bleepingcomputer.com, or other "is this malware" posts. Did I say half? Shoot - I just checked again and I think I meant 85%. The results for most tech searches are indeed useless unless you already know what site you want and include that information in your search. The internet is just filled with crap sites that make it into the indexes and get high relevance.
      • I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder.

        thanks for the tip, i've had similar problems but as i installed the virtual clone drive and a new logitech 9000 series webcam/microphone on the same day i've been contemplating my pc's navel trying to guesstimate which one caused the problem while being too lazy to run it down. the more so since i also installed anydvd and read it installs as a driver.

      • That is true. What Google needs to do is make it so you can block something from coming up in any search. I hate seeing Rose India results in my Java searches. I remove them when I find them but if I search for something else it can come up again.

        That said I do believe the results are getting better. People find ways to trick Google and they get away with it for a bit but Google does catch on. I just checked for a search on Java servlet tutorial and Rose India ranks much lower than they used to and they'
      • You hit the nail on the head with that one. I, too, find myself using queries containing "site:msdn.microsoft.com (rest of search)" (for say Windows API information) or using "-" in the searches to suppress certain results. Like you say, otherwise you get basically "a bunch of crap" - mainly from people who have no idea what they are doing. Just today I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder. I tried searching that on Google - hell half of the hits were stupid posts of every file on a system at malware check sites, or bleepingcomputer.com, or other "is this malware" posts. Did I say half? Shoot - I just checked again and I think I meant 85%. The results for most tech searches are indeed useless unless you already know what site you want and include that information in your search. The internet is just filled with crap sites that make it into the indexes and get high relevance.

        I think Google seems pretty good with providing discussions of particular error messages. Just copy and paste it into the search field and you'll see a few hits on threads where the problem is at least explained and maybe resolved too. Invaluable for any programmer/sys admin who deals regularly with new technologies.

    • by Yvanhoe ( 564877 )
      Google is still the best tool to do a search on the whole internet.
      If you have a better way to find an answer to a tech question than checking the first 40 Google entries, I am more than willing to check it out; but as crappy as one might call it, I have the feeling that it remains the best.
    • I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.

      I think that would be giving too much power and responsibility to a tiny number of people who have no accountability to anyone except google.

      But a better idea might be to allow the user to interact with the search results page, moderating the results and flagging results that weren't helpful.

      • But a better idea might be to allow the user to interact with

        Anonymous users, and teams of users operating under declared banners (acting independently of Google, but using an interface provided by Google). Such teams compete to provide the best filters, where there will inevitably be different segments of the population having different opinions on what's shit and what's not. Add possibility to filter results by one or more teams, with metalists.

        For example:
        - Anti-Wikipedia-clone teams, which identify all clones of Wikipedia;
        - Anti-paywall teams;
        - Anti-comparison/re

    • I usually find a relevant link right at the top for products, but if you're not seeing one, click the shopping link and it's nothing but products. You may not like the results, but my guess is the results are spot on, otherwise it wouldn't be number one. Keep in mind some people do want price comparisons, or just plain info like you may find on Wikipedia.
    • by sjames ( 1099 )

      My absolute favorite is when I try to Google the answer to a tech question and can't find it due to the crapflood of smart asses telling people to just Google it. Meanwhile, since "Google it" is taken to be the answer, nobody else in the forum answered it either.

    • A lot of people will hate me for this, but I used Bing to lookup a problem I was having with my sound card and the first result was a blog post from a guy with the exact same issue and the solution. Google turned up a bunch of forum posts with tangentially related issues, but not the blog post that bing came up with.
  • Obviously it needs to be refined, but in principle it's not as bad as it looks. There are lots of queries where you'd rather have a big company's site on the first page of results than an obscure blog or scam site. Discovering how much they're willing to pay for bandwidth is one way to tell them apart.
    • by dingen ( 958134 )
      The thing is though that most obscure blogs or scam sites tend to load a lot faster than sites of big corporations.
    • by amorsen ( 7485 )

      There are lots of queries where you'd rather have a big company's site on the first page of results than an obscure blog or scam site.

      You assume that big companies can afford powerful web servers and fast lines.

      I offer you HP and Cisco who seem to be hosted on the same Commodore 64 in Timbuktu on a GPRS line.

  • Slowbotted (Score:5, Funny)

    by Naatach ( 574111 ) on Sunday April 11, 2010 @11:35AM (#31808706)
    Yay! I can DDoS my competitors and have Google endorse it!
  • So, now well-connected sites run by media companies will have more relevance in Search results vs. minority opinions put out on a cheap web host?
    'Do no evil' is meaningless if you don't actually examine what you are doing.
  • This is clearly an effort to give precedence to commercial enterprises and advertisers. Take a link farm: nothing is really there, so it doesn't require much power to serve pages. Pages will load quickly and, coincidentally, generate revenue for Google. As for legitimate businesses, those that can afford network optimizations are exactly those that will also pay for ads. OTOH, web sites that provide useful services but are slow are going to, eventually, be left in the dust. More link farms, fewer useful services.
    • Possibly, but remember what Google also said: the speed has only a very low weight compared to relevance. So if sites A and B have almost equal relevance then speed might determine the results but if A gets 98% for relevance and B gets 90% then the fact that B is twice as fast as A will be pretty much ignored when ranking them.

    • The quality of the content still matters more so link farms don't win just by being faster.
  • by erroneus ( 253617 ) on Sunday April 11, 2010 @11:41AM (#31808770) Homepage

    Does this help do battle against spam/scam sites? Yes.

    Does this help hosts of original content? Maybe... maybe not.

    Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.

    After all, those seeking to act against Google's interests by lowering speed and throughput to and from Google would automatically get a lower rank. Think about some of the newspapers out there who can't get over their aging business model. Think about other sources of information who might also be a competitor of Google in other markets? At the moment, Google is the primary source for lots of people.

    I must admit, I am having some difficulty coming up with arguments against this idea but I can't help but get a slightly uneasy feeling about this just the same.

    • by rockNme2349 ( 1414329 ) on Sunday April 11, 2010 @12:07PM (#31808970)

      Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.

      It sounds to me like a push completely against net neutrality. The websites that are served up faster get a higher rank; the websites that are throttled get a lower rank. Net neutrality isn't about how website owners allocate their bandwidth for their visitors; they've always been free to do what they want. Net neutrality is about the ISPs and other backbone entities of the internet throttling traffic. If there were an ISP between Google and two webpages, it could directly influence their ranks by throttling the site it wants knocked down and prioritizing the site it wants to give a higher rank to.

      • > If there was an ISP between google and two webpages...

        A big if. Google _is_ a backbone. Besides, when you find out you are being throttled you can move.

  • "our users place a lot of value in speed"
    is not my opinion in the least; personally, I value quality over speed.

    • by thetoadwarrior ( 1268702 ) on Sunday April 11, 2010 @01:30PM (#31809700) Homepage
      Good thing the quality of content is still king. However, if you have two sites that rank at 100% and one is significantly faster, then it comes up first. I can't see how that is a bad thing.

      Google says this affects a tiny fraction of sites and let's face it, it will be irrelevant when comparing two text only sites. But with the growing web app trend then yes speed does make a difference. If you want to use an online Office replacement, like offline software, you don't want to sit there waiting for things to happen. Online games, like offline, can succeed or fail based on their speed.

      The guys at Google aren't dumb. I doubt we'll see pages punished for loading in 100 ms rather than 50 ms. However, if you take 2 minutes compared to 50 ms, then you may be punished if your content isn't the best, and quite frankly I'm happy with that.
    • RTFS: "While site speed is a new signal, it doesn't carry as much weight as the relevance of a page"

      Also, "our users place a lot of value in speed" does not necessarily conflict with "quality over speed".

  • From a slightly older article [blogspot.com] on the same blog:

    The load time data is derived from aggregated information sent by users of your site who have installed the Google Toolbar and opted-in to its enhanced features.

    So this isn't quite as susceptible to people playing games with Googlebot as it might appear.

  • Bad (Score:2, Insightful)

    by dandart ( 1274360 )
    If another site pretends to be me or tries to sell products that sound like my product, and has more money than me to spend on servers, and is closer to Google, Google will direct people to them instead of me. Bad move.
  • by fluffernutter ( 1411889 ) on Sunday April 11, 2010 @12:27PM (#31809150)
    I occasionally put websites together for small businesses, and it seems increasingly hard to get these kinds of websites known. Google seems to be skewing more and more towards websites with lots of content, and now with speedier response, which will completely slant their rankings towards large companies with huge resources.

    For example, I did a website for a lady that sells garden and landscaping lighting local to where I am from. Her business focus is not one that needs a large web page, she just wants her catalog to display basically but she does want people to find her with Google. I've done all the things like making sure the title is accurate and headers are relevant, etc. However, it seems to me that much of it is futile. Unless she is the type of business that focuses on inviting people to add content to her site (in other words an internet/web business) the sad truth is that she will basically get ignored by Google.
    • If it is a simple site then why is it taking so long to load? Page loads should be nothing. I had vBulletin running on a shared host (second-cheapest option) and in theory it wasn't totally up to the job, yet even the largest page full of posts, with each person having a sig image and avatar along with images in some posts, took 2 seconds to load. I can't see her site being punished for speed unless it is poorly coded.

      Also if she is a local business then she'll almost certainly get better results when loca
    • Local search (Score:3, Informative)

      Unless a business is trying to serve the entire country, where they rank in general topic searches does not matter that much. From a user's perspective, a broad general search IS best served by the largest and fastest sites.

      For small local businesses, you've got to tune for the locality, which includes a whole 'nother set of Google tools on top of the standard SEO stuff like title, content, meta tags, etc. http://www.google.com/local/add/ [google.com]

      She would also be well-served by using online tech to develop repeat customers.

  • First time my website [byuh.edu] has ever moved up in pagerank!! Lazy HTML FTW!!
  • I'm not sure how many of my readers have Google toolbar installed. Guess I can install it myself and visit my site for Google to get the page load speed data...
  • Where does TFA mention this is incorporated in PageRank (a very specific algorithm)?

    Google uses hundreds of algorithms to determine the ranking of pages in result lists, and my understanding, from talks Google staff have given, is that PageRank is used in only a tiny fraction of queries.

  • How long until I have to pay my webhost not to sandbag these measurements? Yippee...
  • Next we will see support for "copied and pasted" text, where the main content of one site is the same as another. I can imagine now in my results "likelihood of matching text as previous result: x%". This should help work out which pages are simply copied and pasted blogs, news or press releases.

  • Google must be stopped! It is taking advantage of its monopoly to... to... well, do good. At least from their point of view. Some sites are badly coded and don't even try to be optimized; speeding them up probably won't require a big investment, and it will improve the experience for visitors.

    On the other hand, some sites end up slow from Google's point of view because of their goals, design, location or popularity, and they get punished even though they may be the authoritative source on some topic. Could be mitigated a bit.
  • by Anonymous Coward

    Google Site Speed is a measure of how well you have kept to the protocol specs to make sure the size of your website is as small as possible, so that as it travels through the pipes, it does so as efficiently as possible. It is NOT a rank of how fast your host provider delivers to the end user.

    Badly implemented pages will get a lower rank. (...and so they should IMHO)

    Google is trying to make sure everyone makes clean websites.
    I am sure Google also benefits by saving power/processing costs if the number of kilobytes to parse/store is smaller.
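Whichever interpretation is right, the byte-weight of a page is easy to check yourself. A rough sketch using curl's built-in timing variables; `example.com` is a placeholder for your own site:

```shell
# Fetch a page, discard the body, and report how many bytes came over
# the wire and how long the fetch took. --compressed asks the server
# for gzip, so the size reflects what a well-configured host would send.
curl -s -o /dev/null -w \
  'bytes: %{size_download}\nconnect: %{time_connect}s\ntotal: %{time_total}s\n' \
  --compressed https://example.com/
```

Comparing the figure with and without `--compressed` shows how much a "clean" (compressed, minified) site saves in transit.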

  • by beakerMeep ( 716990 ) on Sunday April 11, 2010 @01:32PM (#31809714)
    There used to be a feature to "ban" a result (like Experts Exchange), but they removed it in favor of only being able to "star" results you like. I have to say this seemed like the single best feature they had ever added to search results. It was very useful to be able to identify (for myself) who was gaming the results. But apparently Google thinks I'm better off with the safety of little pretty stars.
  • They've found a backdoor around it: either you pay for the new, expedited, luxury internet, or you and the other slow sites can molder on page 41 of a Google search.

  • by Animats ( 122034 ) on Sunday April 11, 2010 @03:38PM (#31810954) Homepage

    Does Google's measurement include delays from off-site ad servers? That's a big issue. For many sites (including Slashdot), the off-site ad servers are the big bottleneck.

    Web site programmers will now have to avoid ad code that delays page loading until the ads come in. I expect to see ad code that measures the response time of the ad server and, if the ad server doesn't respond fast enough, drops the ad and reports the failure to a monitoring site.

    Then we'll see sites gaming the system. If Google is using information from their "Google Toolbar" to affect search results, we'll probably see attempts to pump fake data into the Google Toolbar server. Google is going to have to learn the lesson well known to developers of networked games - "never trust the client".
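The timeout-and-fallback approach the parent describes can be sketched in a few lines of JavaScript. The 200 ms budget and the simulated slow ad server below are illustrative assumptions, not anything a real ad network specifies:

```javascript
// Race an ad request against a deadline; if the ad server is too slow,
// drop the ad rather than blocking the rest of the page.
function loadWithDeadline(loadAd, deadlineMs) {
  const timeout = new Promise((resolve) =>
    setTimeout(() => resolve({ status: "dropped" }), deadlineMs)
  );
  return Promise.race([loadAd(), timeout]);
}

// Simulated ad server that takes 500 ms to respond.
const slowAdServer = () =>
  new Promise((resolve) =>
    setTimeout(() => resolve({ status: "loaded" }), 500)
  );

loadWithDeadline(slowAdServer, 200).then((result) => {
  console.log(result.status); // "dropped": the 200 ms budget expired first
});
```

In a real page the "dropped" branch would also ping a monitoring endpoint, which is where the gaming the parent predicts would begin.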

  • by Lazy Jones ( 8403 ) on Monday April 12, 2010 @02:16AM (#31814500) Homepage Journal
    ... host your stuff at Google and get the PageRank boost. But that would be evil, no?
