Google Incorporates Site Speed Into PageRank Calculation
lee1 writes "Google is now taking into account how fast a page loads in calculating its PageRank. In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. ... While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.' Considering the increasing dilution of high-ranking results by endless series of plagiarizing 'blogs,' brainless forums, and outright scam sites, anything that further reduces the influence of the quality of the content is something I would rather not have. Not that Google asked me."
Asking (Score:4, Funny)
Not that Google asked me.
Well, now that they know you're an influential Slashdot contributor, I'm sure they'll sit up and take notice.
Slashdot (Score:5, Insightful)
So when a site gets slashdotted and blown to oblivion, Google also ranks it lower. Awesome!
Re: (Score:2)
One would think only if the Google Bot happens to be indexing your site at that exact moment; one would additionally think they'll revisit to see if it's structural or not?
Re:Slashdot (Score:4, Informative)
Re:Slashdot (Score:5, Insightful)
Maybe if you correctly used Last-Modified and ETag headers with a 304 Not Modified response, you could avoid a significant part of your bandwidth usage.
Re:Slashdot (Score:5, Informative)
Yes, but if only there was some open standard protocol he could use to tell Google how often to index which parts of his site, which Google implemented [sitemaps.org]...
Re:Slashdot (Score:5, Insightful)
Re: (Score:2)
Apache (or any other half-decent http server) does it by default for static files.
For dynamic files (PHP, Python, CGI, whatever) obviously it cannot. But then if it is your code, you can easily do it yourself: buffer your page content, CRC it, and use it as an ETag. If it's the same as If-None-Match, don't send the body; just return a 304.
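The approach above can be sketched in a few lines of Python. This is a minimal illustration, not any framework's real API; the `respond` helper and its signature are invented for the example:

```python
import zlib

def respond(page_bytes, if_none_match=None):
    # Buffer the generated page, checksum it, and use the checksum as an ETag.
    etag = '"%08x"' % (zlib.crc32(page_bytes) & 0xFFFFFFFF)
    if if_none_match == etag:
        # Client's cached copy is still current: headers only, no body.
        return "304 Not Modified", [("ETag", etag)], b""
    return "200 OK", [("ETag", etag)], page_bytes

# First request: full response, ETag handed out.
status, headers, body = respond(b"<html>hello</html>")
# Revalidation with that ETag: empty 304, no body re-sent.
status2, _, body2 = respond(b"<html>hello</html>", if_none_match=dict(headers)["ETag"])
```

CRC32 is fine as a cheap change detector here; a stronger hash (MD5/SHA-1) is the more common choice for ETags in practice.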
Re: (Score:2)
Perhaps the Drupal XML Sitemap [drupal.org] module would help? It's just a module to implement the sitemap.xml protocol mentioned above. It'd involve messing with your setup but has the potential to save you a lot of bandwidth.
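For reference, a minimal sitemap.xml under the sitemaps.org protocol looks roughly like this (the URLs and frequencies are just illustrative); the changefreq hints tell crawlers which parts of the site are worth re-fetching often, which is how it saves crawl bandwidth:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Front page changes often; worth frequent crawls -->
  <url>
    <loc>http://example.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <!-- Old archive pages rarely change; crawl sparingly -->
  <url>
    <loc>http://example.com/archive/2009/</loc>
    <changefreq>yearly</changefreq>
    <priority>0.2</priority>
  </url>
</urlset>
```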
Re: (Score:3, Informative)
What kind of shared host gets on you for 8GB/mo? Almost all of the shared hosting providers I've seen provide at least 50GB/mo.
Maybe you should use a better hosting provider, figure out how to add the correct code to robots.txt, or use Google's webmaster tools (which are quite easy to use).
And, FYI, pretty much any web language can set headers. You don't need to have root access to do it. You don't need to modify Apache configs.
Most blogging, CMS, or forum software already handles this correctly. If you don'
Re: (Score:3, Insightful)
There's no shame in being a novice, only in hiding behind being a novice because you don't want to read the documentation on the things you whine about.
Re:Slashdot (Score:5, Informative)
You can associate your site with a Google account and override their heuristic.
Re:Slashdot (Score:5, Interesting)
One would think only if the Google Bot happens to be indexing your site at that exact moment; one would additionally think they'll revisit to see if it's structural or not?
If you use Google Webmaster Central you may notice that, while Google's algorithm is smart, it's also very overestimated in some areas, and involves plenty of manual tweaking by the Google employees for it to work properly.
Site Speed is not calculated solely from the time the Google bot takes to crawl the page; it's calculated from Google toolbars that have the PageRank feature enabled (that feature calls home with which sites you visit and how fast each page loaded).
Whether Google can detect clusters of frequent accesses, such as from "slashdotting", is entirely open to question, since most Slashdot users may not have the Google toolbar with PageRank on; but for the *few* users that do, the site will just appear slow in general.
Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.
What's more, the reason a site is slow is often somewhere along the route, close to neither the visitor's ISP nor the site's server, and it doesn't hit all users either. So if you have bad luck, or your content attracts users who happen to be routed through the bad route, you'll lose PageRank.
Re: (Score:2)
Hmmm, indeed. Maybe they need to rework that feature so that if it passes Y!Slow and similar, it's considered as quick as it's going to get. Otherwise you'll indeed see sites that have no recourse penalised. You make an excellent point.
Goodbye home server! (Score:3, Interesting)
So those of us who host our own web servers from our DSL lines will be in that one percent.
Thanks Google. You really f*ed up my day.
It took me a while to have my site at #1 based on the regional relevance of its content.
Now I might have to get dedicated hosting, just so that customers who would have previously found my page right away won't go to the other website, which is out of their range but has been around for 5 years longer than mine.
Re: (Score:3, Insightful)
Re: (Score:2, Informative)
Additionally, if a site targets a demographic that has worse latency (low income people, areas with dial-up and so on), then, again, that site will appear to be slower, while actually the visitors have slower internet in general.
Except they (according to the summary, didn't RTFA) aren't going by page load times, they're going by server response. That means that pages that are poorly written and take forever to load (or are connected to slow ad-servers) won't get downranked because of that, only ones with slow server times. The slashdot effect will still potentially impact it, but the speed of a user's internet connection has little impact on the speed of a ping.
Re:Slashdot (Score:5, Insightful)
If a server can't handle much load, it's probably not that important
Or it is a very informative hobbyist site with lots of useful info on it, which is slow compared to a well-funded commercial site that has nothing but marketing-speak.
TFA says they are looking at "server response times", but I can't see this being at all useful unless they look at the total page load time (including all the ads that come off slow servers).
Slashdotting, power failure, tsunami, cleaning lady tripping over the network cables, poor server-side scripting, badly configured web server... What's the difference anyway?
The difference is that some of these problems are transitory and some are more permanent. You probably don't want transitory problems to affect the ranking (here's hoping they average it over several crawls).
Re: (Score:2)
The problem is that sites that are experiencing something like the /. effect are doing so because a lot of people are trying to access them, which is usually because they contain something that a lot of people want to see. Ranking them lower is the opposite of helpful.
The other problem with this is that it's very location dependent. A site in the USA takes noticeably longer to load for me than one in the UK, but in a lot of cases I'd rather see the one in the UK because it's locally relevant. If Google'
How about bloat? (Score:3, Interesting)
If site A and site B have the same info, then how about weighing which one has the info spread over 10 pages with 3-4 different adservers spewing flash and gifs and all sorts of javascript trickery and which one doesn't (or has less at least)?
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
They would probably be happy to discriminate against the sites with more non-Google advertising. As long as Google keeps their ad-servers speedy, they could even justify it.
Re: (Score:2)
It's probably still an experiment and may go away but with websites becoming more than plain text I'm glad to see performance taken into account.
Where is the 'speed' measured from? (Score:2)
I suppose an obvious question to ask then is: from where is Google measuring site speed? From a single particular server/location (presumably in the US)? From the 'nearest Google datacentre/server farm' to the site (and if so, how do they determine this)?
If they are measuring site speed from a single (US) location, that's gotta be hurting the page rankings for any sites hosted outside the US, as even if those pages are lightning fast locally, you're always going to have that ~100 ms latency to Europe / ~150
Re:Where is the 'speed' measured from? (Score:4, Funny)
Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.
The main site serves visitors from the US. Thus, measuring speeds from multiple locations around the US is probably the best thing to do. They're presumably measuring speed from all their datacenters (their crawlers are likely to be distributed across the country (and world), so recording the average speed over multiple crawls would be a good approximation when you're dealing with the scale of Google and the Web).
Re: (Score:2)
It didn't say they are only implementing this on the US site. They said searches in English from Google.com.
Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate). And depending on how the browser's been set up, those searches generally get pushed to google.com (the main site). The result page may be redirected to the country-specific one, but if
Re: (Score:2)
Most people in other English speaking countries (the UK/Australia/NZ/SA etc.) just search in their browser search bar rather than going to google.com manually (which would redirect them to .uk, .au, .nz as appropriate)
Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.
Re: (Score:2)
Re: (Score:2)
Is that on OS X or Windows?
On OS X, typing "google.com" into the url bar and hitting enter redirects you to google.co.uk.
Typing a search into the search box at the top right sends you to a results page from google.co.uk.
Perhaps it is based on the International settings in the pref pane?
Re: (Score:2)
Re: (Score:2)
Both Firefox and IE 8 redirected me to google.co.uk (my appropriate local website) when I typed some nonsense into the search box.
My FireFox sends me to google.com. Which is quite annoying because it means I get the US version of Froogle if I use the "Shopping" link.
But more annoying is the way that Google don't implement some features in every language. E.g. if I want to turn SafeSearch off, I have to switch to English because the Welsh version of the preferences page doesn't have the damned SafeSearch options...
Re: (Score:2)
My FireFox sends me to google.com.
That's odd. Do you live in the UK?
I did my test on Firefox 3.6.3.
I notice that if I explicitly visit google.com, it automatically redirects me to google.co.uk. Perhaps it's related to one's ISP and location data (or guesswork)?
Re: (Score:2)
That's odd. Do you live in the UK?
Yep. FireFox 3.5.8 running under Fedora 12 and my locale is set to "cy_GB.UTF-8".
I notice that if I explicitly visit google.com, it automatically redirects me to google.co.uk. Perhaps it's related to one's ISP and location data (or guesswork)?
Maybe. google.com doesn't redirect for me, but I'm on a UK ISP (EntaNet) and the geoip stuff tends to claim I am in London (which is incorrect, but at least it's still in the UK).
Re: (Score:2)
They did say that only 1% of sites are affected. That leads me to believe they have a pretty generous threshold (like several seconds or more). At that point, 200 ms more or less wouldn't make much difference.
Re: (Score:2)
Re: (Score:2)
The speed ranking could be entirely location based.
It's been a talking point for a while among webhosts in Ireland that Google ranks sites more highly if they are based in the same locale/country as the user making requests. In other words (they claim), it's worth paying more to host your site with a local provider than getting a deal with a big overseas web-hosting company. Now, they would say that; but having seen my share of generic search results return local companies again and again, I'm inclined to think they're right.
Re: (Score:2)
Yeah I've noticed this in Australia as well. Makes sense though ... it's more likely that someone in Australia would find more relevant information on a .au site, especially if the subject matter is something that varies between countries. E.g. if you looked up "tax law" or "drivers license" or something...
Here's the problem (Score:2)
In their own words: '[W]e're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests. ... our users place a lot of value in speed -- that's why we've decided to take site speed into account in our search rankings.
Search speed is almost 100% subjective. Heck... are the tools Google is using to evaluate speed openly known to everyone this affects?
I predict trouble ahead.
Re: (Score:2)
I am not sure if this is how they measure, but Google makes some tools for site owners to check speed and it would be logical that they use something like this.
http://code.google.com/speed/page-speed/
(Similar to YSlow)
Also, Google Webmaster Tools has a Site Performance section (under labs) which may have something to do with this.
I care. (Score:2)
especially the corporate run sites have become loaded with shit. flash, javascript ads, javascript code that tries to get all kinds of info from me in order to deliver it to the advertisers, banner ads, includes from numerous other sites, their javascript, this, that, a lot of loaded shit. some can even clog your browser if you hit them at the wrong moment.
it will be good. now they will need to weigh the speed factor. i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.
Re: (Score:2)
i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.*
* unless that 3rd party is Google, of course
Re: (Score:2)
i shouldnt have to wait for a damn 3rd party ad provider's clogged servers to view the actual page im visiting.
This is something that was pretty much fixed by XHTML. However, like most of the other good things introduced in XHTML, HTML 5 has chucked out that fix so we're back to the bad old days.
The premise is basically: Javascript can use document.write() to insert markup at the script's location. This means that the browser _has_ to execute each piece of javascript at the point it is seen, since any of the JS could use document.write() and the browser would need to parse that in context. This gets worse when the
so, spammers just need servers... (Score:5, Insightful)
...close to and prioritising Google. Gotcha.
Really, am I the only one to find Google a fairly poor *find* engine? I mean, for anything which might remotely come close to sounding like it's a product, you've got Wikipedia right at the top, followed by 1000 review/comparison/pricing sites. For a tech question, you have expert-sexchange and 1000 crappy forums with responses from the downright wrong to the gratuitously abusive. I barely use Google (or any search engine much) for their generic WWW search - I'm more likely to be +site: searching a specific newsgroup/support forum/journal/enthusiast site I already know has intelligence. I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.
Re: (Score:2)
...close to and prioritising Google. Gotcha.
That'd do nothing. Speed isn't detected via the Google bots or from the Google servers.
Re:so, spammers just need servers... (Score:5, Insightful)
Re: (Score:2)
I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder.
thanks for the tip, i've had similar problems but as i installed the virtual clone drive and a new logitech 9000 series webcam/microphone on the same day i've been contemplating my pc's navel trying to guesstimate which one caused the problem while being too lazy to run it down. the more so since i also installed anydvd and read it installs as a driver.
Re: (Score:2)
That said I do believe the results are getting better. People find ways to trick Google and they get away with it for a bit but Google does catch on. I just checked for a search on Java servlet tutorial and Rose India ranks much lower than they used to and they'
Re: (Score:2)
You hit the nail on the head with that one. I, too, find myself using queries containing "site:msdn.microsoft.com (rest of search)" (for say Windows API information) or using "-" in the searches to suppress certain results. Like you say, otherwise you get basically "a bunch of crap" - mainly from people who have no idea what they are doing. Just today I had a problem with elbyvcdshell.dll (from Slysoft's Virtual Clone Drive) causing Windows Explorer to hang for 5 minutes each time I renamed a folder. I tried searching that on Google - hell half of the hits were stupid posts of every file on a system at malware check sites, or bleepingcomputer.com, or other "is this malware" posts. Did I say half? Shoot - I just checked again and I think I meant 85%. The results for most tech searches are indeed useless unless you already know what site you want and include that information in your search. The internet is just filled with crap sites that make it into the indexes and get high relevance.
I think Google seems pretty good with providing discussions of particular error messages. Just copy and paste it into the search field and you'll see a few hits on threads where the problem is at least explained and maybe resolved too. Invaluable for any programmer/sys admin who deals regularly with new technologies.
Re: (Score:2)
If you have another solution to find an answer to a tech question than checking the 40 first google entries, I am more than willing to check it out, but as crappy as one might call it, I have the feeling that it remains the best.
Re: (Score:2)
I don't need Google using yet another algorithm to fail at finding useful information - just employ 100 people spending 8 hours a day tagging the clone/spam/pricecheck/etc sites if you actually want to make a difference.
I think that would be giving too much power and responsibility to a tiny number of people who have no accountability to anyone except google.
But a better idea might be to allow the user to interact with the search results page, moderating the results and flagging results that weren't helpful.
Re: (Score:2)
But a better idea might be to allow the user to interact with
Anonymous users, and teams of users operating under declared banners (acting independently of Google, but using an interface provided by Google). Such teams compete to provide the best filters, since there will inevitably be different segments of the population with different opinions on what's shit and what's not. Add the possibility to filter results by one or more teams, with meta-lists.
For example:
- Anti-Wikipedia-clone teams, which identify all clones of Wikipedia;
- Anti-paywall teams;
- Anti-comparison/re
Re: (Score:2)
Re: (Score:2)
My absolute favorite is when I try to Google the answer to a tech question and can't find it due to the crapflood of smart asses telling people to just Google it. Meanwhile, since "Google it" is taken to be the answer, nobody else in the forum answered it either.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
What? I always scroll down to the bottom, way past all the crap about paying, and find it waaaay down below. Try scrolling further next time, or just use google's Cached page.
Re:so, spammers just need servers... (Score:5, Informative)
Half the people I heard from said that if they scroll all the way to the bottom they can read the answers for free, and the other half say this doesn't work. This confused me for the longest time until I finally figured out the answer.
Expertsexchange allows you to scroll down to the bottom to get a free answer the first time you visit their page, then gives your browser a cookie saying that you have gotten your free answer, and won't show you any more. So if you want to ensure that you can always scroll to the bottom, you simply have to block cookies from them and you are good to go.
Re:so, spammers just need servers... (Score:4, Informative)
But if you copy the URL and paste it into a browser, you don't get to see the answer at the bottom.
Personally what annoys me more than expertsexchange are the journal sites. For those I don't get the answer at the bottom or anywhere, even though it shows up in the Google search results.
Used to be Google policy that a site is not allowed to show different content to Google from what it shows to users - they smacked BMW Germany down for that. But now I see lots of sites getting away with that, and no, those journal sites don't get fooled by the user agent thing.
Perhaps they pay Google to be allowed to do it.
Re: (Score:2)
It has always baffled me that a website that seems to have such a good supply of technical knowledge could be so incompetent at implementing a pay-wall.
Re: (Score:2)
I expect that their paywall works perfectly on 99.9% of their users.
Mostly by causing them to give up and go elsewhere, of course.
Re: (Score:2)
The partial answers need to show up in Google, so that people actually click through. If they had a proper pay-wall, Google wouldn't see the answers either. If they showed different content to the google search indexer, they would be removed from the index. What they are currently doing probably wouldn't be allowed from a smaller site, and they are skating on thin ice, but Google bends the rules for sites that bring in more google adsense revenue.
+1 Informative (Score:2)
Already posted, but thanks. That's the answer I was looking for. It makes much more sense now.
Re: (Score:2)
In expert-sex-change, you can find the answers just by scrolling down.
It looks like there's just a big footer under the question, but if you keep scrolling down, you'll find the answers.
Re: (Score:2)
Re: (Score:2)
For what it's worth, I think I have only found relevant, usable info there about 3 times.
Re: (Score:2)
A "+0.5, Tough Love" mod would have been handy here.
Not as bad as it looks (Score:2)
Re: (Score:2)
Re: (Score:2)
There are lots of queries where you'd rather have a big company's site on the first page of results, rather than an obscure blog or scam site.
You assume that big companies can afford powerful web servers and fast lines.
I offer you HP and Cisco who seem to be hosted on the same Commodore 64 in Timbuktu on a GPRS line.
Slowbotted (Score:5, Funny)
Net or Search Neutrality? (Score:2, Funny)
'Do no evil' is meaningless if you don't actually examine what you are doing.
bias for large advertisers (Score:2)
Re: (Score:2)
Possibly, but remember what Google also said: the speed has only a very low weight compared to relevance. So if sites A and B have almost equal relevance then speed might determine the results but if A gets 98% for relevance and B gets 90% then the fact that B is twice as fast as A will be pretty much ignored when ranking them.
Re: (Score:2)
That sounds reasonable.... so far (Score:5, Insightful)
Does this help do battle against spam/scam sites? Yes.
Does this help hosts of original content? Maybe... maybe not.
Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.
After all, those seeking to act against Google's interests by lowering speed and throughput to and from Google would automatically get a lower rank. Think about some of the newspapers out there who can't get over their aging business model. Think about other sources of information that might also be competitors of Google in other markets. At the moment, Google is the primary source for lots of people.
I must admit, I am having some difficulty coming up with arguments against this idea but I can't help but get a slightly uneasy feeling about this just the same.
Re:That sounds reasonable.... so far (Score:5, Interesting)
Does this serve as an indirect or otherwise passive-aggressive push for network neutrality? I suspect it might be.
It sounds to me like a push completely against net neutrality. The websites that are served up faster get a higher rank. The websites that are throttled get a lower rank. Net neutrality isn't about how website owners filter their bandwidth for their visitors; they've always been free to do what they want. Net neutrality is about the ISPs and other backbone entities of the internet throttling traffic. If there were an ISP between Google and two webpages, it could directly influence their ranks by throttling the site it wants knocked down and prioritizing the site it wants to give a higher rank to.
Re: (Score:2)
> If there was an ISP between google and two webpages...
A big if. Google _is_ a backbone. Besides, when you find out you are being throttled you can move.
Re: (Score:2)
As a user of Google (Score:2)
"our users place a lot of value in speed"
is not my opinion in the least; personally, I prefer quality over speed.
Re:As a user of Google (Score:4, Interesting)
Google says this affects a tiny fraction of sites and let's face it, it will be irrelevant when comparing two text only sites. But with the growing web app trend then yes speed does make a difference. If you want to use an online Office replacement, like offline software, you don't want to sit there waiting for things to happen. Online games, like offline, can succeed or fail based on their speed.
The guys at Google aren't dumb. I doubt we'll see pages punished for loading in 100 ms rather than 50 ms. However, if your page takes 2 minutes rather than 50 ms, then you may be punished if your content isn't the best, and quite frankly I'm happy with that.
Re: (Score:2)
RTFS: "While site speed is a new signal, it doesn't carry as much weight as the relevance of a page"
Also, "our users place a lot of value in speed" does not necessarily conflict with "quality over speed".
Measured via the toolbar (Score:2, Informative)
From a slightly older article [blogspot.com] on the same blog:
So this isn't quite as susceptible to people playing games with Googlebot as it might appear.
Bad (Score:2, Insightful)
Isn't Google missing the point? (Score:5, Interesting)
For example, I did a website for a lady that sells garden and landscaping lighting local to where I am from. Her business focus is not one that needs a large web page, she just wants her catalog to display basically but she does want people to find her with Google. I've done all the things like making sure the title is accurate and headers are relevant, etc. However, it seems to me that much of it is futile. Unless she is the type of business that focuses on inviting people to add content to her site (in other words an internet/web business) the sad truth is that she will basically get ignored by Google.
Re: (Score:2)
Also if she is a local business then she'll almost certainly get better results when loca
Local search (Score:3, Informative)
Unless a business is trying to serve the entire country, where it ranks in general topic searches does not matter that much. From a user's perspective, a broad general search IS best served by the largest and fastest sites.
For small local businesses, you've got to tune for the locality, which includes a whole 'nother set of Google tools on top of the standard SEO stuff like title, content, meta tags, etc. http://www.google.com/local/add/ [google.com]
She would also be well-served by using online tech to develop repeat customers.
Wow (Score:2)
I might just install Google Toolbar (Score:2)
Not pagerank? (Score:2)
Where does TFA mention this is incorporated in PageRank (a very specific algorithm)?
Google use hundreds of algorithms to determine the ranking of pages in result lists and my understanding, from talks Google staff have given, is that PageRank is used in only a tiny fraction of queries.
Net neutrality? (Score:2)
Copy and paste article searching? (Score:2)
Next we will see support for "copied and pasted" text, where the main content of one site is the same as another. I can imagine now in my results "likelihood of matching text as previous result: x%". This should help work out which pages are simply copied and pasted blogs, news or press releases.
Don't do evil (Score:2)
But on the other hand, some sites, by their goals, general idea, location, or popularity, end up being slow from Google's point of view and get punished, despite potentially being the authoritative source on some topic. Could be mitigated a bit
"Google Site Speed" is not the host provider speed (Score:2, Interesting)
Google Site Speed is how well you have kept to the protocol specs to make sure the size of your website is as small as possible, so that as it travels through the pipes, it does so as efficiently as possible. It is NOT a rank of how fast your host provider delivers to the end user.
Badly implemented pages will get a lower rank. (...and so they should IMHO)
Google is trying to make sure everyone makes clean websites.
I am sure Google also benefits by saving power/processing costs if the amount of kilobytes to parse/store is reduced.
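As a rough illustration of how much the "kilobytes through the pipes" concern can matter, HTTP compression (gzip Content-Encoding) alone shrinks repetitive HTML dramatically. A quick Python check, with made-up sample markup standing in for a typical templated page:

```python
import gzip

# A deliberately repetitive page, standing in for typical templated HTML.
html = b"<html><body>" + b"<p>hello world</p>" * 500 + b"</body></html>"

compressed = gzip.compress(html)

# gzip round-trips losslessly, and repetitive markup compresses very well.
ratio = len(compressed) / len(html)
```

On markup this repetitive the compressed size is a small fraction of the original, which is exactly the kind of saving the comment above has in mind.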
What ever happened to "remove this result" (Score:4, Interesting)
Net neutrality is dead (Score:2, Offtopic)
They've found a backdoor around it: either you pay for the new, expedited, luxury internet, or you and the other slow sites can molder on page 41 of a Google search.
Off-site ad serving delay now a big issue (Score:3, Interesting)
Does Google's measurement include delays from off-site ad servers? That's a big issue. For many sites (including Slashdot), the off-site ad servers are the big bottleneck.
Web site programmers will now have to avoid ad code that delays page loading until the ads come in. I expect to see ad code that measures the response time of the ad server, and if the ad server doesn't respond fast enough, drops the ad and reports the fail to a monitoring site.
Then we'll see sites gaming the system. If Google is using information from their "Google Toolbar" to affect search results, we'll probably see attempts to pump fake data into the Google Toolbar server. Google is going to have to learn the lesson well known to developers of networked games - "never trust the client".
will they offer hosting / cloud computing soon? (Score:3, Funny)
Re:Sweet (Score:4, Insightful)
Mod parent up! Google is catching on to something us slashdotters have known for a long time. The person who posts fastest usually has the most insightful things to say!
I have an idea: Slashdot could easily incorporate average commenting speed into its UserRank and serve pages to excessive first-posters slowly, giving a chance to other, more insightful readers, such as my humble self.
Re: (Score:3, Funny)
Re: (Score:2)