Encryption Google The Internet

Google Will Give a Search Edge To Websites That Use Encryption 148

As TechCrunch reports, Google will begin using website encryption, or HTTPS, as a ranking signal – a move which should prompt website developers who have dragged their heels on increased security measures, or who debated whether their website was “important” enough to require encryption, to make a change. Initially, HTTPS will only be a lightweight signal, affecting fewer than 1% of global queries, says Google. ... Over time, however, encryption’s effect on search ranking [may] strengthen, as the company places more importance on website security. ... While HTTPS and site encryption have been a best practice in the security community for years, the revelation that the NSA has been tapping the cables, so to speak, to mine user information directly has prompted many technology companies to consider increasing their own security measures, too. Yahoo, for example, also announced in November its plans to encrypt its data center traffic.

Comments Filter:
  • Great step! (Score:5, Interesting)

    by satuon ( 1822492 ) on Thursday August 07, 2014 @11:16AM (#47623339)

    That's a really great step from Google; I had never thought it could be done in such a neat way. What's next? Could they also do it for IPv6?

    • by thieh ( 3654731 )
      IPv6 would be a bit problematic because not every ISP has IPv6, and it can amount to discriminating based on who you choose as your ISP or simply where you live.
      • by satuon ( 1822492 )

        It could create an incentive to switch to a different ISP that supports it (where possible), which could in turn create an incentive for ISPs themselves to switch to IPv6.

      • We are talking about websites here not end user connections. Unlike with "broadband" ISPs there is plenty of competition in hosting providers.

        An incentive for website operators to tell their hosting providers "either you give me IPv6 or I go elsewhere" sounds fine to me.

      • Down in the boondocks
        Down in the boondocks
        People put me down 'cause that's the side of town I was born in

        I love games, love to stream HD.
        But I crap for an ISP!
        Lord have mercy on a boy from down in the boondocks

        Every night I will watch the light from the house upon the hill
        They got 40megs a second in the whole damn place and it gives me such a thrill
        But I don't dare knock on their door,
        'cause I hacked their Master Card last year
        So I'll just have to be content
        To leach whenever I get near
        ...
        Apol
      • It's an incentive for websites to do both IPv4 and IPv6, not IPv6 exclusively.
        Wouldn't affect end users at all.

    • by defaria ( 741527 )
      Add to this that sometimes even the reputable companies don't do the right thing. As a large financial institution, you'd think that Wells Fargo of all companies would at least be sporting an extended validation certificate! But they aren't. And that's not their only security fumble (http://defaria.com/WF). I have my reasons for still dealing with them, but I watch them like a hawk!
    • That's a really great step from Google; I had never thought it could be done in such a neat way. What's next? Could they also do it for IPv6?

      I have previously publicly advocated for Google to do exactly this as a way to promote the adoption of IPv6; however, I was wrong to go there.

      The bottom line is that Google, a de facto monopoly, is using force to effect change in ways mostly unrelated to its mission (the linkage between quality and SSL is domain-dependent and tenuous at best).

      Just because I happen to think IPv6 adoption will benefit everyone, or that wholesale spying on wires should yield as little fruit as possible, does not mean the ends should justify the means.

      • In terms of their mission, they do a lot of things to "promote a healthy internet". Given two otherwise equal websites, ranking the one that uses SSL higher would fall under "healthy".

  • by neilo_1701D ( 2765337 ) on Thursday August 07, 2014 @11:20AM (#47623371)

    I'm not convinced that this is a good precedent. Sure, they're encouraging sites to use HTTPS today... but what about tomorrow?

    Speculation: Websites that block competing search engines from indexing their content may rank higher in Google searches? Websites that process payments using Google rank higher in Google search?

    I'm not saying that HTTPS is a bad thing... but once they open the door to arbitrary ranking changes done on a whim, that door can be opened again.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      As opposed to the currently non-arbitrary ranking algorithm? What the hell are you talking about?

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Don't like it, use Bing!

    • They've already been using their ranking system to encourage HTTP and HTML. Think of all the poor BBSs and gopher servers they've been discriminating against!

      • by satuon ( 1822492 )

        I wish they would index real FTP servers, not just those with an HTTP interface.

        • Googlebot doesn't know the username and password to your real FTP server. What e-mail address should it be using for anonymous FTP?
          • by satuon ( 1822492 )

            I meant those FTP servers that allow anonymous login, of course.

          • asdf@ghj.com, of course. Just like everyone else.

          • root@ like everyone else?

      • Speak for yourself! I do most of my human interaction over FIDO. It's quite a bit slower (I only receive replies after a few weeks to months), but the fidelity is better than this Slashdot fad.

    • by bill_mcgonigle ( 4333 ) * on Thursday August 07, 2014 @11:43AM (#47623581) Homepage Journal

      Google has been using dozens of quality metrics for years to adjust its rankings. This isn't a new concept.

      It's not clear to me which HTTPS configurations it's favoring, though. Is Strict Transport Security a requirement? People with high-longevity system needs are going to need to upgrade to EL7 to make good HTTPS feasible, so there will be a transition period.
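
      As a rough illustration of the Strict Transport Security question, a hypothetical Flask app (a sketch only; none of these names come from Google's announcement) might emit the header like this:

        # Minimal sketch: send the HTTP Strict Transport Security header so
        # browsers keep using HTTPS for this host on future visits.
        from flask import Flask

        app = Flask(__name__)

        @app.after_request
        def add_hsts_header(response):
            # Instruct browsers to use HTTPS for this host for one year.
            response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
            return response

        @app.route("/")
        def index():
            return "Hello over HTTPS"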

      As far as standards go - look, W3C, IETF, et al. have completely failed to keep up. From 1993 to 1997 we went from HTTP 0.9 to HTTP 1.1, which is where we are today. HTTP 2.0 will have been languishing for two decades by the time there's a standard and any significant adoption. That's not Internet-time.

      Google has made some mistakes with SPDY and QUIC, but at least they're actually trying to move the ball down the field instead of just arguing on the sidelines. It used to be that lots of players would do the same thing and fairly quickly a consensus would emerge. We have a serious breakage problem in the current community process. Google is doing it right - it's everybody else that's not.

      • by Anonymous Brave Guy ( 457657 ) on Thursday August 07, 2014 @12:04PM (#47623837)

        While your points about the snail's pace of web "standards" development are fair, it's also important not to go too far the other way. Not so long ago, another browser became dominant in market share through pushing new but not widely supported features its own way, and people started making web sites that were written specifically to work with that browser rather than any common standard.

        That browser was Internet Explorer in the late 1990s, and the result was IE6.

      • As far as standards go - look, W3C, IETF, et al. have completely failed to keep up. From 1993 to 1997 we went from HTTP 0.9 to HTTP 1.1, which is where we are today.

        Most HTTP 1.1 features are useless. If it disappeared tomorrow, nobody would care or even be able to tell it had gone missing.

        HTTP 2.0 will have been languishing for two decades by the time there's a standard and any significant adoption. That's not Internet-time.

        The pace of standards development is driven by commercial need rather than abstract notions of staleness, "the future", "progress", etc.

        The only reason for delay is nobody cares. The incremental benefit is so trivial as to not be worth the effort unless you happen to be Google. When people care shit gets done even if it means draft implementations making their way into production.

        Google has made some mistakes with SPDY and QUIC, but at least they're actually trying to move the ball down the field instead of just arguing on the sidelines.

        M

    • Speculation: Websites that block competing search engines from indexing their content may rank higher in Google searches? Websites that process payments using Google rank higher in Google search?

      Either of these actions would likely be illegal under federal anti-trust laws.

    • by just_another_sean ( 919159 ) on Thursday August 07, 2014 @12:31PM (#47624143) Journal

      but once they open the door to arbitrary ranking changes done on a whim, that door can be opened again.

      Was that door ever closed? Their ranking algorithm has been arbitrary since the beginning and has changed very frequently over the years in an effort to reduce gaming of the system and to generally improve results. If anything, I'd say it's nice that they're at least telling people about this change vs. just quietly adjusting things and leaving site owners to wonder what happened to their page rank.

    • by mlts ( 1038732 )

      The EU would pound Google into component quarks if Google paired search weight to what sites used their services.

      Bumping search weights on HTTPS is a good thing for everyone. It locks out Phorm-like ad sites, lessens the damage that bogus Wi-Fi access points can do, and ensures that an attack on DNS doesn't take a site completely out... although DNS + a bogus CA can do the job.

  • by turkeydance ( 1266624 ) on Thursday August 07, 2014 @11:21AM (#47623379)
    I, Google (corporations are legal individuals in the USA), refuse to rank my response due to its incriminating nature.
  • It's about time! (Score:5, Interesting)

    by mcrbids ( 148650 ) on Thursday August 07, 2014 @11:29AM (#47623453) Journal

    Expensive advertising campaigns engender trust because it shows that the advertiser has the resources to carry out the campaign. It's why online ads are so commonly ignored - people want to do business with "reputable" companies and expensive advertising is a way of establishing repute.

    Similarly, putting out the modicum of effort to perform basic security like SSL is a signal that the website is reputable. I mean, if you can't be bothered to buy a $50 SSL certificate and install it, are you *really* trustworthy?

    SSL should be a basic signal of trustworthiness.

    • by tepples ( 727027 ) <tepples AT gmail DOT com> on Thursday August 07, 2014 @11:42AM (#47623579) Homepage Journal

      I mean, if you can't be bothered to buy a $50 SSL certificate and install it, are you *really* trustworthy?

      It's not only the cost of a certificate, which StartSSL provides without charge to individuals. It's also the cost of a dedicated IPv4 address if you want to reach people still using Android 2 or Windows XP. A lot of entry-level hosting packages use name-based virtual hosting, and serving HTTPS over name-based virtual hosting requires the TLS stack to support Server Name Indication (SNI). Android Browser didn't gain support for SNI until Honeycomb (3.x) on tablets and ICS (4.0) on phones, and Internet Explorer didn't gain support for SNI until Windows Vista.
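
      To make the SNI mechanism concrete, here is a minimal client-side sketch in Python (the hostname is just a placeholder): the server_hostname argument is what gets sent as SNI, which is how one IP address can serve certificates for many name-based virtual hosts.

        # Minimal sketch: the hostname passed as server_hostname is sent in the
        # TLS handshake (SNI), so the server can pick the right certificate even
        # when many sites share one IP address.
        import socket
        import ssl

        ctx = ssl.create_default_context()
        with socket.create_connection(("example.com", 443)) as sock:
            with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
                print(tls.version())                 # negotiated TLS version
                print(tls.getpeercert()["subject"])  # cert chosen for that hostname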

    • Re:It's about time! (Score:4, Interesting)

      by WaffleMonster ( 969671 ) on Thursday August 07, 2014 @01:06PM (#47624553)

      Similarly, putting out the modicum of effort to perform basic security like SSL is a signal that the website is reputable. I mean, if you can't be bothered to buy a $50 SSL certificate and install it, are you *really* trustworthy?

      LOL and here I thought all this time the Internet was supposed to reduce costs and barriers to competition... yet here we go "the higher the fewer".

      When you're making the big bucks off Google by operating industrial-scale link farms, $50/year is a small price to pay for success.

      Someone please remind me again why we are even contemplating enriching the clusterfuck that is the CA industry which sees no problem with use of completely automated systems and non-existent documentation requirements prior to issuing certificates?

      • When you're making the big bucks off Google by operating industrial scale link farms $50/year is a small price to pay for success.

        Is it really? Industrial-scale link farming requires thousands of sites. At $50 each, that's going to start chewing into your profit margins.

        • They're not $50/ea though. ssls.com sells Comodo PositiveSSL for $8.95/yr. If they set up as a reseller and do enough volume, the price drops to $6.15/yr.

          If they aren't making enough to cover those costs (or even the $50), their link farms really aren't worth having.

    • Similarly, putting out the modicum of effort to perform basic security like SSL is a signal that the website is reputable. I mean, if you can't be bothered to buy a $50 SSL certificate and install it, are you *really* trustworthy?

      If you are relying on an SSL certificate to determine the trustworthiness of a site, you are doing it wrong. SSL certificates are cheap and provide no additional trustworthiness to the site (unless it's an ecommerce site, which is a small part of the web). I would prefer sites that accept a username and password to use SSL, but I am okay with them not having it too. I would be a little bit worried about my password leaking when I am on public wifi. So those sites get a random password and a throwaway

    • by Ichijo ( 607641 )

      Or if you can't be bothered to write compliant HTML. Oops, Google fails [w3.org].

      Sites with accessibility issues, such as content that can only be accessed with JavaScript enabled, should also be deprioritized.

  • by Anonymous Coward

    Thanks to Google for making the web a little bit more secure by promoting secure websites!
    Thanks to the NSA for tapping the web so blindly and boldly that we had to react!

    If the NSA had not been so bold and had tapped only those who were under suspicion of bad behavior, the status quo would have been kept. Now everyone's privacy is a little bit more secure, and the NSA will have a slightly harder time managing MITM attacks on every netizen.

    An EU citizen who likes their privacy.

  • Cat blog (Score:4, Insightful)

    by ZipK ( 1051658 ) on Thursday August 07, 2014 @11:35AM (#47623501)
    So my cat picture blog will rank lower than a competitor's SSL encrypted cat picture blog, even though neither of us require you to log in or even prove you are a cat?
    • by tepples ( 727027 )
      You need to log in to your cat picture blog to prove to the blog software that you have privileges to update your cat picture blog.
    • by satuon ( 1822492 )

      Yes, for news and such it doesn't make that much sense. Still, HTTPS would at least prevent your ISP from monitoring your browsing activity.

      • Re:Cat blog (Score:5, Informative)

        by Cyberdyne ( 104305 ) * on Thursday August 07, 2014 @12:09PM (#47623877) Journal

        Still, HTTPS would at least prevent your ISP from monitoring your browsing activity.

        That's part of it - a valuable enough part in itself, IMO; at least one UK ISP, TalkTalk, has started duplicating HTTP requests made by their customers [fluidinfo.com]: so, if you request http://example.com/stuff [example.com] on one of their lines, 30 seconds later they'll go and request the same URL themselves for monitoring purposes. Obviously, enabling SSL prevents this kind of gratuitous stupidity - and the previous incarnation of such snooping, Phorm [wikipedia.org]. If enough websites enable SSL, ISPs will no longer have the ability to monitor customer behavior that closely; all they will see are SSL flows to and from IP addresses, and whatever DNS queries you make to their servers, if any. (Use encrypted connections to OpenDNS or similar, and your ISP will only ever see IP addresses and traffic volume - exactly as it should be, IMO!)
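
        As a rough sketch of the encrypted-DNS idea, here is what a lookup over HTTPS can look like in Python (this uses Google's public DNS-over-HTTPS JSON endpoint purely for illustration; OpenDNS and others offer comparable services, and the domain queried is a placeholder):

          # The DNS query travels inside an ordinary HTTPS request, so the ISP
          # sees only an encrypted connection to the resolver, not the query.
          import requests

          resp = requests.get(
              "https://dns.google/resolve",
              params={"name": "example.com", "type": "A"},
              timeout=5,
          )
          for answer in resp.json().get("Answer", []):
              print(answer["name"], answer["data"])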

      • Re:Cat blog (Score:5, Informative)

        by IamTheRealMike ( 537420 ) on Thursday August 07, 2014 @01:38PM (#47624851)

        Yes, for news and such it doesn't make that much sense. Still, HTTPS would at least prevent your ISP from monitoring your browsing activity.

        It's actually a lot more than that. HTTPS isn't just about protecting passwords anymore, not post Snowden.

        Let us recall one of the more interesting things we learned about SSL via the NSA leaks: the Five Eyes countries apparently have not broken SSL, yet despite that, the internet is still not capable of stopping them. The reason is a system they've built called QUANTUM [wired.com].

        QUANTUM is a series of systems that work together. Imagine it like being a giant set of guard towers on the internet backbone. QUANTUM is called that because it's based on deep packet inspection and insertion. The first part is a massive set of DPI devices that trawl unencrypted internet traffic passing through intercept points. These DPI devices can be configured by NSA/GCHQ analysts to look for selectors - personal identifiers like email addresses, IP addresses, cookies and so on. QUANTUM does not run on every internet link and cannot see through encrypted traffic, but that doesn't matter: it's like a searchlight crawling the grounds of a prison at night. It doesn't matter that it can't light up everywhere simultaneously - once tasked it will keep searching until it finds you. Given enough time and good selectors, it will always find you, simply because the average internet user makes many different unencrypted connections to many different websites.

        Once QUANTUM locates an un-SSLd traffic stream that matches your selectors, the next step begins; this is called QUANTUM INSERT. You see, these DPI devices are not only capable of reading traffic but also of injecting packets directly onto the backbone. This allows them to race legitimate answers from the real servers and redirect the victim to an entirely different server (this is probably based on racing DNS lookups, although I think the leaked docs were fuzzy on this aspect). These races are called "shots" and, interestingly, they don't always succeed - sometimes the NSA is slower than the real server. But QUANTUM keeps trying, and eventually you end up connected to this new FOXACID server, which then proceeds to act as an HTTP proxy for the real request and injects an exploit kit. That then pwns your system such that the NSA can now see all your encrypted traffic, along with turning on your microphone and so on.

        An observant reader will notice something very important about the above description. The longer you can stay in the SSLd web, the longer it will take for QUANTUM to hack you. That means you directly benefit from a website being SSLd even if all it contains is cat pictures and you don't even log in. Once QUANTUM has figured out your IP address, any non-SSLd HTTP connection is a useful foothold.

    • by RobinH ( 124750 )
      HTTPS isn't about logging in, it's about encrypting the data between the server and the client with a one-time key. So it isn't about proving you're a cat, it's about preventing an eavesdropper from knowing which cat pictures you looked at (they still know you went to a cat picture blog).
    • This is not just about protecting login credentials. The idea is that if your cat blog is ranked highly, many people will click on it. For sites like that, a DNS or other redirection hack allows me to impersonate your site with some drive by or otherwise downloadable malware.

      TLS is about trusting the site you're connecting to to actually be the site you think it is. So if your cat blog had a valid TLS certificate, then the impersonating site would need to obtain a fake certificate to avoid the client displaying a warning.

  • by Cthefuture ( 665326 ) on Thursday August 07, 2014 @11:42AM (#47623569)

    I have no technical problem switching every website/server I have to SSL but the actual problem is the price of all those SSL certs. Most of my sites are just hobby type sites that I run for my own enjoyment and to benefit others (quite a few "others" I should mention; some of my sites are very popular). However, I don't make any money off these, in fact it already costs me money to run them.

    Now you want me to add SSL so that people can still find my relevant and useful information? Well, OK, but how the hell am I supposed to pay for it? SSL server certs are expensive. The whole thing is a scam to make the few "official" CAs rich. How about some sort of official public service that can hand out server certs for every registered domain? Every domain should come with an unlimited supply of SSL certs, or at least a wildcard cert and a renewal service, free of charge.

    • StartSSL or DANE (Score:3, Informative)

      by tepples ( 727027 )

      How about some sort of official public service that can hand out server certs of every registered domain?

      You mean like StartSSL? Or what about DANE [wikipedia.org], which stores TLS certificates in DNSSEC?
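
      For a concrete idea of what a DANE lookup involves, here is a rough sketch using the third-party dnspython package (the domain is a placeholder and would need to actually publish TLSA records for this to return anything):

        # DANE publishes certificate info in TLSA records at _port._proto.host
        # (RFC 6698); a DNSSEC-validating resolver is what makes it trustworthy.
        import dns.resolver

        answers = dns.resolver.resolve("_443._tcp.example.com", "TLSA")
        for rdata in answers:
            print(rdata.usage, rdata.selector, rdata.mtype, rdata.cert.hex())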

      • Yeah, none of those work in any popular browser out of the box.

        • StartSSL and SNI work out of the box for the majority. Or if this site [pineight.com] gives you certificate errors, which browser are you using?
        • by heypete ( 60671 )

          Quite the contrary: StartSSL is accepted by every major browser and SSL/TLS library, and has been for years.

          Well-known sites, like EFF.org [eff.org], LibreOffice [libreoffice.org], and others use StartSSL-issued certs and don't have any issues. Sure, they're not Google-sized sites, but they're fairly major.

      • You mean like StartSSL?

        Hardly an official service, just a commercial CA that hands out freebies to some but not all sites that ask for them, and puts technical restrictions on those freebies which push people to either buy the commercial products or spend more on hosting (do I pay for n extra IPv4 addresses, or do I pay for a wildcard cert?).

        Or what about DANE [wikipedia.org], which stores TLS certificates in DNSSEC?

        Sadly not implemented anywhere near widely enough to be useful.

    • SSL certs are not expensive. You can get them for free (as pointed out) or for perhaps $20 per year. Your hosting costs are almost certainly higher than that.

    • by RobinH ( 124750 ) on Thursday August 07, 2014 @02:29PM (#47625363) Homepage
      Agreed, if Google wants to do this, maybe they should also become a free Certificate Authority. Wouldn't that tick off the Verisigns of the world...
    • by rdnetto ( 955205 )

      So far, all Google has said is that they will boost sites which use HTTPS - as far as I can tell, they haven't said anything about requiring the use of a trusted CA.
      Self-signed certs are free, and just as effective as (if not more effective than) the paid ones if your goal is to prevent eavesdropping rather than to verify the identity of an unknown server. (Known servers can be reasonably expected to use the same certificate as last time, or at least the same CA.)
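
      As an illustration of how "free" that really is, a self-signed cert can be generated in a few lines with the Python cryptography package (a sketch only; the hostname is a placeholder):

        # Sketch: generate an RSA key and a one-year self-signed certificate.
        import datetime
        from cryptography import x509
        from cryptography.x509.oid import NameOID
        from cryptography.hazmat.primitives import hashes, serialization
        from cryptography.hazmat.primitives.asymmetric import rsa

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")])
        cert = (
            x509.CertificateBuilder()
            .subject_name(name)
            .issuer_name(name)  # self-signed: issuer == subject
            .public_key(key.public_key())
            .serial_number(x509.random_serial_number())
            .not_valid_before(datetime.datetime.utcnow())
            .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
            .sign(key, hashes.SHA256())
        )
        with open("cert.pem", "wb") as f:
            f.write(cert.public_bytes(serialization.Encoding.PEM))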

      Given that the centralised CA model seems to have largely fail

      • The entire point of a CA is trust. Using a non-trusted CA would actually be a step backwards. Even worse would be convincing people that manually installing a cert for a random website is a good idea.

        Besides, I do believe that every single major browser now includes dire warnings if you go to a site with a cert from a non-trusted source.

        Certs are cheap. A quick Googling reveals a number of options [about.com] for under $50/year.

        • by rdnetto ( 955205 )

          The entire point of a CA is trust.

          Agreed. But SSL is about encryption - authentication is merely an optional extra (if it weren't, self-signed certs wouldn't even be an option).
          No intelligent person trusts the majority of websites, but they may still have valid reasons for not wanting their browsing habits eavesdropped upon.

          Using a non-trusted CA would actually be a step backwards.

          That depends on your priorities - on whether authentication or privacy is more important to you. Quite frankly, I find it hard to understand how encryption without authentication is worse than no encryption at all.

          Even worse would be convincing people that manually installing a cert for a random website is a good idea.

          Besides, I do believe that every single major browser now includes dire warnings if you go to a site with a cert from a non-trusted source.

          Fr

  • It's ironic that I'm hearing about this story on Slashdot, a site that has so far refused any sort of security. Good luck with your page rank, Slashdot.

  • Isn't it unfair to rank content based on encryption rather than relevance? What do you say?

    I myself would promote HTTPS sites with a golden lock symbol and the text "This site transmits your data confidentially".

  • If it's just an info site or something and you don't submit any confidential information, why do you need HTTPS? If I own a pizza shop, my website is all GET requests, and I'm not worried about third parties seeing what's on my menu, so why should I have to buy an SSL certificate? This seems like overkill to me.

    • by smartr ( 1035324 )

      Hear, hear! Sure, encryption is great and has its uses... but it also comes at the cost of processing, configuration, maintenance, and low-cost third-party providers. GoDaddy is about to get a shitload of extra customers. When the products in the market are comparable, the well-known low-cost one is frequently the winner. Thanks, Google.

  • http://www.arthur-schiwon.de/w... [arthur-schiwon.de]

    Google forces the weakest, shittiest, broken RC4 crypto and calls it secure.

  • My site uses regular http for the "brochure" like main page and info pages (e.g. FAQs, how-tos), and uses https for the login pages and software-as-a-service web-app pages.

    Is there something wrong, conceptually, with doing it that way?

    Is that hybrid approach going to lower my ranking?

    Not sure why one would go to https (and more intensive server-side processing) on the brochure and FAQ type pages.

    • Really, it's not more intensive to use https. There are lots of people who have analyzed the difference.

      A few more packets are sent. It's really trivial. While it is measurable, it can be recovered by removing one little picture, and/or compressing one of those pictures.

      A trivial amount of CPU time is taken. Most of the measurements saying it was significant were from when CPUs were single-core 200 MHz or less and memory was measured in MB rather than GB.

      I've been offering or forcing users to SSL, depending
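
      For what it's worth, "forcing users to SSL" can be as small as a redirect; a minimal sketch with a hypothetical Flask app (not anyone's actual site) looks like this:

        # Redirect any plain-HTTP request to the same URL over https://.
        from flask import Flask, redirect, request

        app = Flask(__name__)

        @app.before_request
        def force_https():
            if not request.is_secure:
                return redirect(request.url.replace("http://", "https://", 1), code=301)

        @app.route("/")
        def index():
            return "Served over HTTPS"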
