Reader AmiMoJo writes: In comments submitted to a U.S. Copyright Office consultation, Google has given the DMCA a vote of support, despite widespread abuse. Noting that the law allows for innovation and agreements with content creators, Google says that 99.95% of URLs it was asked to take down last month didn't even exist in its search indexes. "For example, in January 2017, the most prolific submitter submitted notices that Google honored for 16,457,433 URLs. But on further inspection, 16,450,129 (99.97%) of those URLs were not in our search index in the first place."
Google and Microsoft pledged on Monday to crack down on sites hosting pirated content that show up on their search engines. In what is being called a first of its kind agreement, Google and Microsoft's Bing will demote U.K. search results of copyright infringing websites. From a report on The Telegraph: The search engine operators have signed up to a clampdown that will see the UK's copyright watchdog monitor the search results they provide for unlawful websites. The agreement follows years of campaigning by record labels and film studios, which have accused Google and Microsoft of turning a blind eye to piracy and dragging their feet over measures to protect copyright online. Under a new voluntary code, the tech giants have committed to demote websites that have repeatedly been served with copyright infringement notices, so that they do not appear on the first page for common searches.
An anonymous reader writes: "Attacks on WordPress sites using a vulnerability in the REST API, patched in WordPress version 4.7.2, have intensified over the past two days, as attackers have now defaced over 1.5 million pages, spread across 39,000 unique domains," reports BleepingComputer. "Initial attacks using the WordPress REST API flaw were reported on Monday by web security firm Sucuri, who said four groups of attackers defaced over 67,000 pages. The number grew to over 100,000 pages the next day, but according to a report from fellow web security firm WordFence, these numbers have skyrocketed today to over 1.5 million pages, as there are now 20 hacking groups involved in a defacement turf war." Making matters worse, over the weekend Google's Search Console service, formerly known as Google Webmaster Tools, was sending out security alerts to people it shouldn't have. Google attempted to send security alerts to all owners of WordPress 4.7.0 and 4.7.1 websites (vulnerable to the REST API flaw), but some emails reached WordPress 4.7.2 owners, some of whom misinterpreted the email and panicked, fearing their sites might lose search engine ranking.
An anonymous reader quotes a report from Reuters: The U.S. House of Representatives voted on Monday to require law enforcement authorities to obtain a search warrant before seeking old emails from technology companies, a win for privacy advocates fearful the Trump administration may work to expand government surveillance powers. The House passed the measure by a voice vote. But the legislation was expected to encounter resistance in the Senate, where it failed to advance last year amid opposition by a handful of Republican lawmakers after the House passed it unanimously. Currently, agencies such as the Justice Department and the Securities and Exchange Commission only need a subpoena to seek such data from a service provider.
tomhath quotes a report from TechCrunch: Mark Zuckerberg and Priscilla Chan's $45 billion philanthropy organization is making its first acquisition in order to make it easier for scientists to search, read and tie together more than 26 million science research papers. The Chan Zuckerberg Initiative is acquiring Meta, an AI-powered research search engine startup, and will make its tool free to all in a few months after enhancing the product. Meta's AI recognizes authors and citations between papers so it can surface the most important research instead of just what has the best SEO. It also provides free full-text access to 18,000 journals and literature sources. Meta co-founder and CEO Sam Molyneux writes that "Going forward, our intent is not to profit from Meta's data and capabilities; instead we aim to ensure they get to those who need them most, across sectors and as quickly as possible, for the benefit of the world."
An anonymous reader shares a BetaNews article: A lot of people are more privacy aware than they have been in the past, and are wary of entrusting everything they search for to Google. That's where privacy-focused sites like DuckDuckGo come in. Its growth since it launched eight years ago has been nothing short of staggering, with the number of searches skyrocketing since 2013, when Edward Snowden first revealed how the US government was spying on its people. The search site says it has to date served up over 10 billion anonymous searches, with 4 billion of those occurring in the last year alone, and the company says it is growing faster than ever. On January 10, 2017, the site received in excess of 14 million private searches.
Ads for Google and related companies were found in the top spot in 91% of 25,000 searches for product-related terms, according to a report in The Wall Street Journal. For example, a search for "phones" would produce ads for Google Pixel, which the company launched last year. From the report: Similar results were found for searches on "watches" or "smoke detector," which produced ads for Android smartwatches and Nest devices, respectively. In a statement, Google said its marketing programs are "carefully designed" not to impact outside advertisers. "All our bids are excluded from the auction when determining the price paid by other advertisers, and we have strict rules and processes -- set to tougher levels than our customers -- to govern the use of our own ads products." The auction is the process that decides which ads appear when users type in certain search queries. Strategies such as using relevant keywords give advertisers a better shot at their ad appearing on a search results page.
Kyle Schreiber has raised some issues about Google's AMP (Accelerated Mobile Pages), an open source project the company unveiled in 2015 to speed up content on mobile devices. He writes on his blog: The largest complaint by far is that the URLs for AMP links differ from the canonical URLs for the same content, making sharing difficult. The current URLs are a mess. They all begin with some form of https://www.google.com/amp/ before showing a URL to the AMP version of the site. There is currently no way to find the canonical link to the page without guessing what the original URL is. This usually involves removing either a .amp or ?amp=1 from the URL to get to the actual page. Make no mistake. AMP is about lock-in for Google. AMP is meant to keep publishers tied to Google. Clicking on an AMP link feels like you never even leave the search page, and links to AMP content are displayed prominently in Google's news carousel. This is Google's response to similar formats from Facebook and Apple, both of which are designed to keep users within their respective ecosystems. Google's implementation, however, is broader and further-reaching: it lives on the open web rather than being confined to an app like Facebook's or Apple's.
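The guesswork Schreiber describes can be sketched as a short Python heuristic. This is purely illustrative: the prefix and the .amp/?amp=1 patterns come from his description above, and there is no guaranteed reverse mapping from an AMP URL to the canonical page.

```python
from urllib.parse import urlparse, urlencode, parse_qsl

def guess_canonical(amp_url: str) -> str:
    """Heuristically recover the canonical URL from a Google AMP link.

    Mirrors the manual steps described above: strip the
    google.com/amp/ prefix, then drop a trailing .amp path
    segment or an amp=1 query parameter. A guess, not a
    guaranteed mapping.
    """
    prefix = "https://www.google.com/amp/"
    if amp_url.startswith(prefix):
        rest = amp_url[len(prefix):]
        # The embedded URL usually lacks a scheme; assume https.
        if not rest.startswith(("http://", "https://")):
            rest = "https://" + rest
    else:
        rest = amp_url

    parsed = urlparse(rest)
    path = parsed.path
    if path.endswith(".amp"):
        path = path[:-len(".amp")]
    # Drop an amp=1 query parameter if present, keeping the rest.
    query = urlencode([(k, v) for k, v in parse_qsl(parsed.query) if k != "amp"])
    return parsed._replace(path=path, query=query).geturl()
```

For example, `guess_canonical("https://www.google.com/amp/example.com/story.amp")` yields `https://example.com/story` -- but since the heuristic is a guess, the result still has to be checked against the publisher's actual site.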
Google is rolling out an update for its Android app that makes it easier to search the web with an inconsistent internet connection. Users can make searches when offline and the Google app will store them, delivering the results later (with an optional notification) when the device gets a signal again. From a report: As Google product manager Shekhar Sharad writes in a blog post: "So the next time you lose service, feel free to queue up your searches, put your phone away and carry on with your day. The Google app will work behind-the-scenes to detect when a connection is available again and deliver your search results once completed."
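The behavior described above is a classic deferred-work queue. A toy Python model of the idea (this is an illustration of the pattern, not how the Google app is actually implemented; the class and method names are invented):

```python
import queue

class OfflineSearchQueue:
    """Toy model of deferred search: queries made while offline are
    stored and executed once connectivity returns."""

    def __init__(self, search_fn):
        self._pending = queue.Queue()  # FIFO of queries awaiting a connection
        self._search = search_fn       # function that actually runs a search

    def submit(self, query, online):
        """Run the search immediately if online, otherwise queue it."""
        if online:
            return self._search(query)
        self._pending.put(query)
        return None

    def flush(self):
        """Call when a connection is detected; runs all queued searches."""
        results = []
        while not self._pending.empty():
            results.append(self._search(self._pending.get()))
        return results
```

In the real app the "connection detected" trigger would come from the OS's connectivity callbacks rather than an explicit `flush()` call, but the store-then-replay structure is the same.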
msm1267 writes: Last week Box.com moved quickly and quietly to block search engines from indexing links to confidential data owned by its users. The move came after security researcher Markus Neis surfaced private data belonging to a number of Fortune 500 companies via Google, Bing and other search engines. Box.com said it's a classic case of users accidentally oversharing. Neis isn't convinced, and says Box.com's so-called Collaboration links shouldn't have been indexed in the first place. Box.com has since blocked access to what security researchers say was a treasure trove of confidential data and fodder for phishing scams.
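The standard way to keep shared-link pages out of search indexes is a robots.txt disallow rule. A minimal sketch (the paths here are hypothetical, not Box.com's actual URL layout):

```text
User-agent: *
Disallow: /s/
Disallow: /shared/
```

Note that robots.txt only stops future crawling; it does not remove URLs already in an index, which is why a `noindex` directive (e.g. via an X-Robots-Tag response header) is generally the more reliable way to get already-crawled pages deindexed.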
In the past few years, Google has used its so-called "knowledge graph" to make search results far more useful than just a list of links -- you can get lots of info on a variety of topics right in Google without having to click on any search results. The latest addition to Google search is something foodies should take note of. Now, when you search for food on mobile, you'll see a carousel of recipes at the top of the results page. From a report on Engadget: Google also added some filters to those recipe results -- right below the search bar are additional suggestions you can use to refine your results. Searching for "fried chicken" gave me the option to add "oven-fried," "buttermilk," and "southern fried" filters to narrow down the recipes. You can also tap "view all" to move out of the standard search page and see bigger, more detailed recipe cards that show a picture and quick preview of the recipe.
Google says it is "thinking deeply" about ways to improve search, after criticism over how some results -- including ones discussing the Holocaust -- were ranked. From a report on BBC: Searching for "did the Holocaust happen?" returned a top result that claimed it did not, as Guardian journalist Carole Cadwalladr reported. Now, the ranking has changed for US users. The page -- from white supremacist site Stormfront -- remains top in the UK. "This is a really challenging problem, and something we're thinking deeply about in terms of how we can do a better job," said a Google spokesman. "Search is a reflection of the content that exists on the web. The fact that hate sites may appear in search results in no way means that Google endorses these views."
Google has confirmed to Search Engine Land that it is testing a feature allowing users to rate movies or TV shows directly in the search results interface. "We're currently experimenting with the feature but have nothing to announce at this time," a Google spokesperson said. TechCrunch reports: Unlike other movie and TV rating platforms, Google's feature is not on a scale from one to five but instead offers a binary choice: like or dislike. Information about weather, ticket purchasing options and more used to be available on unique, individual websites. Today, however, Google has incorporated this information and functionality into the search results layer of its own service. Within the movie ratings feature, users will also be able to see the Rotten Tomatoes and IMDb ratings for the title, as they always have. You can view a screenshot of the rating system here.
Google today confirmed that it is removing the "In the news" section from the top of desktop search and replacing it with a carousel of "Top stories," similar to what exists on mobile. From a report on Business Insider: This move had been planned for quite some time and is being rolled out globally, according to Google. The removal of the word "news" will, hopefully, help draw a sharper line between Google's human-vetted Google News product and its main search product. Last month, Google faced scrutiny when one of its top results for "final election count" was fake news. The top result in Google Search's "In the news" section was a WordPress blog named "70 News," which falsely claimed Trump won the popular vote by a margin of almost 700,000. (He didn't.) Google's search results, in contrast to Google News, are not assessed for "truth."
Copyright holders asked Google to remove more than 1,000,000,000 allegedly infringing links from its search engine over the past twelve months, TorrentFreak reports. According to stats in Google's Transparency Report for the past year, Google was asked to remove over one billion links -- 1,007,741,143 in total. From the article: More than 90 percent of the links, 908,237,861, were in fact removed. The rest of the reported links were rejected because they were invalid, not infringing, or duplicates of earlier requests. In total, Google has now processed just over two billion allegedly infringing URLs from 945,000 different domains. That the second billion took only a year, compared to several years for the first, shows how rapidly the volume of takedown requests is expanding. At the current rate, another billion will be added by the end of next summer. Most requests -- over 50 million -- were sent in for the website 4shared.com. However, according to the site's operators, many of the reported URLs point to the same files, inflating the apparent volume of infringing content.
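The removal share implied by those two figures is easy to check (numbers taken directly from the report above):

```python
# Sanity-check the takedown figures reported above.
requested = 1_007_741_143  # URLs copyright holders asked Google to remove
removed = 908_237_861      # URLs Google actually removed from its index

print(f"{removed / requested:.1%}")  # → 90.1%
```

That 90.1% figure is consistent with the "more than 90 percent" stated in the report, with the remaining ~10% of requests rejected as invalid, non-infringing, or duplicates.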