Privacy

FTC Says 23andMe Purchaser Must Uphold Existing Privacy Policy For Data Handling (therecord.media) 28

The FTC has warned that any buyer of 23andMe must honor the company's current privacy policy, which ensures consumers retain control over their genetic data and can delete it at will. FTC Chair Andrew Ferguson emphasized that such promises must be upheld, given the uniquely sensitive and immutable nature of genetic information. The Record reports: The letter, sent to the DOJ's United States Trustee Program, highlights several assurances 23andMe makes in its privacy policy, including that users are in control of their data and can determine how and for what purposes it is used. The company also gives users the ability to delete their data at will, the letter says, arguing that 23andMe has made "direct representations" to consumers about how it uses, shares and safeguards their personal information, including in the case of bankruptcy.

Pointing to statements that the company's leadership has made asserting that user data should be considered an asset, Ferguson highlighted that 23andMe's privacy statement tells users it does not share their data with insurers, employers, public databases or law enforcement without a court order, search warrant or subpoena. It also promises consumers that it only shares their personal data in cases where it is needed to provide services, Ferguson added. The genetic testing and ancestry company is explicit that its data protection guidelines apply to new entities it may be sold or transferred to, Ferguson said.

Privacy

UK's GCHQ Intern Transferred Top Secret Files To His Phone (bbc.co.uk) 51

Bruce66423 shares a report from the BBC: A former GCHQ intern has admitted risking national security by taking top secret data home with him on his mobile phone. Hasaan Arshad, 25, pleaded guilty to an offence under the Computer Misuse Act on what would have been the first day of his trial at the Old Bailey in London. The charge related to committing an unauthorised act which risked damaging national security.

Arshad, from Rochdale in Greater Manchester, is said to have transferred sensitive data from a secure computer to his phone, which he had taken into a top secret area of GCHQ on 24 August 2022. [...] The court heard that Arshad took his work mobile into a top secret GCHQ area and connected it to a workstation. He then transferred sensitive data from a secure, top secret computer to the phone before taking it home, it was claimed. Arshad then transferred the data from the phone to a hard drive connected to his personal home computer.
"Seriously? What on earth was the UK's equivalent of the NSA doing allowing its hardware to carry out such a transfer?" questions Bruce66423.
Biotech

Open Source Genetic Database Shuts Down To Protect Users From 'Authoritarian Governments' (404media.co) 28

An anonymous reader quotes a report from 404 Media: The creator of an open source genetic database is shutting it down and deleting all of its data because he has come to believe that its existence is dangerous with "a rise in far-right and other authoritarian governments" in the United States and elsewhere. "The largest use case for DTC genetic data was not biomedical research or research in big pharma," Bastian Greshake Tzovaras, the founder of OpenSNP, wrote in a blog post. "Instead, the transformative impact of the data came to fruition among law enforcement agencies, who have put the genealogical properties of genetic data to use."

OpenSNP has collected roughly 7,500 genomes over the last 14 years, primarily by allowing people to voluntarily submit their own genetic information they have downloaded from 23andMe. With the bankruptcy of 23andMe, increased interest in genetic data by law enforcement, and the return of Donald Trump and rise of authoritarian governments worldwide, Greshake Tzovaras told 404 Media he no longer believes it is ethical to run the database. "I've been thinking about it since 23andMe was on the verge of bankruptcy and been really considering it since the U.S. election. It definitely is really bad over there [in the United States]," Greshake Tzovaras told 404 Media. "I am quite relieved to have made the decision and come to a conclusion. It's been weighing on my mind for a long time."

Greshake Tzovaras said that he is proud of the OpenSNP project, but that, in a world where scientific data is being censored and deleted and where the Trump administration has focused on criminalizing immigrants and trans people, he now believes that the most responsible thing to do is to delete the data and shut down the project. "Most people in OpenSNP may not be at particular risk right now, but there are people from vulnerable populations in here as well," Greshake Tzovaras said. "Thinking about gender representation, minorities, sexual orientation -- 23andMe has been working on the whole 'gay gene' thing, it's conceivable that this would at some point in the future become an issue."
"Across the globe there is a rise in far-right and other authoritarian governments. While they are cracking down on free and open societies, they are also dedicated to replacing scientific thought and reasoning with pseudoscience across disciplines," Greshake Tzovaras wrote. "The risk/benefit calculus of providing free & open access to individual genetic data in 2025 is very different compared to 14 years ago. And so, sunsetting openSNP -- along with deleting the data stored within it -- feels like it is the most responsible act of stewardship for these data today."

"The interesting thing to me is there are data preservation efforts in the U.S. because the government is deleting scientific data that they don't like. This is approaching that same problem from a different direction," he added. "We need to protect the people in this database. I am supportive of preserving scientific data and knowledge, but the data comes second -- the people come first. We prefer deleting the data."
Apple

Apple Fined $162 Million for App Privacy System That Harms Developers (yahoo.com) 18

France's competition authority has fined Apple 150 million euros ($162 million) for abusing its market dominance through its App Tracking Transparency system, ruling the privacy initiative unfairly disadvantages app developers. The watchdog determined that requiring third-party developers to use two pop-ups for tracking permissions while Apple's own apps need just one tap creates an "excessively complex" process that particularly harms smaller publishers lacking sufficient proprietary data for alternative targeting.

The authority acknowledged the system's privacy benefits, but concluded the framework is "neither necessary nor proportionate" to its data protection goals. The regulator is not requiring Apple to modify the system, only imposing the fine for past practices. Apple must display a summary of the decision on its website for seven days.
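For context on the mechanism at issue: under App Tracking Transparency, a third-party iOS app cannot read the advertising identifier (IDFA) until the user approves Apple's system pop-up, which the app triggers through the AppTrackingTransparency framework. The sketch below shows that request flow under stated assumptions; the function name and logging are illustrative, and it does not attempt to reproduce the additional consent layer the French authority objected to.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch of the ATT request a third-party app must make before tracking.
// The app's Info.plist must also contain an NSUserTrackingUsageDescription string,
// which the system displays in its standard pop-up.
func requestTrackingPermission(completion: @escaping (Bool) -> Void) {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User allowed tracking: the IDFA is now readable.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
            completion(true)
        case .denied, .restricted, .notDetermined:
            // No consent: the IDFA is returned as all zeroes.
            completion(false)
        @unknown default:
            completion(false)
        }
    }
}
```

Apple's own apps do not go through this framework prompt, which is the asymmetry the French authority flagged.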
Privacy

FBI Raids Home of Prominent Computer Scientist Who Has Gone Incommunicado (arstechnica.com) 100

An anonymous reader shares a report: A prominent computer scientist who has spent 20 years publishing academic papers on cryptography, privacy, and cybersecurity has gone incommunicado, had his professor profile, email account, and phone number removed by his employer, Indiana University, and had his homes raided by the FBI. No one knows why.

Xiaofeng Wang has a long list of prestigious titles. He was the associate dean for research at Indiana University's Luddy School of Informatics, Computing and Engineering, a fellow at the Institute of Electrical and Electronics Engineers and the American Association for the Advancement of Science, and a tenured professor at Indiana University at Bloomington. According to his employer, he has served as principal investigator on research projects totaling nearly $23 million over his 21 years there.

He has also co-authored scores of academic papers on a diverse range of research fields, including cryptography, systems security, and data privacy, among them the protection of human genomic data.

Privacy

Nearly 1.5 Million Private Photos from Five Dating Apps Were Exposed Online (bbc.com) 32

"Researchers have discovered nearly 1.5 million pictures from specialist dating apps — many of which are explicit — being stored online without password protection," reports the BBC, "leaving them vulnerable to hackers and extortionists."

And the images weren't limited to those from profiles, the BBC learned from the ethical hacker who discovered the issue. "They included pictures which had been sent privately in messages, and even some which had been removed by moderators..." Anyone with the link was able to view the private photos from five platforms developed by M.A.D Mobile [including two kink/BDSM sites and two LGBT apps]... These services are used by an estimated 800,000 to 900,000 people.

M.A.D Mobile was first warned about the security flaw on 20th January but didn't take action until the BBC emailed on Friday. They have since fixed it but not said how it happened or why they failed to protect the sensitive images. Ethical hacker Aras Nazarovas from Cybernews first alerted the firm about the security hole after finding the location of the online storage used by the apps by analysing the code that powers the services...

None of the text content of private messages was found to be stored in this way and the images are not labelled with user names or real names, which would make crafting targeted attacks at users more complex.

In an email M.A.D Mobile said it was grateful to the researcher for uncovering the vulnerability in the apps to prevent a data breach from occurring. But there's no guarantee that Mr Nazarovas was the only hacker to have found the image stash.

"Mr Nazarovas and his team decided to raise the alarm on Thursday while the issue was still live as they were concerned the company was not doing anything to fix it..."
Privacy

Madison Square Garden Bans Fan After Surveillance System IDs Him as Critic of Its CEO (theverge.com) 99

An anonymous reader quotes a report from The Verge: A concert on Monday night at New York's Radio City Music Hall was a special occasion for Frank Miller: his parents' wedding anniversary. He didn't end up seeing the show -- and before he could even get past security, he was informed that he was in fact banned for life from the venue and all other properties owned by Madison Square Garden (MSG). After scanning his ticket and promptly being pulled aside by security, Miller was told by staff that he was barred from the MSG properties for an incident at the Garden in 2021. But Miller says he hasn't been to the venue in nearly two decades.

"They hand me a piece of paper letting me know that I've been added to a ban list," Miller says. "There's a trespass notice if I ever show up on any MSG property ever again," which includes venues like Radio City, the Beacon Theatre, the Sphere, and the Chicago Theatre. He was baffled at first. Then it dawned on him: this was probably about a T-shirt he designed years ago. MSG Entertainment won't say what happened with Miller or how he was picked out of the crowd, but he suspects he was identified via controversial facial recognition systems that the company deploys at its venues.

In 2017, 1990s New York Knicks star Charles Oakley was forcibly removed from his seat near Knicks owner and Madison Square Garden CEO James Dolan. The high-profile incident later spiraled into an ongoing legal battle. For Miller, Oakley was an "integral" part of the '90s Knicks, he says. With his background in graphic design, he made a shirt in the style of the old team logo that read, "Ban Dolan" -- a reference to the infamous scuffle. A few years later, in 2021, a friend of Miller's wore a Ban Dolan shirt to a Knicks game and was kicked out and banned from future events. That incident spawned ESPN segments and news articles and validated what many fans saw as pettiness on Dolan and MSG's part for going after individual fans who criticized team ownership.
"Frank Miller Jr. made threats against an MSG executive on social media and produced and sold merchandise that was offensive in nature," Mikyl Cordova, executive vice president of communications and marketing for the company, said in an emailed statement. "His behavior was disrespectful and disruptive and in violation of our code of conduct."

Miller responded to the ban, saying: "I just found it comical, until I was told that my mom was crying [in the lobby]. I was like, 'Oh man, I ruined their anniversary with my shit talk on the internet.' Memes are powerful, and so is the surveillance state. It's something that we all have to be aware of -- the panopticon. We're [being] surveilled at all times, and it's always framed as a safety thing, when rarely is that the case. It's more of a deterrent and a fear tactic to try to keep people in line."
Television

Smart TVs Are Employing Screen Monitoring Tech To Harvest User Data (vox.com) 44

Smart TV platforms are increasingly monitoring what appears on users' screens through Automatic Content Recognition (ACR) technology, building detailed viewer profiles for targeted advertising.

Roku, which transitioned from a hardware company to an advertising powerhouse, reported $3.5 billion in annual ad revenue for 2024 -- representing 85% of its total income. The company has aggressively acquired ACR-related firms, with Roku-owned technology winning an Emmy in 2023 for advancements in the field.

According to market research firm Antenna, 43% of all streaming subscriptions in the United States were ad-supported by late 2024, showing the industry's shift toward advertising-based models. Most users unknowingly consent to this monitoring when setting up their devices. Though consumers can technically disable ACR in their TV settings, doing so often restricts functionality.
Privacy

Again and Again, NSO Group's Customers Keep Getting Their Spyware Operations Caught (techcrunch.com) 8

An anonymous reader shares a report: Amnesty International published a new report this week detailing attempted hacks against two Serbian journalists, allegedly carried out with NSO Group's spyware Pegasus. The two journalists, who work for the Serbia-based Balkan Investigative Reporting Network (BIRN), received suspicious text messages including a link -- basically a phishing attack, according to the nonprofit. In one case, Amnesty said its researchers were able to click on the link in a safe environment and see that it led to a domain that they had previously identified as belonging to NSO Group's infrastructure.

"Amnesty International has spent years tracking NSO Group Pegasus spyware and how it has been used to target activists and journalists," Donncha O Cearbhaill, the head of Amnesty's Security Lab, told TechCrunch. "This technical research has allowed Amnesty to identify malicious websites used to deliver the Pegasus spyware, including the specific Pegasus domain used in this campaign."

To his point, security researchers like O Cearbhaill who have been keeping tabs on NSO's activities for years are now so good at spotting signs of the company's spyware that sometimes all researchers have to do is quickly look at a domain involved in an attack. In other words, NSO Group and its customers are losing their battle to stay in the shadows. "NSO has a basic problem: They are not as good at hiding as their customers think," John Scott-Railton, a senior researcher at The Citizen Lab, a human rights organization that has investigated spyware abuses since 2012, told TechCrunch.

Privacy

Oracle Customers Confirm Data Stolen In Alleged Cloud Breach Is Valid (bleepingcomputer.com) 20

An anonymous reader quotes a report from BleepingComputer: Despite Oracle denying a breach of its Oracle Cloud federated SSO login servers and the theft of account data for 6 million people, BleepingComputer has confirmed with multiple companies that associated data samples shared by the threat actor are valid. Last week, a person named 'rose87168' claimed to have breached Oracle Cloud servers and began selling the alleged authentication data and encrypted passwords of 6 million users. The threat actor also said that stolen SSO and LDAP passwords could be decrypted using the info in the stolen files and offered to share some of the data with anyone who could help recover them.

The threat actor released multiple text files consisting of a database, LDAP data, and a list of 140,621 domains for companies and government agencies that were allegedly impacted by the breach. It should be noted that some of the company domains look like tests, and there are multiple domains per company. In addition to the data, rose87168 shared an Archive.org URL with BleepingComputer for a text file hosted on the "login.us2.oraclecloud.com" server that contained their email address. This file indicates that the threat actor could create files on Oracle's server, suggesting an actual breach. However, Oracle has denied that it suffered a breach of Oracle Cloud and has refused to respond to any further questions about the incident.

"There has been no breach of Oracle Cloud. The published credentials are not for the Oracle Cloud. No Oracle Cloud customers experienced a breach or lost any data," the company told BleepingComputer last Friday. This denial, however, contradicts findings from BleepingComputer, which received additional samples of the leaked data from the threat actor and contacted the associated companies. Representatives from these companies, all who agreed to confirm the data under the promise of anonymity, confirmed the authenticity of the information. The companies stated that the associated LDAP display names, email addresses, given names, and other identifying information were all correct and belonged to them. The threat actor also shared emails with BleepingComputer, claiming to be part of an exchange between them and Oracle.

United Kingdom

UK's First Permanent Facial Recognition Cameras Installed (theregister.com) 55

The Metropolitan Police has confirmed its first permanent installation of live facial recognition (LFR) cameras is coming this summer and the location will be the South London suburb of Croydon. From a report: The two cameras will be installed in the city center in an effort to combat crime and will be attached to buildings and lamp posts on North End and London Road. According to the police they will only be turned on when officers are in the area and in a position to make an arrest if a criminal is spotted. The installation follows a two-year trial in the area where police vans fitted with the camera have been patrolling the streets matching passersby to its database of suspects or criminals, leading to hundreds of arrests. The Met claims the system can alert them in seconds if a wanted wrong'un is spotted, and if the person gets the all-clear, the image of their face will be deleted.
Encryption

Signal President Blasts WhatsApp's Privacy Claims (cybernews.com) 59

Signal president Meredith Whittaker challenged recent assertions by WhatsApp head Will Cathcart that minimal differences exist between the two messaging platforms' privacy protections. "We're amused to see WhatsApp stretching the limits of reality to claim that they are just like Signal," Whittaker said in a statement published Monday, responding to Cathcart's comments to Dutch journalists last week.

While WhatsApp licenses Signal's end-to-end encryption technology, Whittaker said that WhatsApp still collects substantial user metadata, including "location data, contact lists, when they send someone a message, when they stop, what users are in their group chats, their profile picture, and much more." Cathcart had previously stated that WhatsApp doesn't track users' communications or share contact information with other companies, claiming "we strongly believe in private communication."
Privacy

Signal Head Defends Messaging App's Security After US War Plan Leak (yahoo.com) 161

The president of Signal defended the messaging app's security on Wednesday after top Trump administration officials mistakenly included a journalist in an encrypted chatroom they used to discuss looming U.S. military action against Yemen's Houthis. From a report: Signal's Meredith Whittaker did not directly address the blunder, which Democratic lawmakers have said was a breach of U.S. national security. But she described the app as the "gold standard in private comms" in a post on X, which outlined Signal's security advantages over Meta's WhatsApp messaging app. "We're open source, nonprofit, and we develop and apply (end-to-end encryption) and privacy-preserving tech across our system to protect metadata and message contents," she said.
AI

Apple Says It'll Use Apple Maps Look Around Photos To Train AI (theverge.com) 11

An anonymous reader shares a report: Sometime earlier this month, Apple updated a section of its website that discloses how it collects and uses imagery for Apple Maps' Look Around feature, which is similar to Google Maps' Street View, as spotted by 9to5Mac. A newly added paragraph reveals that, beginning in March 2025, Apple will be using imagery and data collected during Look Around surveys to "train models powering Apple products and services, including models related to image recognition, creation, and enhancement."

Apple collects images and 3D data to enhance and improve Apple Maps using vehicles and backpacks (for pedestrian-only areas) equipped with cameras, sensors, and other equipment including iPhones and iPads. The company says that as part of its commitment to privacy, any images it captures that are published in the Look Around feature have faces and license plates blurred. Apple also says it will only use imagery with those details blurred out for training models. It does accept requests for those wanting their houses to also be blurred, but by default they are not.

Biotech

DNA of 15 Million People For Sale In 23andMe Bankruptcy (404media.co) 51

An anonymous reader quotes a report from 404 Media: 23andMe filed for Chapter 11 bankruptcy Sunday, leaving the fate of millions of people's genetic information up in the air as the company deals with the legal and financial fallout of not properly protecting that genetic information in the first place. The filing shows how dangerous it is to provide your DNA directly to a large, for-profit commercial genetic database; 23andMe is now looking for a buyer to pull it out of bankruptcy. 23andMe said in court documents viewed by 404 Media that since hackers obtained personal data about seven million of its customers in October 2023, including, in some cases "health-related information based upon the user's genetics," it has faced "over 50 class action and state court lawsuits," and that "approximately 35,000 claimants have initiated, filed, or threatened to commence arbitration claims against the company." It is seeking bankruptcy protection in part to simplify the fallout of these legal cases, and because it believes it may not have money to pay for the potential damages associated with these cases.

CEO and cofounder Anne Wojcicki announced she is leaving the company as part of this process. The company has the genetic data of more than 15 million customers. According to its Chapter 11 filing, 23andMe owes money to a host of pharmaceutical companies, pharmacies, artificial intelligence companies (including Aganitha AI and Coreweave), as well as health insurance companies and marketing companies.
Shortly before the filing, California Attorney General Rob Bonta issued an "urgent" alert to 23andMe customers: "Given 23andMe's reported financial distress, I remind Californians to consider invoking their rights and directing 23andMe to delete their data and destroy any samples of genetic material held by the company."

In a letter to customers Sunday, 23andMe said: "Your data remains protected. The Chapter 11 filing does not change how we store, manage, or protect customer data. Our users' privacy and data are important considerations in any transaction, and we remain committed to our users' privacy and to being transparent with our customers about how their data is managed." It added that any buyer will have to "comply with applicable law with respect to the treatment of customer data."

404 Media's Jason Koebler notes that "there's no way of knowing who is going to buy it, why they will be interested, and what will become of its millions of customers' DNA sequences. 23andMe has claimed over the years that it strongly resists law enforcement requests for information and that it takes customer security seriously. But the company has in recent years changed its terms of service, partnered with big pharmaceutical companies, and, of course, was hacked."
Google

Google Says It Might Have Deleted Your Maps Timeline Data (arstechnica.com) 14

Google has confirmed that a technical issue has permanently deleted location history data for numerous users of its Maps application, with no recovery possible for most affected customers. The problem emerged after Google transitioned its Timeline feature from cloud to on-device storage in 2024 to enhance privacy protections. Users began reporting missing historical location data on support forums and social media platforms in recent weeks. "This is the result of a technical issue and not user error or an intentional change," said a Google spokesperson. Only users who manually enabled encrypted cloud backups before the incident can recover their data, according to Google. The company began shifting location storage policies in 2023, initially stopping collection of sensitive location data including visits to abortion clinics and domestic violence shelters.
China

China Bans Compulsory Facial Recognition and Its Use in Private Spaces Like Hotel Rooms (theregister.com) 28

China's Cyberspace Administration and Ministry of Public Security have outlawed the use of facial recognition without consent. From a report: The two orgs last Friday published new rules on facial recognition and an explainer that spell out how orgs that want to use facial recognition must first conduct a "personal information protection impact assessment" that considers whether using the tech is necessary, impacts on individuals' privacy, and risks of data leakage. Organizations that decide to use facial recognition must encrypt biometric data, and audit the information security techniques and practices they use to protect facial scans. Chinese organizations that go through that process and decide they want to use facial recognition can only do so after securing individuals' consent. The rules also ban the use of facial recognition equipment in spaces such as hotel rooms, public bathrooms, public dressing rooms, and public toilets. The measures don't apply to researchers or to what machine translation of the rules describes as "algorithm training activities" -- suggesting images of citizens' faces are fair game when used to train AI models.
EU

Is WhatsApp Being Ditched for Signal in Dutch Higher Education? (dub.uu.nl) 42

For weeks Signal has been one of the three most-downloaded apps in the Netherlands, according to a local news site. And now "Higher education institutions in the Netherlands have been looking for an alternative," according to DUB (an independent news site for the Utrecht University community): Employees of the Utrecht University of Applied Sciences (HU) were recently advised to switch to Signal. Avans University of Applied Sciences has also been discussing a switch... The National Student Union is concerned about privacy. The subject was raised at last week's general meeting, as reported by chair Abdelkader Karbache, who said: "Our local unions want to switch to Signal or other open-source software."
Besides being open source, Signal is a non-commercial nonprofit, the article points out — though its proponents suggest there's another big difference. "HU argues that Signal keeps users' data private, unlike WhatsApp." Cybernews.com explains the concern: In an interview with the Dutch newspaper De Telegraaf, Meredith Whittaker [president of the Signal Foundation] discussed the pitfalls of WhatsApp. "WhatsApp collects metadata: who you send messages to, when, and how often. That's incredibly sensitive information," she says.... The only information [Signal] collects is the date an account was registered, the time when an account was last active, and hashed phone numbers... Information like profile name and the people a user communicates with is all encrypted... Metadata might sound harmless, but it couldn't be further from the truth. According to Whittaker, metadata is deadly. "As a former CIA director once said: 'We kill people based on metadata'."
WhatsApp's metadata also includes IP addresses, TechRadar noted last May: Other identifiable data such as your network details, the browser you use, ISP, and other identifiers linked to other Meta products (like Instagram and Facebook) associated with the same device or account are also collected... [Y]our IP can be used to track down your location. As the company explained, even if you keep the location-related features off, IP addresses and other collected information like phone number area codes can be used to estimate your "general location."

WhatsApp is required by law to share this information with authorities during an investigation...

[U]nder scrutiny is how Meta itself uses these precious details for commercial purposes. Again, this is clearly stated in WhatsApp's privacy policy and terms of use. "We may use the information we receive from [other Meta companies], and they may use the information we share with them, to help operate, provide, improve, understand, customize, support, and market our Services and their offerings," reads the policy. This means that yes, your messages are always private, but WhatsApp is actively collecting your metadata to build your digital persona across other Meta platforms...

The article suggests using a VPN with WhatsApp and turning on its "advanced privacy feature" (which hides your IP address during calls) and managing the app's permissions for data collection. "While these steps can help reduce the amount of metadata collected, it's crucial to bear in mind that it's impossible to completely avoid metadata collection on the Meta-owned app... For extra privacy and security, I suggest switching to the more secure messaging app Signal."

The article also includes a cautionary anecdote. "It was exactly a piece of metadata — a Proton Mail recovery email — that led to the arrest of a Catalan activist."

Thanks to long-time Slashdot reader united_notions for sharing the article.
Privacy

Doc Searls Proposes We Set Our Own Terms and Policies for Web Site Tracking (searls.com) 33

Today long-time open source advocate/journalist Doc Searls revealed that years of work by consumer privacy groups has culminated in a proposed standard "that can vastly expand our agency in the digital world" -- especially in a future world where agents surf the web on our behalf: Meet IEEE P7012, which "identifies/addresses the manner in which personal privacy terms are proffered and how they can be read and agreed to by machines." It has been in the works since 2017, and should be ready later this year. (I say this as chair of the standard's working group.) The nickname for P7012 is MyTerms (much as the nickname for the IEEE's 802.11 standard is Wi-Fi).

The idea behind MyTerms is that the sites and services of the world should agree to your terms, rather than the other way around.

Basically your web browser proffers whatever agreement you've chosen (from a canonical list hosted at Customer Commons) to the web sites and other online services that you're visiting.

"Browser makers can build something into their product, or any developer can make a browser add-on or extension..." Searls writes. "On the site's side — the second-party side — CMS makers can build something in, or any developer can make a plug-in (WordPress) or a module (Drupal). Mobile app toolmakers can also come up with something (or many things)..." MyTerms creates a new regime for privacy: one based on contract. With each MyTerm you are the first party. Not the website, the service, or the app maker. They are the second party. And terms can be friendly. For example, a prototype term called NoStalking says "Just show me ads not based on tracking me." This is good for you, because you don't get tracked, and good for the site because it leaves open the advertising option. NoStalking lives at Customer Commons, much as personal copyrights live at Creative Commons. (Yes, the former is modeled on the latter.)
"[L]et's make this happen and show the world what agency really means," Searls concludes.

Another way to say it is they've created "a draft standard for machine-readable personal privacy terms." But Searls' article used a grander metaphor to explain its significance: When Archimedes said 'Give me a place to stand and I can move the world,' he was talking about agency. You have no agency on the Web if you are always the second party, agreeing to terms and policies set by websites.

You are Archimedes if you are the first party, setting your own terms and policies. The scale you get with those is One 2 World. The place you stand is on the Web itself — and the Internet below it.

Both were designed to make each of us an Archimedes.

Privacy

Hungary To Use Facial Recognition to Suppress Pride March (theguardian.com) 235

Hungary's Parliament not only voted to ban Pride events. It also voted to "allow authorities to use facial recognition software to identify attenders and potentially fine them," reports the Guardian. [The nationwide legislation] amends the country's law on assembly to make it an offence to hold or attend events that violate Hungary's contentious "child protection" legislation, which bars any "depiction or promotion" of homosexuality to minors under the age of 18. The legislation was condemned by Amnesty International, which described it as the latest in a series of discriminatory measures the Hungarian authorities have taken against LGBTQ+ people...

Organisers said they planned to go ahead with the march in Budapest, despite the law's stipulation that those who attend a prohibited event could face fines of up to 200,000 Hungarian forints [£425 or $549 U.S. dollars].
