Google Aims To Cull Child Porn By Algorithm, Not Human Review
According to a story at VentureBeat, "Google is working on a new database of flagged images of child porn and abuse that can be shared with other search engines and child protection organizations. The database will help create systems that automatically eliminate that sort of content. ...
If the database is used effectively, any flagged image in the database would not be searchable through participating search engines or web hosting providers. And maybe best of all, computers will automatically flag and remove these images without any human needing to see them." Here's the announcement.
I hope they really mean child (Score:5, Interesting)
If they mean "all underage" and not just "blatantly children", good luck with that. There are no characteristics that will distinguish between 17 and 18, or even older. What is the software going to think of Kat Young, for example? What about models who are just small?
Also are they going to attempt to sort through drawings at all, considering they are legal in some jurisdictions and not others?
I sense false positives and angry models in Google's future.
Re:What is the point of this? (Score:2, Interesting)
The problem with arresting the people who are making child porn is that it provides the government no excuse to monitor all internet traffic of innocent citizens.
Exactly - and how do you define underage? (Score:5, Interesting)
The age of consent in Spain is 14, in the UK 16, and in the USA 18. So if there's a picture of a nude 15- or 17-year-old, which country's law gets to decide whether it's legal?
While this may be a laudable effort, I have the sneaking feeling the USA will once again be pushing its legal system and morality onto the rest of the world.
Re:What is the point of this? (Score:0, Interesting)
So Google are attempting to create a system that automatically disappears content from the Internet with no human supervision; not only that, but it also autonomously informs law enforcement of "child porn" found on these sites, and you think that's to their credit?
I'd hate to see what you think is a step too far.
Re:What is the point of this? (Score:3, Interesting)
Making it more difficult to find may just be one portion of the strategy - no doubt the location of the images is reported to the relevant authorities, and then it's their job to take up the issue. Perhaps reducing access to the material will reduce the ability for people that search for it to find it, which may reduce the number of cases where the activity escalates to direct abuse. Maybe it'll increase the number of abuse cases as they're unable to relieve their desires and turn to local sources.
Although I think that people abusing children should be outright shot, I have trouble following the logic of the above statement. There used to be an under-the-counter market for such material, and big bucks could be made in it. (The internet basically killed that market, hopefully.) Back then there was a real economic incentive to produce material, and demand encouraged production. But now, thanks to police work, there is little to no commercial trade in the material. Because the material is such a hot potato, people searching for and distributing it are forced to use strong anonymisation, so no economic transaction can occur. I think most CP created nowadays is distributed along the same lines as people uploading their private videos to porn-sharing sites.
The service by Google is very useful for preventing services from hosting the stuff, since hosting it, even for a very short period of time, is always bad PR. What Google is building is basically a PR tool, for Google and everybody using it.
Re:What is the point of this? (Score:3, Interesting)
So Google are attempting to create a system that automatically disappears content from the Internet with no human supervision; not only that, but it also autonomously informs law enforcement of "child porn" found on these sites, and you think that's to their credit?
I was thinking of posting this anonymously, but no. I've personally known people who were victims of such horrible crimes, so my views on this are very strong (and biased, should you wish to see them that way).
This is a step in the right direction. Be it by peer review or by an algorithm, detect something that is "child porn", flag it, and rate it: who published it, and how it was found (a profile that could suggest anything from an "accidental" find to "found by premeditated and methodical searches"). Report it to the authorities, and let them sort it out.
If I were to see or glimpse or even take a whiff of an indication of child porn on someone's computer or a person's private property, or whatever, you can bet your ass that I would call 911, and let the cops sort it out.
Turned out to be a false positive? I won't lose much sleep over it, unless someone genuinely innocent gets destroyed by this. But probabilistically, the chance of that happening (a truly innocent person in possession of child porn) is so extremely low, not impossible, but so low, that I'm willing to live with the consequences.
As for people conflating this with 1984 or PRISM: that's NIMBYism, and it's selfishly disturbing and disconcerting.
I'd hate to see what you think is a step too far.
I'd hate to see what you think is a step too short when fighting child sexual abuse. Seriously, what do you propose in practical terms?
Re:Exactly - and how do you define underage? (Score:4, Interesting)
The minimum age for appearing in pornography is 18 throughout the entire United States.
This has created all sorts of problems for people in that 16-18 age bracket taking photos of themselves or their partners, ruining their lives with charges of "manufacturing child pornography," but hey, we have to "protect the children," don't we?
Re:Impossible (Score:4, Interesting)
A lot of people are misunderstanding what Google is developing here.
This is not "automated" censorware that would make its own decisions as to what should be censored. It's not going to analyze images and decide to censor them on its own (which would result in the kind of false positives you claim).
It's censorware that would, once a live human Google employee has viewed a piece of content and made the decision that that particular content is to be suppressed, globally remove all copies of the same content from Google's database. See, one of the major obstacles censors face nowadays is the so-called "Streisand effect," where suppressing a piece of offensive content results in dozens, sometimes thousands, of people mirroring the content and publicizing their mirrors. If people want the information, they'll get it; censorship always fails, and in many cases completely backfires.
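Neither the announcement nor the summary describes the matching mechanism, but the "human flags once, machines remove every copy" workflow described above suggests a fingerprint lookup: hash each piece of content and check it against a shared database of hashes of material already reviewed by a human. The sketch below is purely hypothetical; all names and values are invented, and it uses an exact cryptographic hash where a real system would likely use a perceptual hash so that re-encoded or resized copies still match.

```python
import hashlib

# Hypothetical sketch of hash-based matching against human-flagged content.
# FLAGGED_HASHES stands in for the shared database; the entry below is just
# the SHA-256 of the placeholder bytes b"test".
FLAGGED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint. A production system would likely use a
    perceptual hash (robust to re-encoding), not a cryptographic one."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """True if this exact content was previously flagged by a human reviewer."""
    return fingerprint(data) in FLAGGED_HASHES

print(is_flagged(b"test"))   # True: its SHA-256 is in the set
print(is_flagged(b"other"))  # False: never flagged
```

The key design point is the one the comment makes: the algorithm never decides what to suppress; it only propagates a decision a human already made, to every identical copy.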
But, software like this will ensure that in the future, if a corporation or a government wants to suppress information, they will be able to do so.
Of course, Google and others will only use this censorware to go after CP, which as we all know "everyone" hates, so I guess everything is okay.
Re:What is the point of this? (Score:5, Interesting)
The goal is to eliminate the known child porn, not to decide what is or is not child porn. Presumably their database will come from law enforcement agencies. That said, your comment brings up a valid point—that you can't always tell what is or is not legal. Those photos of that Hollywood actress a few years ago were taken when she was only 17. The fact that she took them and sent them to her boyfriend doesn't change the fact that (assuming she wasn't lying to try to make the photos go away) possessing those photos is technically illegal under child porn laws.
This raises a more interesting question: Will it be possible for individuals to query that database? For example, if someone downloads a picture from somewhere and can't tell with certainty whether the girl is legal or not, could that person anonymously query the database to determine the legality of that photo? Obviously the database would be incomplete, so lack of data does not necessarily indicate legality, but I could see something similar to virus scanners that periodically scan your porn collection against the child porn database and alert you to anything that shows up in it so you can delete it. That would be a useful tool for staying on the right side of a law that is otherwise basically impossible to comply with (at least with any degree of certainty), particularly if it came with information describing how the age of the person in a particular photo was determined and the level of certainty involved. For example, for a presumably legal photo, it might return "Over 18. Certainty: low. Reason: not found in database" or it might contain "Over 18. Certainty: high. Reason: Playboy affidavit" or anything in between.
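No such query interface exists; the scanner described above is the commenter's hypothetical. As a sketch of what that "virus scanner for your collection" might look like, assuming an invented local hash database with certainty labels and reasons:

```python
import hashlib
import os
from typing import Dict, NamedTuple

class Verdict(NamedTuple):
    flagged: bool
    certainty: str
    reason: str

# Hypothetical local database: fingerprint -> (certainty, reason).
# The single entry is a placeholder keyed on invented bytes.
DATABASE = {
    hashlib.sha256(b"example-flagged-content").hexdigest(): ("high", "court record"),
}

def check(data: bytes) -> Verdict:
    """Look one file's contents up in the database."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in DATABASE:
        certainty, reason = DATABASE[digest]
        return Verdict(True, certainty, reason)
    # Absence proves nothing: the database is incomplete, so the best
    # we can say is "not known bad", with low certainty.
    return Verdict(False, "low", "not found in database")

def scan_directory(path: str) -> Dict[str, Verdict]:
    """Scan every regular file in a directory, virus-scanner style."""
    results = {}
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if os.path.isfile(full):
            with open(full, "rb") as f:
                results[name] = check(f.read())
    return results
```

As the comment notes, the interesting part is the metadata: a bare hit/miss is far less useful than a verdict carrying how the age was established and with what confidence.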