Google Aims To Cull Child Porn By Algorithm, Not Human Review
According to a story at VentureBeat, "Google is working on a new database of flagged images of child porn and abuse that can be shared with other search engines and child protection organizations. The database will help create systems that automatically eliminate that sort of content. ...
If the database is used effectively, any flagged image in the database would not be searchable through participating search engines or web hosting providers. And maybe best of all, computers will automatically flag and remove these images without any human needing to see them." Here's the announcement.
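The mechanism the summary describes, matching images against a shared database of flagged entries so no human has to review them, can be sketched roughly as hash lookup. This is a minimal illustration, not Google's actual system: the blocklist contents and function names here are hypothetical, and real deployments reportedly use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas the plain SHA-256 shown here only matches byte-identical files.

```python
import hashlib

# Hypothetical blocklist of flagged image hashes (hex digests).
# A real system would use perceptual hashing robust to re-encoding;
# exact cryptographic hashes are used here purely for illustration.
FLAGGED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the shared database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in FLAGGED_HASHES

def filter_results(images: list[bytes]) -> list[bytes]:
    """Drop flagged images from a result set without human review."""
    return [img for img in images if not is_flagged(img)]
```

The point of sharing the database across search engines and hosting providers is that each participant only needs the hashes, never the images themselves, which is what allows removal "without any human needing to see them."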
What is the point of this? (Score:5, Insightful)
What is the point of automatically removing child porn so it's not searchable? That's not the problem with child porn.
The problem with child porn is real children are being really abused to make it.
Making it "not searchable" doesn't stop that. Arresting the people who are making it does.
Re:What is the point of this? (Score:4, Insightful)
Also, I have been browsing the net for at least 12 years and I have NEVER found child porn by accident. I suspect child porn can only be found on the "dark internet".
So that makes one wonder what Google's real motives are.
I have a better idea.... (Score:3, Insightful)
Re:What is the point of this? (Score:5, Insightful)
Not sure I agree 100% that this is a good idea.... (Score:5, Insightful)
Let me be clear about this. I DO NOT condone child pornography at all; I find it foul and disgusting. But there is an over-reach that I think may happen here. If I purchase a server and I engage in a P2P network, then it is not Google's nor anyone else's business what I transmit. If the server is a public server or one owned by a company (such as Google), then I would agree they have every right to remove such foul content from their servers.
Yes, I would rather that the people who engage in this be stopped. But whenever programs like this are created, they tend to start out being put to use with the best of intentions, but will likely be used for other, more nefarious purposes. If this algorithm is used to sniff out child pornography, it could be modified to sniff out information about a political party and quell it, or news that a government agency doesn't want people to know about.
With all that has recently come to light about the spying by the US Govt., can you really say with 100% certainty that this technology won't be abused for other purposes? I can't.
Again I DO NOT condone Child Pornography.
Seems like this could be used for other things (Score:3, Insightful)
Removing child pornography is a laudable goal.
We just have to realize that it won't stop at that. From what the article says, it seems like that technology could be used for any image. At the very least, I expect we'll see general copyright enforcement from this. Worst case, we will see various regimes using this to suppress images they don't like. Oh, you have pictures of us slaughtering our opponents? We'd better put those on the bad list.
Out of sight, out of mind (Score:4, Insightful)
This will increase child abuse. As soon as it becomes invisible, perpetrators are completely free to do whatever they like, as the public will not be aware it is a problem. The reason is that it addresses the wrong problem. Distribution of CP is a minor issue. Creation of CP (and all the child abuse that is not documented or does not end up on the Internet) is the real problem. It seems politicians have become so focused on distribution of CP, that nothing is being done anymore to fight actual child abuse. After all, distribution of CP gives nice and easy convictions and to hell with the children themselves.
Re:What is the point of this? (Score:5, Insightful)
So that makes one wonder what Google's real motives are.
Good PR. I'm as cynical as the next person, but PR is often the only motive for these things. If they had a sinister motive, they'd just offer to help the NSA some more.
Re:What is the point of this? (Score:3, Insightful)
You are looking at this from the wrong angle. It is extremely likely that this is not about CP at all, but that Google wanted an effective image censorship system. They realized that this may be hard to sell to the public, so they found CP as an easy solution. They can even use it for that alone for the first few months (which will be almost invisible, as there cannot be a lot of CP accessible via Google, if there is anything at all...), then they can start to put in other pictures that are "undesirable", like pictures of political protests, police brutality, etc. And if anybody protests, they can just report them for searching for CP. When the life of the one protesting has been ruined, they can just blame it on a "technical problem".
Quite ingenious, if utterly evil.
Re:What is the point of this? (Score:4, Insightful)
Well, if it works to prevent people from seeing it unintentionally then it means the Google search engine provides more relevant search results. So that's a major improvement in Google's search engine.
If it's automatically identified and removed, then presumably Google would be able to purge ephemeral copies from their caches and whatnot, which is probably nice from a liability perspective.
It might help to reduce casual interest in the subject if it's not easily searchable.
I doubt it would prevent anyone actively trying to find it, and it certainly won't stop the kinds of people who would go to the length of producing it; at least, I can't imagine that fame through improved search engine results is a significant part of their motivation.
The question is what the impact is on the people who might make a transition from casual interest (if they could view it by searching) to actual production. If it helps prevent that, it's a win. On the other hand, if these people deal with frustrated urges by just going ahead and making their own, we'd have to call it a major failure.
Ideally, someone has actually done the research and determined that yes, blocking casual searches for child porn should amount to a net benefit.
In practice, it wouldn't surprise me if it's a move to reduce the threat from Attorneys General who see child porn in Google's search results as an easy PR and courtroom win.
Re:What is the point of this? (Score:3, Insightful)
Turned out to be a false positive? I won't lose much sleep over it, unless someone genuinely innocent gets destroyed by this. But the probability of that happening (a truly innocent person in possession of child porn) is so low, not impossible, but so extremely low, that I'm willing to live with the consequences.
Go to 4chan for a few weeks. Your browser cache may now very well contain child porn, even though you did not want child porn, because posting to 4chan is anonymous, and even though they do their best to take illegal images down, they can't always tell if something is child porn, and if it is, it will take them a while to find out about it. I'm pretty sure there are more readers of 4chan than there are pedophiles getting child porn on the internet, leaving your false positive rate at more than 50%.

I know a kid who for some reason liked to amass vast quantities of porn on his computer. At meets he'd copy gigabytes of everyone else's stashes, more than he could ever watch. He ended up taking the rap for child porn charges because one of those images, perhaps an image he never saw, was child porn. In my home country, the police get many more panicked calls from people who inadvertently saw child porn on the internet (and are now horrified that they will be prosecuted for seeing it, which in fact could happen if it is in their browser cache) than they do leads on actual child rapists.

Preventing child rape (or rape of any kind, for that matter) is a worthwhile endeavor. Doing something productive about it is good. Randomly ruining innocent people's lives is not productive; it's not what needs to be done, regardless of your ability to feel bad about it.