Google Aims To Cull Child Porn By Algorithm, Not Human Review
According to a story at VentureBeat, "Google is working on a new database of flagged images of child porn and abuse that can be shared with other search engines and child protection organizations. The database will help create systems that automatically eliminate that sort of content. ...
If the database is used effectively, any flagged image in the database would not be searchable through participating search engines or web hosting providers. And maybe best of all, computers will automatically flag and remove these images without any human needing to see them." Here's the announcement.
What is the point of this? (Score:5, Insightful)
What is the point of automatically removing child porn so it's not searchable? That's not the problem with child porn.
The problem with child porn is real children are being really abused to make it.
Making it "not searchable" doesn't stop that. Arresting the people who are making it does.
Re:What is the point of this? (Score:4, Insightful)
Also, I have been browsing the net for at least 12 years and I have NEVER found child porn by accident or otherwise. I suspect that child porn can be found only on the "dark internet".
So that makes one wonder what Google's real motives are.
Re:What is the point of this? (Score:5, Insightful)
So that makes one wonder what Google's real motives are.
Good PR. I'm as cynical as the next person, but PR is often the only motive for these things. If they had a sinister motive, they'd just offer to help the NSA some more.
Re: (Score:2)
Ding, have a cookie.
"Think of the children!" is the perfect answer to "Hey, why are you handing all the data to the government investigators?"
Re: (Score:2)
"Think of the children!" is the perfect answer to "Hey, why are you handing all the data to the government investigators?"
Why do they need to answer that question? You're assuming the proles have any power. Besides, in 21st century America, "terrorism" trumps even "think of the children".
Re:What is the point of this? (Score:4)
1 - fighting terrorists
2 - fighting child pornography
3 - something with global warming
4 - helping the government spy on ordinary people.
Re: (Score:2)
- Fighting drugs
The constitution has been gutted in the name of stopping people from voluntarily ingesting certain substances since even before 9/11. I guarantee someone is going to stand up and say, 'Why can't we use this information to stop those evil drug traffickers?' and no one will object.
Re: (Score:2)
Re: (Score:2)
Perhaps they're tired of all the FBI/NSA requests and are trying to remove themselves from that picture.
I also heard that Google content inspector is a job that requires therapy.
Re:What is the point of this? (Score:5, Interesting)
The goal is to eliminate the known child porn, not to decide what is or is not child porn. Presumably their database will come from law enforcement agencies. That said, your comment brings up a valid point—that you can't always tell what is or is not legal. Those photos of that Hollywood actress a few years ago were taken when she was only 17. The fact that she took them and sent them to her boyfriend doesn't change the fact that (assuming she wasn't lying to try to make the photos go away) possessing those photos is technically illegal under child porn laws.
This raises a more interesting question: Will it be possible for individuals to query that database? For example, if someone downloads a picture from somewhere and can't tell with certainty whether the girl is legal or not, could that person anonymously query the database to determine the legality of that photo? I mean, obviously the database would be incomplete, so lack of data does not necessarily indicate legality, but I could see something similar to virus scanners that periodically scan your porn collection against the child porn database and alert you to anything that shows up in that database so you can delete it. That would be a useful tool for staying on the right side of a law that is otherwise basically impossible to comply with (at least with any degree of certainty), particularly if it came with information describing how the age of the person in a particular photo was determined and the level of certainty involved. For example, for a presumably legal photo, it might return "Over 18. Certainty: low. Reason: not found in database" or it might return "Over 18. Certainty: high. Reason: Playboy affidavit" or anything in between.
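For illustration only, here is a minimal sketch of what such a query might look like, assuming the database were simply a local mapping from SHA-256 digests to review notes. TFA actually describes "encrypted fingerprints" whose format isn't public, so the structure below is an assumption.

```python
import hashlib

def check_photo(path, known_db):
    """known_db: hypothetical dict mapping sha256 hex digests to review notes,
    e.g. {"certainty": "high", "reason": "flagged by law enforcement"}."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    record = known_db.get(digest)
    if record is None:
        # Absence from the database proves nothing; it only means "no known match".
        return {"flagged": False, "certainty": "low", "reason": "not found in database"}
    return {"flagged": True, **record}
```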
Re: (Score:3)
Even better would be if Google provided open source tools to allow website owners to check images (again, anonymously) against Google's database as they are being uploaded by users. By tying into such a service, bulletin board tools could make it really easy for site admins to ensure that posting child porn on their boards was very, very hard. This would be particularly useful for newly established boards where the number of users is small enough that such postings might otherwise go unnoticed for an extended period.
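A rough sketch of that idea: board software could hash each upload and ask a lookup service before accepting it. The endpoint below is entirely hypothetical (no such public API exists); only the hashing and HTTP plumbing are Python standard library.

```python
import hashlib
import json
import urllib.request

LOOKUP_URL = "https://example.org/hash-lookup"  # hypothetical endpoint, for illustration only

def upload_allowed(file_bytes: bytes) -> bool:
    """Reject an upload if its hash appears on the shared block list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    req = urllib.request.Request(
        LOOKUP_URL,
        data=json.dumps({"sha256": digest}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return not json.load(resp).get("flagged", False)
```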
Re:What is the point of this? (Score:4, Informative)
Unfortunately it is out there. In a previous life as an intern I received a computer from a retail store that needed "fixed", as the store manager put it. Figuring it had some malware on it, I booted it up to see what the damage was. Almost as soon as the computer started, numerous browser sessions autostarted with some of the most vile websites imaginable. It wasn't a picture or two of some amateur girlfriend that might have been a little too young. They had the appearance of professionally designed and maintained websites just like any other porn website, but just happened to have kids 13- instead of 18+. I just turned off the computer, went to my boss, explained briefly what I found and said I wasn't dealing with it.
That was 13+ years ago. I'm sure things have changed some since then, but I'm also not naive enough to think that child porn is just on the "dark internet", whatever that is.
Re: (Score:3)
dark internet
FWIW, according to Wikipedia the term you're looking for is 'darknet' or 'deep web' [wikipedia.org]. I love clear terminology.
Re: (Score:3)
Re: (Score:2)
More recently, remember that girl that did the 'my life sucks' thing and then committed suicide? Uncensored autopsy pics got pulled up by Google: CP according to some definitions.
Still, totally agree with the grandparent post.
Re: (Score:3)
Obviously, you have never hung out on 4chan.
It is very much alive and kicking on the regular net, else police wouldn't be apprehending so many pervs (tracking people on TOR is almost impossible).
Re:What is the point of this? (Score:4, Informative)
The summary is a bit incomplete. I suppose that if the algorithm finds something, it will warm law enforcement.
FTFA: "This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals." "We can do a lot to ensure it’s not available online—and that when people try to share this disgusting content they are caught and prosecuted. "
Re:What is the point of this? (Score:5, Funny)
if the algorithm finds something, it will warm law enforcement.
I may not always agree with law enforcement, but I do not think they are THAT corrupt.
Re: (Score:3)
Woops, I meant warn, I MEANT WARN!
Re: (Score:2, Interesting)
The problem with arresting the people who are making child porn is that it provides the government no excuse to monitor all internet traffic of innocent citizens.
Re:What is the point of this? (Score:5, Insightful)
Re: (Score:2)
Re: (Score:3, Interesting)
So Google are attempting to create a system to automatically disappear content from the Internet with no human supervision; not only that, but it also autonomously informs law enforcement of "child porn" found on these sites, and you think that's to their credit?
I was thinking of posting this anon, but no. I've personally known people who were victims of such horrible crimes, so my views on this are very strong (and biased, should you wish to see them that way).
This is a step in the right direction. Be it by peer review or by an algorithm, detect something that is "child porn". Flag it and rate it. Who published it and how it got searched (a profile that could suggest a rating from "accidental" find to "found by premeditated and methodical searches"). Report it to the
Re:What is the point of this? (Score:5, Informative)
Turned out to be a false positive? I won't sleep much over it, unless someone genuinely innocent gets destroyed by this. But probabilistically, the chance of such occurrence is so low, not impossible, but so extremely low (truly innocent person in possession of child porn), I'm willing to live with the consequences.
You, sir, are a grade-A asshole
Are you considering the possibility that you might be the target of the false positive (however unlikely)? Because it is mighty kind of you to be willing to live with the consequences of someone else's life being ruined due to false positive.
All of this would be a less horrible idea if the law enforcement found a less damaging way to investigate (i.e. keep the accusation completely private until it is proven in court). Otherwise lives are utterly ruined well before the investigation is concluded.
Re: (Score:2)
Turned out to be a false positive? I won't sleep much over it, unless someone genuinely innocent gets destroyed by this. But probabilistically, the chance of such occurrence is so low, not impossible, but so extremely low (truly innocent person in possession of child porn), I'm willing to live with the consequences.
You, sir, are a grade-A asshole
Since you say so, it must be so.
Are you considering the possibility that you might be the target of the false positive (however unlikely)?
But of course I have. But since your question presupposes an answer (the one you hope to have with which to prop up your privacy soap box), I'll let you think I answer in the negative and let you claim a rhetorical win.
Because it is mighty kind of you to be willing to live with the consequences of someone else's life being ruined due to false positive.
As opposed to paralysis by analysis and do nothing about child sexual abuse because ZOMG I might be wrong. Nice NIMBY morality you have there buddy.
All of this would be a less horrible idea if the law enforcement found a less damaging way to investigate
But they haven't because it is not a trivial problem. But since you seem to be the genius here, I'm dying to hear your solution.
(i.e. keep the accusation completely private until it is proven in court).
Uh, but to be proven in court, it has first to be reported to the authorities, and before being reported to the authorities, it must first be flagged somehow.
And that is what the algorithm does (or should do.) You keep bringing the word "privacy" as if the purpose of the algorithm is going to publish into the open who searches what. Quite the contrary because the algorithm hides/blocks the matches to begin with.
Nice red herring/ad hominem though.
Otherwise lives are utterly ruined well before the investigation is concluded.
As opposed to lives that are not hypothetically ruined, but actually ruined in real life as the result of real child sexual abuse. So who is the asshole again?
So genius, since I'm an asshole (and dumb and stupid) please pray tell your solution to this problem. What do you propose? What do you suggest?
Or are you just actively searching for a topic to latch on and publish your anguish on the interwebz, to post how upset you are about violation of liberties in terms of the hypothetical which trumps the need to deal effectively with child sexual abuse?
Re: (Score:2)
I understand his position, and I don't think he's an asshole, but you do have a point. Simply sullying your hard drive with that stuff doesn't mean that you actually in any way promoted the original abuse. So, to my mind, even CP does not justify extreme action, if it is only possession.
Possession does make you a sick fuck, of course.
Still, I might consider what Google is doing a public service in another way. I don't want CP on my computer, and if Google removes it as something that may pop up on my com
Re: (Score:3)
You are walking a dangerous road, friend.
Before you talk, you should think about all the angles. Think about what it means to flag someone as suspicious, think about how easy it is to make someone look suspicious, think about how easy it would be for someone who doesn't like YOU to make YOU look suspicious, and think about how easy it would be to sabotage anything on the internet when all it takes to "temporarily" censor something is a child-porn flag.
Before you think I'm conflating things, before you star
Re: (Score:3, Insightful)
Turned out to be a false positive? I won't sleep much over it, unless someone genuinely innocent gets destroyed by this. But probabilistically, the chance of such occurrence is so low, not impossible, but so extremely low (truly innocent person in possession of child porn), I'm willing to live with the consequences.
Go to 4chan for a few weeks. Your browser cache may now very well contain child porn, even though you did not want child porn, because posting to 4chan is anonymous and even though they do their best to take illegal images down they can't always tell if something is child porn and if it is, it will take them a while to find out about it. I'm pretty sure there are more readers of 4chan than there are pedophiles getting child porn on the internet, leaving your false positive rate at more than 50%. I know a ki
RTFA; it involves more than that. (Score:3)
You are absolutely correct that this won't make child porn disappear. But from Google's standpoint, it will help keep their top-notch search engine (and other search engines) from being used to find it. In addition, it's more than making it "not searchable"; RTFA. This will also have "hooks" into law enforcement and ISPs.
Re: (Score:2)
Making it "not searchable" doesn't stop that. Arresting the people who are making it does.
Nope, it makes it more valuable to the people who distribute it - no more pesky freeloaders!
(just like drugs, etc.)
Re: (Score:3, Insightful)
You are looking at this from the wrong angle. It is extremely likely that this is not about CP at all, but that Google wanted an effective image censorship system. They realized that this may be hard to sell to the public, so they found CP as an easy solution. They can even use it for that only for the first few months (which will be almost invisible, as there cannot be a lot of CP accessible via Google, if there is anything at all...), then they can start to put in other pictures that are "undesirable", lik
Re: (Score:2)
Yes, that's right, Google, the firm that publicises all DMCA requests it receives and flags up when it's been forced to censor search results by linking to the request on relevant searches, Google, that publishes the source code for many of its products, Google, that produces a regular transparency report stating as much as it can about what data it's been requested to hand over, what it's been requested to censor and so forth has just arbitrarily decided one day that it wants to censor images. We don't kn
Re: (Score:2)
I'll take your inability to explain the nonsensical, unexplained elements of your conspiracy theory as you saying "Yes, I was just trolling," then. Rather than offer any explanation that would make your argument make sense, you just take the ad hominem route and leave your argument making absolutely no sense whatsoever.
Re: (Score:2)
Rather than continuing to make a fool of yourself, you could just admit you were wrong and accept it.
You know, just saying, just throwing that out there in case you decide you don't want to keep looking stupid.
CP (disambigger-than-yours) (Score:2)
And if anybody protests, they can just report them for searching CP.
At which point plausible deniability [twitter.com] adds reasonable doubt. See also CP (disambiguation) [encyclopediadramatica.se].
Re:What is the point of this? (Score:4, Insightful)
Well, if it works to prevent people from seeing it unintentionally then it means the Google search engine provides more relevant search results. So that's a major improvement in Google's search engine.
If it's automatically identified and removed, then presumably Google would be able to purge ephemeral copies from their caches and whatnot, which is probably nice from a liability perspective.
It might help to reduce casual interest in the subject if it's not easily searchable.
I doubt it would prevent anyone actively trying to find it, and it certainly won't stop the kinds of people who would go to the length of producing it; at least, I can't imagine that fame through improved search engine results is a significant part of their motivation.
The question is what the impact is on the people who might make a transition from casual interest (if they could view it by searching) to actual production. If it helps prevent that, it's a win. On the other hand, if these people deal with frustrated urges by just going ahead and making their own, we'd have to call it a major failure.
Ideally, someone has actually done the research and determined that yes, blocking casual searches for child porn should amount to a net benefit.
In practice it wouldn't surprise me if it's a move to reduce the threat from Attorneys General who see child porn in Google's search results as an easy PR and courtroom win.
Re: (Score:2)
Re: (Score:3)
The point is to try to sell automated censorware to the public by saying it'll only be used against something "everyone" thinks ought to be censored. Once it's established, its scope will be expanded to cover all sorts of other materials.
Re: (Score:2)
Just a minor correction.
Reduce demand, reduce supply (Score:2)
What is the point of automatically removing child porn so it's not searchable. That's not the problem with child porn.
The problem with child porn is real children are being really abused to make it. Making it "not searchable" doesn't stop that.
The point, I would expect, is that removing the channel by which it circulates puts a barrier between the demand and the source, and hence reduces the incentive to make it. That would reduce the amount which is made.
Arresting the people who are making it does.
I don't think that this proposal was intended to be instead of arresting the people who make it.
With that said, your point "The problem with child porn is real children are being really abused to make it." is a good one. By that argument, any such material which was not produced using real
Re: (Score:2)
The point, I would expect, is that removing the channel by which it circulates puts a barrier between the demand and the source, and hence reduces the incentive to make it.
That, in fact, was the judicial reasoning on the constitutionality of child pornography laws. At this point, though, I have my doubts that this sort of economic argument is valid; I doubt that the consumers of child pornography are paying for it in any way. After all, we are supposed to believe that the Internet has ruined the MPAA's and RIAA's business, and child pornography is obviously not subject to copyrights.
Re: (Score:2)
Agreed, but they're both parts of a joint strategy. Google is helping make the stuff unfindable ... which makes it go more underground, raising its value, which gives a much bigger incentive to the dirtbags that do this stuff. So ... oops ... this is the digital equivalent of the Prohibition laws of the 1930s.
Imagine if this leads someone to build a custom search engine that specifically goes out to find child porn, because the Google search engine is functionally crippled. Oops.
I started this post think
Re: (Score:3, Interesting)
Making it more difficult to find may just be one portion of the strategy - no doubt the location of the images is reported to the relevant authorities, and then it's their job to take up the issue. Perhaps reducing access to the material will reduce the ability for people that search for it to find it, which may reduce the number of cases where the activity escalates to direct abuse. Maybe it'll increase the number of abuse cases as they're unable to relieve their desires and turn to local sources.
Although I think that people abusing children should be outright shot, I have trouble following the logic of the above statement. There used to be an under-the-counter market for such material and big bucks could be made with it. (The internet basically killed that market, hopefully.) Here there was a real economic incentive to produce material and demand encouraged production. But now, thanks to police work, there is little to no commercial trade of the material. Because the material is such a hot potato, p
Re: (Score:3)
Making it more difficult to find may just be one portion of the strategy - no doubt the location of the images is reported to the relevant authorities, and then it's their job to take up the issue.
Making it more difficult means someone has to put more (recorded and archived for 2yrs by your ISP) effort into "accidentally" finding the same "abused kid of the week" site every Saturday night for the last 18 months. Good luck explaining "an evil hacker did it" to a judge and jury.
Re: (Score:2)
It's worked so well in the drug war, now hasn't it?
I hope they really mean child (Score:5, Interesting)
If they mean "all underage" and not just "blatantly children", good luck with that. There are no characteristics that will distinguish between 17 and 18, or even older. What is the software going to think of Kat Young, for example? What about models who are just small?
Also are they going to attempt to sort through drawings at all, considering they are legal in some jurisdictions and not others?
I sense false positives and angry models in Google's future.
Re: (Score:2)
Yes, this seems to be a difficulty they may face, and something I would like to see addressed. While I have seen things I would rather have not (and not necessarily related to this topic), I'm not entirely sure I would deem it illegal. Maybe I'm just biased by finding it as something that just comes with the Internet as part of the package. That and as long as it's not shoved in my face I don't really mind it existing*
*of course, child porn/abuse is illegal, so my personal view on it is kind of irrelevant
Singing "Happy Birthday" in public is illegal (Score:2)
of course, child porn/abuse is illegal
Singing "Happy Birthday to You" in public is illegal too. There are cases where current law is out of sync with social norms.
Re: (Score:3)
I for one can't wait until the Aussie government get in on this. Women with small breasts will be flagged and, as you said, drawings too.
It is a good effort, but the world is really becoming just a little too fucked up to start trying to stop things now.
Re: (Score:2)
The summary is a bit off. The algorithm can't actually detect child porn; what this looks like is a system similar to the one YouTube uses, where one item gets reported and is then blocked globally, rather than having someone report each instance of the same image.
Re:I hope they really mean child (Score:5, Informative)
Since 2008, we’ve used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique ID that our computers can recognize without humans having to view them again. Recently, we’ve started working to incorporate encrypted “fingerprints” of child sexual abuse images into a cross-industry database. This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals. Today we’ve also announced a $2 million Child Protection Technology Fund to encourage the development of ever more effective tools. [emphasis added]
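The "unique ID that our computers can recognize without humans having to view them again" is, in the simplest reading, just a cryptographic digest. A minimal sketch of that idea (the announcement does not say which hash function Google actually uses; SHA-256 here is an assumption):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """One-way fingerprint: a copy can be recognized later without anyone viewing it."""
    return hashlib.sha256(data).hexdigest()

def build_database(reviewed_paths):
    """Hash material already flagged by human reviewers into a shareable set of IDs."""
    known = set()
    for path in reviewed_paths:
        with open(path, "rb") as f:
            known.add(fingerprint(f.read()))
    return known

def is_known_copy(data: bytes, known: set) -> bool:
    """Any exact duplicate, wherever it turns up, matches by fingerprint alone."""
    return fingerprint(data) in known
```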
Re: (Score:2)
As usual, by the time it made it to the Slashdot headline it was completely misreported. They are working on a DB to share hashes of known images. This only prevents them having to review the image at each new URL; someone still has to have seen the image and added it to the DB (to be honest, they had better be confirming these flags at least sometimes, so that's two people, etc. etc.)
Re: (Score:3)
This system is not "looking" at images. It is a database of hashes of known offending files, against which found content can be compared. Matching content will be filtered.
Of course this only works for known files (which are flagged by humans, I suppose, though that is not explicitly mentioned in TFA), and if a file is altered the hash changes. Though that doesn't happen too often; most people share content they find unaltered. And it doesn't work for new files, either. Those still need to be flagged - how
Re: (Score:2)
If you look at how well (or badly) the YouTube Content ID system works, you can infer how this system works. Probably the same tech, which has some leeway for fuzzy matching.
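For what it's worth, the usual way to get that kind of leeway is a perceptual hash rather than a cryptographic one. Below is a minimal difference-hash (dHash) sketch, assuming the Pillow library is available; real systems such as PhotoDNA or Content ID are far more sophisticated, so this only shows the principle.

```python
from PIL import Image  # assumes Pillow is installed

def dhash(path: str, hash_size: int = 8) -> int:
    """Difference hash: survives rescaling and recompression, unlike a file hash."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Fingerprints within a few bits of each other very likely show the same picture,
# e.g. hamming(dhash("original.jpg"), dhash("rescaled_copy.jpg")) <= 5
```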
Exactly - and how do you define underage? (Score:5, Interesting)
The age of consent in Spain is 14, in the UK 16, in the USA 18, so if there's a picture of a nude 15- or 17-year-old, which country gets to decide whether it's legal?
While this may be a laudable effort I have the sneaking feeling the USA once again will be pushing its legal system and morality onto the rest of the world.
Re: (Score:2)
age of consent != age to publish pornographic content
AFAIK almost everywhere you have to be an adult (18/21) to view and publish pornographic content.
Re: (Score:2)
In the UK, the age of consent is 16, but it is illegal to publish pictures of the activity if you are under 18.
Re:Exactly - and how do you define underage? (Score:4, Interesting)
The age of consent for porn is 18 throughout the entire United States.
This has created all sorts of problems for people in that 16-18 age bracket taking photos of themselves or their partners, ruining their lives with charges of "manufacturing child pornography," but hey, we have to "protect the children," don't we.
Re: (Score:2)
There is federal law defining this. 18 USC 2256, 2257.
Many states are being coerced into adopting parallel state statutes. New Hampshire here passed one 3-4 years ago. But since the feds can easily claim that anything posted on the Internet "travelled in interstate commerce," they will apply their laws regardless of what any state says.
Re: (Score:2)
Re: (Score:2)
If they mean "all underage" and not just "blatantly children", good luck with that. There are no characteristics that will distinguish between 17 and 18, or even older. What is the software going to think of Kat Young, for example? What about models who are just small?
This is a common matching problem for which a solution has been known for quite a long time: probabilistic weights.
Algorithms can simply apply a probabilistic weight to the matches. Kat Young alike = 0.001. Obvious toddler = 1.0. Barring the Kat Young/small model outliers, there are physical markers that can distinguish between a normal 17 year old from a 10 year old from a 4 year old.
You, or rather we, cannot do anything about that in terms of automatic (or even manual/visual) matching of physical characteristics
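Read charitably, the parent is describing score-based triage rather than a binary flag. A toy illustration of that idea (the thresholds are invented for the example, not anything Google has described):

```python
def triage(match_score: float) -> str:
    """Route a match by confidence instead of treating every hit as certain."""
    if match_score >= 0.95:
        return "block and report"
    if match_score >= 0.50:
        return "queue for human review"
    return "no action"
```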
Re: (Score:2)
Hope that it stores just some characteristics, such as file hashes, because a DB with the actual pictures, even "not searchable", would probably be illegal, at least as the laws are spelled out currently.
I support this (Score:2)
I have a better idea.... (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
That tactic has worked so well eliminating all the pirate sites out there, hasn't it.
Not sure I agree 100% that this is a good idea.... (Score:5, Insightful)
Let me be clear about this. I DO NOT condone child pornography at all; I find it foul and disgusting. But there is an over-reach that I think may go on here. If I purchase a server and I engage in a P2P network, then it is not Google's nor anyone else's business what I transmit. If the server is a public server or one owned by a company (such as Google), then I would agree they have every right to remove such foul content from their servers.
Yes, I would rather that the people who engage in this be stopped. But whenever programs like this are created they tend to start out being put to use with the best of intentions, but will likely be used for other more nefarious purposes. If this algorithm is used to sniff out child pornography, it could be modified to sniff out information about a political party and quell it, or news that a government agency doesn't want people to know about.
With all that has recently come to light about the spying by the US Govt., can you really say with 100% certainty that this technology won't be abused for other purposes? I can't.
Again I DO NOT condone Child Pornography.
Re:Not sure I agree 100% that this is a good idea. (Score:4, Interesting)
Re: (Score:2)
Now I will only see what Google has decided it is in my interest to see.
They've been doing that for years.
Re: (Score:2)
Google and other search engines filter content already - like Google's "safe search" option that blocks images showing naked people from appearing in their image search. The technology exists, and "safe search" appears to actually analyse images to judge the content, while this child porn database only compares file hashes against known offending content.
The technology is there, it's not new, this is just a new application of it. And I have to say I'm quite confident that it's not being used for political purposes
And that's a reason to avoid a useful tool? (Score:2)
This is certainly, unarguably, a useful tool that can be used in order to accomplish a worthy societal goal; I don't think our criteria for such things should be: "Well, it could be used for bad things, so we should stick our heads in the sand instead." No cars because they might be driven by bank robbers! No knives because they might be used to cut people instead of carrots! etc.
In any case, content recognition algorithms already exist and are already used for nefarious purposes. Why not use those tools
Re: (Score:3)
Re: (Score:2)
Good grief, how hard is it to not be paranoid in this day and age? They don't want child porn to show up in their searches so they found a way to flag the images and not display them: that's it. If there comes a time when the technology is used malevolently to suppress political ideas, then you'll be able to bitch about it, but going all "tinfoil hat" on it because there might be a hypothetical use which you don't like is just ridiculous.
At the moment, Google uses mostly human beings to review flagged conte
Seems like this could be used for other things (Score:3, Insightful)
Removing child pornography is a laudable goal.
We just have to realize that it won't stop at that. From what the article says, it seems like the technology could be used for any image. At the very least I expect we'll see general copyright enforcement from this. Worst case, we will see things like various regimes being able to use this to suppress images they don't like. Oh, you have pictures of us slaughtering our opponents? Well, we'd better put those on the bad list.
Out of sight, out of mind (Score:4, Insightful)
This will increase child abuse. As soon as it becomes invisible, perpetrators are completely free to do whatever they like, as the public will not be aware it is a problem. The reason is that it addresses the wrong problem. Distribution of CP is a minor issue. Creation of CP (and all the child abuse that is not documented or does not end up on the Internet) is the real problem. It seems politicians have become so focused on distribution of CP, that nothing is being done anymore to fight actual child abuse. After all, distribution of CP gives nice and easy convictions and to hell with the children themselves.
Re: (Score:3)
This will increase child abuse. As soon as it becomes invisible, perpetrators are completely free to do whatever they like, as the public will not be aware it is a problem.
This probably won't do squat to make it less visible, because it's already reasonably well hidden. Any poster of CP that's too dumb to keep it out of the range of search engines has probably already been caught. This is a feel good effort. I can't criticize it, but it won't have much effect one way or the other.
...What could go wrong? (Score:2)
sheesh
See the old story by C. M. Kornbluth called "The Marching Morons".
Misleading summary (Score:2)
Impossible (Score:2)
Judges have already declared that porn is basically undefinable, and I disagree with them that you know it when you see it.
Added to this, you cannot tell the difference between a 15-year-old and a 21-year-old 100% of the time; sure, with that big a range you would get it right 95% of the time, but not always.
And trying to tell the difference between 18 and 17 or 16 is more like a 50% chance of getting it right, regardless of whether you are a computer or a human being.
Re: (Score:3)
Re:Impossible (Score:4, Interesting)
A lot of people are misunderstanding what Google is developing here.
This is not "automated" censorware that would make its own decisions as to what should be censored. It's not going to analyze images and decide to censor them on its own (which would result in the kind of false positives you claim).
It's censorware that would, once a live human Google employee has viewed a piece of content and made the decision that that particular content is to be suppressed, globally remove all copies of the same content from Google's database. See, one of the major obstacles censors face nowadays is the so-called "Streisand effect," where suppressing a piece of offensive content results in dozens, sometimes thousands, of people mirroring the content and publicizing their mirrors. If people want the information, they'll get it; censorship always fails, and in many cases completely backfires.
But, software like this will ensure that in the future, if a corporation or a government wants to suppress information, they will be able to do so.
Of course, Google and others will only use this censorware to go after CP, which as we all know "everyone" hates, so I guess everything is okay.
Re: (Score:2)
OK, but if you actually read my post, I was clearly saying that computers cannot do it, because it is simply impossible for anyone or anything to do it: they are indistinguishable.
Porn is often indistinguishable from non-porn, and children are indistinguishable from non-children. These are not just tiny edge cases, but significant portions of the content.
no profit here (Score:2)
2. End up on floor being beaten by local SWAT team.
3. ??????
4. Prison
Unintended consequences? (Score:2)
If this system were 100% effective at preventing all known CP images from being searchable or even downloaded, then wouldn't that drive demand for brand new images to be created that don't trip the filters?
Re: (Score:2)
If this system were 100% effective
It won't be, and no-one's expecting it to be. From the article it may be something as simple as storing hashes for specific files rather than specific images, which would hardly warrant the "new technology" status that the article has decided to give it. Even if it's more generic than that, and is a hash for an image - allowing an image to be identified at different scales, exposures, or rotations - it's still not really new technology, nor is it particularly groundbreaking. It'll just be a handy tool which will
Pornography definitions (Score:2)
I think the problem with this system lies not in its intent but in its effects. I'm less concerned about whether it is searchable than about the abuses involved in creating it. I'm also concerned about the fact that we've seen charges involving 14-17 year old girls sending 14-17 year old boys their own pictures via their cellphones, marking them as felons and sex offenders for life. We need to figure out what is and isn't acceptable in our society and make it clear where that line is before matters get w
AV scanners (Score:2)
How come corporate Anti-Virus scanners don't scan (or have an add-on module to scan) for signatures of illicit images? If various government agencies have a collection of known infringing images, signatures could be generated, like viruses. Sure, there would likely be a way to fool it, but it would be step in the right direction.
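A rough sketch of what such an add-on module might amount to, assuming the agencies published plain file-hash signatures (real AV engines use far more elaborate signature formats and on-access hooks):

```python
import hashlib
import os

def scan_tree(root: str, known_hashes: set) -> list:
    """Signature-style scan: flag files whose digest appears in the shared hash list."""
    flagged = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # unreadable file; skip it
            if digest in known_hashes:
                flagged.append(path)
    return flagged
```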
If Google Is Successful (Score:2)
In building an algorithm that can detect CP, then they can combine it with their location detection methods to identify where it was produced. The big question is, are they then going to inform the LEOs based on suspicion and ruin an individual who did not commit a crime? Are they becoming the new Thought Police à la Orwell's 1984?
Some of the things that Google has done are worthy efforts but this has the seeds of some serious abuse right from the beginning because if Google can do this for images, th
Good intentions, but a bad idea. (Score:3)
What about borderline content such as non-pornographic nudity, sexually explicit drawings of imaginary minors, and pornographic images of adults who look like teenagers? It's likely these will be branded as "child pornography", leading to images being suppressed that are legal in many jurisdictions including the United States.
Once service providers start censoring content based on third party reports of alleged child pornography, it becomes much easier to suppress other content as well. Organizations such as RIAA and MPAA would love to be able to flag arbitrary content as infringing and have ISPs block such content automatically, bypassing even the need to file DMCA takedown notices. Think of how often YouTube videos are incorrectly flagged as examples of copyright infringement and extend this to all ISPs who check against Google's database, and you can see the problem.
ISPs who participate in this system delegate the right to make judgment calls on material that isn't obviously illegal to the maintainers of a central database whose judgment may or may not be consistent with local law. Anything in the database is assumed to be illegal regardless of its actual legal status, and the ISPs just follow along instead of deciding individually whether or not the content is likely to survive a legal challenge. Once the system becomes widespread, ISPs may even feel it is necessary to follow it to avoid secondary liability for content posted by their users.
This is yet another example of a worrying trend, where content alleged to be illegal or infringing is removed without due process and often with little regard for the law and relevant jurisprudence. It's no way to run a network that for many has become a primary means of communication.
Internet users deserve better than to have their content blocked according to extralegal judgments with perhaps no bearing on local law, little or no chance of appeal, and no way to establish legal precedents protecting certain kinds of content.
Re: (Score:2)
"What if iPhone gets flagged and only Android phones shows up in searches?"
Then the world will be a better place.
Re: (Score:3)
Police are legally allowed to possess contraband in the course of an investigation; private-sector entities aren't, absent some exception in the law permitting them to. For example, you can't keep a large collection of drugs for research purposes (e.g. training drug-detecting sensors) unless you apply for special permits.
Re: (Score:3)
Similarly, with child porn, you'd need only one sample of every instance of child porn. There's no use keeping 20 copies of the same image. But because the smallest amount of each instance is that inst
Re: (Score:2)
Quite impossible. You are completely and utterly clueless about the difficulties involved _and_ you are clueless about your cluelessness. Look up the Dunning-Kruger effect. You are on the far left of the curves.
Re: (Score:2)
It seems unlikely that there is much (or any at all) CP that can be found using a search engine (have not tried, but others have), as everything findable with a search engine is easily reported to law enforcement and traceable back to the ones putting it there. This strongly indicates that Google wanted (or was coerced) to implement image censorship and is just using CP as an easy excuse that is plausible to the clueless. It is, of course, completely bogus. Once you look at the facts, it makes zero sense. And i
Re: (Score:2)
I don't think they're going to be using computer vision to try to identify child porn; they'll probably be using a database of hashes (either file hashes, or some kind of "image hash" that can identify pics even if they've been resized, recompressed, had a watermark added, etc.) of known child porn. It's slightly helpful and has a vanishingly small chance of false positives.
Re: (Score:2)
As to your 1st paragraph, do you know anyone who disagrees? And if they did, they wouldn't say so.
As to your 2nd paragraph, there was a Slashdot story a while ago (can't find it now) about Google temps who were hired for just that kind of stuff. They were "released" after 6-12 months, and really did have psychological problems because of it. Of course no help or assistance of any kind was offered. What do you think this is, the 20th century?
Re: (Score:2)
Re: (Score:2)
Next step
A goatse detection algorithm?
I think that's stretching things a bit.
Re: (Score:2)
It's not rare [wikipedia.org]. However, what can be seen from that article is that (a), people aren't out there trying to meticulously dig it up using search engines, but are instead setting up massive filesharing websites to be used amongst themselves, and (b), the mention of "extensive encryption" in the article means most of the content is probably on Tor, Freenet, and similar services. In other words, what Google is doing will be absolutely useless to actually go after CP. But it will make them look good, look like the
Busytown (Score:2)
I'm more worried they'll go after the loli stuff.
If they went after Lowly Worm [wikipedia.org], that'd be Scarry.