Google Helps Police With Child Porn WebCrawler (siliconbeat.com)

The San Jose Mercury News is reporting that the Internet Watch Foundation, "an organization that works with police worldwide to remove images of child sexual abuse from the Internet, has credited Google with helping it develop a 'Web crawler' that finds child pornography." The pilot project makes it easier to identify and remove every copy of specific images online, and the group says "We look forward to the next phase of the Googler in Residence project in 2016." Last year Google also had an engineer working directly with the foundation, and the group's annual report says "This was just one part of the engineering support Google gave us in 2015." [PDF] Their report adds that the new technology "should block thousands of their illegal images from being viewed on the Internet."

  • by Anonymous Coward

    Why not crawl people's Android phones and Android tablets for illegal images?

    Google already has access to the media folders on them via Google Play Services, so it's trivial for them to take a look for other reasons.

    • by Anonymous Coward

      Because no one would trust Android ever again after that. It's a death sentence for the OS. If the crawler finds public child pornography, it's fair game. Unfortunately, child pornography consumers don't make that stuff public.

    • by DarkOx ( 621550 )

      Because that would open a can of worms nobody really wants opened: Is the picture dad just took of mom bathing their new infant for the first time in the kitchen sink CP? If he takes it with his smartphone and Google's algorithm spots it, a decision would have to be made by someone, someplace. Even if Google's algorithm had an 'exclude family members' rule, that could turn ugly for Google the first time they failed to report some actual abuse.

      • Because that would open a can of worms nobody really wants opened:

        Ah, naivete.

        Even if Google's algorithm had an 'exclude family members' rule, that could turn ugly for Google the first time they failed to report some actual abuse.

        This is just the first step. There is always a next step. By helping these witch hunters, Google is preparing us for the next step, intentionally or not.

      • Is the picture dad just took of mom bathing their new infant for the first time in the kitchen sink CP?

        FYI, that has been prosecuted as CP before.

        https://jonathanturley.org/200... [jonathanturley.org]

    • I wonder exactly what they use for matching. It's not like most pictures of a kid in a swimsuit are CP, nor even nude pictures. Most likely they're going from a database of existing and known images, or possibly of "similar" images flagged via facial recognition, etc.

      Beyond that, I suppose something obvious like a "full spread" might allow matching of genitalia and a match against body size, skin pigment etc might yield some results, though it might also catch some odd stuff like midgets or j

      • I wonder exactly what they use for matching. It's not like most pictures of a kid in a swimsuit are CP,

        It is probably as simple as an MD5 sum, but could include simple image matching. From TFA:

        The project, which a foundation spokeswoman described as a pilot, facilitates removal of all circulating copies of particular abuse images.

        In other words, they have an image that is already known to be CP, and they're looking for other places where it is available on the web. It's not a hard problem to do a checksum and then visually examine anything that matches. It won't catch your "picture of your kid in a swimsuit."
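
        For illustration, a minimal sketch of that kind of exact-match check, assuming Python with hashlib; "known_hashes.txt" and the command-line image paths are hypothetical, not anything the IWF actually distributes:

          # Flag files whose MD5 matches a digest on a known list.
          import hashlib
          import sys

          def file_md5(path):
              """Hex MD5 of a file's raw bytes, read in chunks."""
              h = hashlib.md5()
              with open(path, "rb") as f:
                  for chunk in iter(lambda: f.read(65536), b""):
                      h.update(chunk)
              return h.hexdigest()

          with open("known_hashes.txt") as f:  # one hex digest per line
              known = {line.strip().lower() for line in f if line.strip()}

          for path in sys.argv[1:]:
              if file_md5(path) in known:
                  print(path, "matches a known image hash")

        Any hit would still need the human visual check described above before anyone acts on it.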

        • by phorm ( 591458 )

          I'd imagine that it has to be a bit more complicated than that though, otherwise a single pixel of difference is all that breaks the checksum. I suppose it might be "reduce the image to a simple form" (reduced color depth, size, etc) and then checksum, but that would consume an awful lot of computing resources.

          • I'd imagine that it has to be a bit more complicated than that though, otherwise a single pixel of difference is all that breaks the checksum.

            A single pixel change is unlikely, but that's why I also said it probably included some image matching. There are simple ways of comparing images that would give good results.

            I suppose it might be "reduce the image to a simple form" (reduced color depth, size, etc) and then checksum, but that would consume an awful lot of computing resources.

            That would be a waste.
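
            For what it's worth, the "reduce the image to a simple form, then checksum" idea is cheaper than it sounds; it's essentially a perceptual hash. A minimal sketch of an average hash (aHash), assuming Pillow is installed; the filenames and the 5-bit threshold are illustrative guesses, not anyone's actual pipeline:

              # Average hash: shrink to 8x8 grayscale, then set one bit per
              # pixel depending on whether it is brighter than the mean.
              from PIL import Image

              def average_hash(path, size=8):
                  img = Image.open(path).convert("L").resize((size, size))
                  pixels = list(img.getdata())
                  mean = sum(pixels) / len(pixels)
                  bits = 0
                  for p in pixels:
                      bits = (bits << 1) | (p > mean)
                  return bits

              def hamming(a, b):
                  """Count of bits that differ between two hashes."""
                  return bin(a ^ b).count("1")

              # A recompressed or slightly edited copy lands within a few
              # bits, so one changed pixel no longer defeats the comparison.
              if hamming(average_hash("a.jpg"), average_hash("b.jpg")) <= 5:
                  print("likely the same image")

            Computing this is one downscale and 64 comparisons per image - far from "an awful lot of computing resources".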

      • It's not like most pictures of a kid in a swimsuit are CP, nor even nude pictures.

        You would be surprised what is considered CP sometimes.

        https://jonathanturley.org/200... [jonathanturley.org]

  • by Anonymous Coward

    and everyone else loses: No child is better protected, but content policing is now a free-for-all.

    Thanks, you Bwittish guise, you.

  • by SuricouRaven ( 1897204 ) on Sunday May 01, 2016 @02:57AM (#52021599)

    The IWF is a well-intentioned organisation, but they have no accountability whatsoever. They publish a list of links they claim are child abuse imagery, and ISPs block what's on the list - but the list, for obvious reasons, is super-secret. The process by which the list is generated is also secret - even those who are put on the list are not informed that they are on it. Some (not all) ISPs actively try to prevent those who are censored from finding out, by spoofing a 404 error page rather than explaining that a deliberate block is in place - they certainly aren't going to contact the site operator. Even if someone wrongly blocked finds out (as happened with Wikipedia, only because the block process inadvertently broke the site), there is no appeals process in place. That's a lot of power for an unaccountable and opaque organisation.
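
    To make the spoofing concrete: the behaviour being described amounts to roughly the sketch below (hypothetical; the list entry is invented, and real deployments sit inside the ISP's network rather than in application code).

      # A filter that hides the fact that a block is deliberate: URLs on
      # the secret list get a fake 404 instead of an honest block notice.
      SECRET_BLOCKLIST = {"http://example.org/some-page"}  # invented entry

      def status_for(url):
          if url in SECRET_BLOCKLIST:
              return 404  # looks like "not found", not "censored"
          return 200  # anything else passes through normally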

    • by EmeraldBot ( 3513925 ) on Sunday May 01, 2016 @03:39AM (#52021671)

      The IWF is a well-intentioned organisation, but they have no accountability whatsoever. They publish a list of links they claim are child abuse imagery, and ISPs block what's on the list - but the list, for obvious reasons, is super-secret. The process by which the list is generated is also secret - even those who are put on the list are not informed that they are on it. Some (not all) ISPs actively try to prevent those who are censored from finding out, by spoofing a 404 error page rather than explaining that a deliberate block is in place - they certainly aren't going to contact the site operator. Even if someone wrongly blocked finds out (as happened with Wikipedia, only because the block process inadvertently broke the site), there is no appeals process in place. That's a lot of power for an unaccountable and opaque organisation.

      This is my primary concern as well. Child pornography is something that should be prevented, but people are going overboard with this - it's in the same vein as the war on terrorism. Child pornography is definitely despicable, but most of the efforts against it either are extremely creepy - such as this, handing over power to an almost completely unknown organization - or evoke incredible amounts of self-righteousness, especially when people start accusing each other of this crime without any proof. With all the overreach involved in trying to stop it, it's hard to say you support the effort, especially when the countries most against it consume almost all of it [huffingtonpost.com].

      Furthermore, the approach we use today is fundamentally flawed. Currently, we try to block all images of it, but we can only target those instances that we know of - and even then, it's trivial to add an extra byte here and there to throw off the checksumming. This creates a drive to make more of it, which more people get, before that too gets blocked. It's very profitable for these businesses and only encourages the cycle, so with all these programs in effect we're making the problem worse and worse. Most shockingly of all, when you legalize child porn, rates of it actually go down [springer.com], and sex abuse goes much farther down. Given what we know about ancient societies, where children also engaged in sex and didn't show any signs of being traumatized, it's a really hard issue to grasp, because all of the morals we grew up with are being disproved by numbers. If it weren't for the fact I'd be put on a government watchlist for the rest of my life, I might even suggest that perhaps the issue is more complex than we think.
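
      The extra-byte point is easy to demonstrate (a tiny sketch in Python; "image.jpg" is hypothetical):

        import hashlib

        data = open("image.jpg", "rb").read()
        print(hashlib.md5(data).hexdigest())            # digest of the original
        print(hashlib.md5(data + b"\x00").hexdigest())  # one appended byte, a completely different digest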

      • by Linsaran ( 728833 ) on Sunday May 01, 2016 @04:04AM (#52021717) Homepage

        If it weren't for the fact I'd be put on a government watchlist for the rest of my life, I might even suggest that perhaps the issue is more complex than we think.

        Like almost everything, the issue IS more complex than we think. Drugs, for-profit prisons, whether or not 'hitting your kids' is acceptable. You name a topic and I'm sure I can come up with a half dozen different sides to it. As for the government watch lists, I'm sure we're both on a couple dozen already. There's just the matter of 'is this an issue people care about right now'.

      • by AmiMoJo ( 196126 )

        The proposed Snooper's Charter and GCHQ spying are already driving more and more people to use VPN services for privacy. VPN services usually don't subscribe to the IWF's block list.

        The more you tighten your grip, the more star systems will slip through your fingers

      • by AK Marc ( 707885 )

        Child pornography is something that should be prevented,

        Why? Given that the definition almost everywhere now seems to include simulated child porn that didn't involve an actual child, the issue is obviously not about child harm. The studies show that a pedophile with access to CP offends less, because they have an outlet. So, in actual fact, the only reason to ban CP is because someone finds it icky. Once icky is the only requirement, lots of things get put on the list: being gay, being a Jew, whatever.

        • by swb ( 14022 )

          The problem with your analogy is that being Jewish or Gay isn't inherently icky. Those states of being aren't inherently harmful to anyone, and being Jewish or Gay implies a kind of free association among like-minded adults to engage in those activities.

          Is there any way you can justify the practice of pedophilia outside of simulated imagery? Even sharing such images seems as if it encourages and reinforces impulses which may encourage acting out those impulses, acts which do not involve the free association of like-minded adults.

          • by AK Marc ( 707885 )

            If a person collects simulated imagery of lynching black people, can you justify it by stating that racists are less likely to lynch people if they have a safe outlet for enjoying lynchings?

            Have there been studies done that show that access to pictures of lynchings reduces the number of lynchings? You are making another emotional appeal, rather than sticking to facts.

            The problem with your analogy is that being Jewish or Gay isn't inherently icky.

            Then why do I hear so much about gay being icky? Lots of people think it is. That's one of the reasons why gay marriage is under attack. The very thought of it disturbs some people, about as much as child porn disturbs others.

            Or are we appointing you the emperor of moral standards, so you can decree what is and is not icky,

          • The problem with your analogy is that being Jewish or Gay isn't inherently icky.

            An assortment of cultures have, as a matter of course, had sex with people younger than what we now claim is inherently icky. Maybe it's not inherent. Maybe we make it up. (Maybe that's a good idea; that is another argument — the point is that it is somewhat arbitrary.) I was having sex with a 20-year-old when I was 15. I pursued her; she wasn't initially interested. Did that warp my fragile little mind? I submit that it was warped already.

            • Including the historical predecessor to the western culture in which most posters here reside. It wasn't until the 1800s that a movement to raise the age of consent began to spread, and it was driven in large part by a desire to eliminate prostitution - prior to that, the legal age of consent in most countries was somewhere between ten and thirteen, and the social standard was that menstruation was the signal that a girl had reached adulthood and was ready to be married. Most parents would be very eager to get th

      • by mikaere ( 748605 )

        Most shockingly of all, when you legalize child porn, rates of it actually go down [springer.com], and sex abuse goes much farther down.

        You fucking moron. The study you linked to was about legal, consenting, adult porn vs. general sexual assault. That is, watching people who *legally* and, most importantly, *consensually* engaged in both the sexual activity and its recording is not the same as watching sexual assault - i.e., there are no victims.

        Contrast this with child sexual assault material (aka child porn). The child
        • If the subject of the image is unaware the image has been viewed again, how can they possibly be harmed by the act?

          You have, though, provided a good demonstration of why debate on the subject is so difficult: The socially acceptable view is that child porn is inherently evil and anyone who seeks to watch it is a monster who needs to be locked up or destroyed. This view is so strong that to question it in public is to risk one's reputation and one's career. I personally regard it as a medical issue: No perso

  • by Hognoxious ( 631665 ) on Sunday May 01, 2016 @03:54AM (#52021701) Homepage Journal

    1st Googletard: If anyone finds this there'll be big trouble.

    2nd Googletard: I could say we were doing it as a hypothetical exercise or something.

    3rd Googletard: Won't fly. The first thing they'll ask is why we didn't choose another subject. *Any* other subject.

    1st Googletard: We, paleface? And why didn't you, by the way?

    2nd Googletard: Ummmm....

    3rd Googletard: How about ... we were doing it to help the police?

    2nd Googletard: Froppwoppalattes all round!

  • by Anonymous Coward

    As a web developer with some sites that let users upload files, why can't I get access to the hash table that they're using? Obviously the big players like Google probably have access to a black box of confirmed real images supplied by whatever agency and are using their GIS algorithms to find resizes or crops/edits, but I'd be absolutely shocked if the police don't have a list of file hashes that they use to scan suspects' hard drives to find any known images.

    I want access to that hash list so I can i
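
    The upload-time screen being asked for would only be a few lines if the list were available; a hypothetical sketch ("agency_hashes.txt" is an invented name), with the caveat that an exact hash only catches byte-identical copies, not resizes or re-encodes:

      import hashlib

      with open("agency_hashes.txt") as f:  # one hex digest per line
          KNOWN_BAD = {line.strip().lower() for line in f if line.strip()}

      def screen_upload(data):
          """True if the uploaded bytes exactly match a known-bad hash."""
          return hashlib.sha256(data).hexdigest() in KNOWN_BAD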

  • by Required Snark ( 1702878 ) on Sunday May 01, 2016 @04:36AM (#52021781)
    1984 Winston Smith [wikipedia.org]

    Winston Smith works as a clerk in the Records Department of the Ministry of Truth, where his job is to rewrite historical documents so they match the constantly changing current party line. This involves revising newspaper articles and doctoring photographs—mostly to remove "unpersons," people who have fallen foul of the party.

    Orwell couldn't conceive that Winston would be automated out of a job. That may be the only part he got wrong.

  • "Their report adds that the new technology "should block thousands of their illegal images from being viewed on the Internet."

    Thousands... out of what are probably millions, if not billions, of images?

    Don't get me wrong, I'm absolutely 100% against the sexual exploitation of children, but this seems like less than a drop in the bucket.

    We routinely read about police finding people with tens or hundreds of thousands of child porn images on their computers; how does blocking a few thousand make any significant difference?

    • by dgatwood ( 11270 )

      Ironically, the possession laws are the sole reason why it isn't millions or billions. IMO, if you really want to get tough on harm to children, it has to be legal for people to A. submit photos that they think might be child porn, so that authorities can investigate them, and B. submit photos and metadata to a server to determine whether they match known child porn, with a guarantee that nobody will try to track you down and send you to jail or confiscate your equipment merely for checking a photo that you ha

  • My biggest issue with groups or governments doing this is that they are hiding evidence of a crime. It allows groups to sound outraged, pretend they are doing something about a problem, and claim that they have reduced CP, while safely ignoring the fact that they have done nothing to prevent child abuse and have quite possibly made it worse.

    Why does this sound like the same approach used for issues like prostitution vs rape and human trafficking or drug use vs mental health?

"Confound these ancestors.... They've stolen our best ideas!" - Ben Jonson

Working...