
Google Scans Gmail and Drive For Cartoons of Child Sexual Abuse (forbes.com) 146

A Forbes associate editor covering privacy, cybercrime, and security/surveillance reports on a recently-revealed search warrant.

Instead of investigating a photo, it asked Google to provide information on a suspect who allegedly owned graphic illegal cartoons involving children: That kind of content is potentially illegal to own under U.S. law and can be detected by Google's anti-child sexual abuse material (CSAM) systems, a fact not previously discussed in the public domain, the warrant reveals.... Google also uses machine learning tools to look at files and analyze them for any sign they're of abused children....

As per its legal requirements, Google handed information on what it found, as well as the IP addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which then passed on the findings to the DHS Homeland Security Investigations unit. Investigators used the IP addresses provided by Google to identify the suspect as the alleged owner of the cartoons, and searched his Google account, receiving back information on emails to and from the defendant. It appears the suspect may actually be a known artist. As no charges have been filed, Forbes isn't publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper...

Google, meanwhile, has in recent years released transparency reports showing how many times it reports issues to NCMEC. The figures reveal a disturbing trend. In the first six months of 2021, it found more than 3.4 million pieces of potentially illegal content in 410,000 separate reports. That was up from 2.9 million in 365,000 reports in the last six months of 2020, and well over double that from January to June 2020, when 1.5 million pieces of CSAM material were discovered and reported to NCMEC in 180,000 reports...

As Google doesn't end-to-end encrypt its communications tools like Gmail or its file storage tech like Drive, it's still possible for the tech company to scan for illegal content. And as it has no plans to introduce those features, law enforcement can still rely on Google to warn NCMEC when abuse happens on its servers. Whether the majority of users will want Google to scan people's accounts so it can help find child abusers, or have improved privacy with end-to-end encryption instead, the Mountain View, California-based business will have to struggle with that balance in perpetuity. The same goes for any one of its rivals.


Comments Filter:
  • by Anonymous Coward on Sunday December 26, 2021 @01:55PM (#62116943)
    any material the CCP finds "offensive" tomorrow...
  • (o)(o)

    Clearly illegal, anyone looking at this should go to jail.
  • thirty years ago (Score:5, Insightful)

    by way2trivial ( 601132 ) on Sunday December 26, 2021 @02:03PM (#62116969) Homepage Journal

    I had a cartoon of Bart Simpson getting a BJ from the purple headed twins, and other similar cartoons, maybe a dozen all told.
    Illegal today, because it depicts someone meant to be a minor.

    Am I going to have to worry it may be in one of my archives somewhere? jeez louise.

    • by IdanceNmyCar ( 7335658 ) on Sunday December 26, 2021 @02:17PM (#62117033)

      Yes... that's basically where this is going. I mean, bestiality is banned in most countries too. Does this mean watching some Beauty and the Beast animated porn should be illegal? That's not my cup of tea, but this really opens a can of worms.

      This guy won art competitions, so I would even wager whatever he made was likely "tasteful" in the sense that there is a lot worse shit out there.

      This doesn't even get into "faces of death" kind of stuff, which personally is probably a far worse taboo... I mean, murder is illegal everywhere, so should having a video of a murder be illegal? Because then CCTVs put you at risk. Shit, a CCTV probably captures actual child abuse somewhere, and if it's backed up, that's another "tick" against you.

      So personally fuck this shit.

      • Re:thirty years ago (Score:4, Interesting)

        by AmiMoJo ( 196126 ) on Sunday December 26, 2021 @08:09PM (#62118013) Homepage Journal

        There are US based artists making sexual images of cartoon children right now. They have Patreon pages where they accept money.

        Something doesn't seem right here, either the law is not being accurately described or is often not enforced.

        • Patreon takes them down, just not in an automated fashion. If you were to publish the name of the account here, it wouldn't last long. My guess is that it's gone in less than a year anyway.

          My thing is furries, so I'm pretty sure I can calculate in dog years to determine if they are legal or not. Anything above 3 and I should be fine.

        • Re:thirty years ago (Score:5, Interesting)

          by mysidia ( 191772 ) on Sunday December 26, 2021 @11:37PM (#62118333)

          Something doesn't seem right here, either the law is not being accurately described or is often not enforced.

          It seems that the justification given for the search is Potentially Illegal, not Illegal.

          It is only explicitly illegal if CSAM depicts exploitation of a child. That could not exist in a cartoon involving entirely fictional characters and no actual person, but it could if one of the characters shown in the cartoon represents an actual person.

          They also wrote that no charges have been filed, but Google, apparently, was asked to find the images all the same. It sounds like scope creep.
            What started as a system and method for finding images associated with actual abuse is being expanded to do more and more. Next thing you know it will become a more general search that reports people with uncertain or questionable materials (rather than definitively illegal items) to law enforcement for further study; this can easily expand to other categories of info too, such as potentially-hateful imagery, potentially-subversive or seditious documents, misinformation/propaganda, etc.

          • this can easily expand to other categories of info too, such as potentially-hateful imagery, potentially-subversive or seditious documents, misinformation/propaganda, etc.
            Given the political activism of the parent company, they're already scanning for political right/wrongthink and have a profile being kept on each person they have data on. The TOS allows it, they're in the business of selling intrusiveness and political profiles pay (you can predict what messaging will win elections, for example).
          • Tell GPT-5 to predict the response given your message history if someone asks for child porn. if gpt-5 predicts you can be convinced to provide it, you're guilty

            • by mysidia ( 191772 )

              if gpt-5 predicts you can be convinced to provide it, you're guilty

              First of all, they will change CP to include any legally "obscene" materials, which is a broader, more legally vague, completely separate category of items that various states have laws against with varying degrees of constitutionality. There have been convictions of people on obscenity charges over possessing some cartoons, though not solely for obscenity, if that makes sense.

              More like... Pre-Crime for the purpose of search justification.

      • Re:thirty years ago (Score:4, Interesting)

        by MrL0G1C ( 867445 ) on Monday December 27, 2021 @05:24AM (#62118657) Journal

        I mean murder is illegal everywhere, should having a video of a murder be illegal

        You're understating things here. This is about cartoons, so the question is: should a cartoon depicting murder be illegal? I think we can agree that's a high level of absurdity.

        I don't know why anyone would want to watch "faces of death"; it's grotesque, but I don't want to impinge upon other people's freedom.

        What next, banning schoolgirl fantasy dress outfits? Now, why would women want to dress as schoolgirls anyway? Is it because men actually fantasise about having sex with schoolgirls? Of course it is. So should the St Trinian's films all be banned? There is some obvious sexualization in those.

        Shitty laws like this have a name - thought crimes.

    • I had a cartoon of Bart Simpson getting a BJ from the purple headed twins

      Bart has been around since 1987, and he was ten years old then. So he is 44.

    • by AmiMoJo ( 196126 )

      Strange thing is, that kind of porn is easily discoverable with Google searches, and the sites hosting it don't seem to have legal issues.

      Is there any information on the criteria that an image must meet to get into the naughty list?

      • by MrL0G1C ( 867445 )

        The current law was foreshadowed in May 2008, when the Government announced plans to criminalise all non-realistic sexual images depicting under-18s

        People in the UK aged 16 can legally have sex, but if they draw a cartoon of it afterwards they can be thrown in prison for 3 years. Makes perfect sense, doesn't it? ~sarc

    • Australian legal precedent was set in 2008 with regard to that exact cartoon (quite a few people have been charged with child pornography for having it since): https://www.abc.net.au/news/20... [abc.net.au]

      Let that insanity sink in for a moment. This is a judge explaining what a 'person' is under Australian law:

      "In my view, the magistrate was correct in determining that, in respect of both the Commonwealth and the NSW offences, the word 'person' included fictional or imaginary characters ... the mere fact that the figure

    • All those starfire/raven clips... they be jailbait now.
  • I would certainly like law enforcement to concentrate on where kids may be in actual harm's way, like the current situation with Alex Jones and his kids' step-mom, who has been arrested. But I can also see that such depictions might promote criminal activity, such as Snow White might encourage a group of old men to take in a 7-year-old, or a 14-year-old in the Disney version.
    • Uh, is that sarcasm? Can't tell if you really want to ban Snow White, but it's a good comparison for the issue.

      • Re:Complicated (Score:4, Insightful)

        by fermion ( 181285 ) on Sunday December 26, 2021 @02:29PM (#62117069) Homepage Journal
        To quote Archer: you realize that is like every fairy tale. So, yes, as we subvert the traditional protection of fairy tales, we are going to deal with the fact that they do, on the whole, teach lessons suitable to modern kids. The truth is if your parents throw you out, you will sell your body for food and drugs. The truth is if you get taken in by a gang, they will likely rape you, even at 7. The truth is marrying Prince Charming will be political and you will have to deal with a Madame de Pompadour whom everyone will like better.
        • Interesting points. Reminded me a bit of Akira, which seems pretty clearly to have a scene of underage rape, though since we aren't told the kids' ages, it can be left up to assumption that they are 18+ (though maturity-wise we might guess otherwise). Such a fairy tale does seem to have some of these "life lessons" potentially within reach of the audience.

  • by AlanObject ( 3603453 ) on Sunday December 26, 2021 @02:07PM (#62116979)

    I clicked on a few links looking for just how you write software to detect this type of material and came up pretty empty. They aren't talking about how it works.

    So just how does such a system discriminate between "guilty" and "innocent?" I would think that is far more difficult than Level 5 car automation.

    Given the numbers they do report -- in the millions -- I suspect their "AI" just deals out huge numbers of false positives for an organic unit to sort out. So they're just flooding a government agency with crap data which will keep them from getting real work on child abuse done. Or did I miss something on this?

    • Don't take a deal, and demand source code in court. The state needs to prove in court that it is CP, and if it's just a cartoon that may be very hard; and if it's, say, 8-bit / 16-bit art, then you can point to video games that have stuff that looks like what you have.

      And don't forget the Best Buy case, where a fruit-of-the-poisonous-tree issue killed the case.

      • Wouldn't take much work though... the algorithm flags the images and informs the cops. Cops use the flag to get a search warrant. Before charges are filed, there are "legally" obtained images. Secondly, Google and Apple are going to cover this; their terms and agreements for their cloud storage are almost certainly updated so that by using them you give them permission to send suspected abuse images to the authorities. At which point the cops don't need a warrant, Google's algorithm sent them the images, and by
    • There are organizations that provide a list of known CSAM images - either government, or in partnership with government. I don't know about the US, but the UK has the IWF. Since these are perceptual hashes though, false positives are inevitable - and common.

      I suspect the machine learning part is something of a brag. Hard to say though, as the whole field is really secretive. If they actually said how they find the material, it would be hinting at what countermeasures might hide it. Also some of the organisations...
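The perceptual-hash matching discussed above can be sketched with "average hash" (aHash), one of the simplest schemes. PhotoDNA and whatever Google actually runs are far more elaborate, so treat this as an illustration of the idea only, not of any vendor's real algorithm: shrink the image to a tiny grid, emit one bit per cell, and compare fingerprints by Hamming distance, which is why re-saved or resized copies still match and why unrelated images can occasionally collide.

```python
# Toy "average hash" (aHash) over a 2D grayscale pixel grid.
# Real CSAM-matching systems (e.g. PhotoDNA) are proprietary and more
# sophisticated; this only demonstrates the perceptual-hash principle.

def average_hash(pixels, hash_size=8):
    """pixels: 2D list of grayscale values (0-255). Returns an int fingerprint."""
    h, w = len(pixels), len(pixels[0])
    # Naive nearest-neighbour downscale to hash_size x hash_size.
    small = [
        [pixels[y * h // hash_size][x * w // hash_size] for x in range(hash_size)]
        for y in range(hash_size)
    ]
    flat = [v for row in small for v in row]
    avg = sum(flat) / len(flat)
    # One bit per cell: is this cell brighter than the average?
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A fake 16x16 "image": bright left half, dark right half.
img = [[200] * 8 + [30] * 8 for _ in range(16)]
# The "same" image upscaled to 32x32 (every pixel duplicated).
resized = [[px for px in row for _ in (0, 1)] for row in img for _ in (0, 1)]

h1, h2 = average_hash(img), average_hash(resized)
# Matching uses a distance threshold, not exact equality -- which is also
# where false positives come from.
print(hamming(h1, h2))  # prints 0: the resized copy still matches
```

Because matching is "distance below a threshold" rather than exact, some rate of false positives is baked into the approach, which is presumably why reports are supposed to be human-reviewed downstream.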

    • ... system discriminate between "guilty" and "innocent?"

      Create a database of 'offending' images and compare all files to it. This resulted in the photo of a Vietnamese girl burnt by napalm being flagged as "sexual abuse", and copies of the war-crime photo censored.

      Use AI to identify female areolas and male glans (purple knobs). Anything that matches the pattern is banned.

      Or did I miss something on this?

      Everyone can claim "I'm really concerned and doing something".

    • by AmiMoJo ( 196126 )

      They have databases of image hashes.

      Most of the reports are of websites. My guess would be that they first check whether they are US-based; otherwise there's no jurisdiction, and it's possibly not illegal in the country in which they are based.

      Google probably helped with the tech. They have an image hashing system that can recognise things like resized images, and copies that have been through the JPEG process a few times. Hopefully a human checks everything before any allegations are made.

    • As no charges have been filed, Forbes isn't publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper...

      Fortunately we're not bound by such considerations and can name the child pornographer as one Raffaello Sanzio da Urbino. Here's a sample of his work [aaronartprints.org]. Warning: Possessing or viewing that may be illegal in some US states.

  • by renegade600 ( 204461 ) on Sunday December 26, 2021 @02:16PM (#62117027)

    guess it means anime fanservice and hentai must also be illegal in the US

  • by Rosco P. Coltrane ( 209368 ) on Sunday December 26, 2021 @02:17PM (#62117035)

    You think a few drawings and animes hurt no-one. What about the toons who were kidnapped, trafficked and forced to shoot sick perverted cartoons? Scarred for life, that's what they are. Some of them ended up so mentally destabilized they even agreed to work for Disney...

  • Anyone who uploads non-public data to the cloud without first encrypting it is a fool.

    I use Google Drive for two purposes: to publish stuff I want everyone to read, and as off-site backup for my most valuable data. Before uploading my backups I encrypt them. The key is not known to Google, so Google can't snoop on them. When I download the backup data I decrypt it myself.

    Think of the cloud as "somebody else's computer". Any data you wouldn't give to a stranger is data you shouldn't be uploading to the cloud.
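The encrypt-before-upload workflow described above is simple to script. A minimal sketch, assuming the third-party `cryptography` package (`pip install cryptography`) and its Fernet recipe; the names and the sample payload are illustrative, not anyone's actual backup setup:

```python
# Client-side encryption before upload: the cloud provider only ever
# sees ciphertext, so it cannot scan or snoop on the contents.
from cryptography.fernet import Fernet

# Generate once and store somewhere safe that is NOT the same cloud
# account (password manager, printed copy). Lose the key, lose the backup.
key = Fernet.generate_key()
f = Fernet(key)

backup = b"tax records, family photos, whatever"
ciphertext = f.encrypt(backup)   # this opaque blob is what gets uploaded
restored = f.decrypt(ciphertext) # run locally after downloading it again
assert restored == backup
```

Tools like gpg (`gpg --symmetric backup.tar`) do the same job without any code; the essential point is only that the key never leaves your machine.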

    • by sinij ( 911942 ) on Sunday December 26, 2021 @03:23PM (#62117251)
      Average non-tech person cannot tell what is cloud and what is local, and with Google pushing cloud, it defaults to cloud.
      • Agree, and that extends to reasonably competent people. I use computers heavily in my work and am generally computer literate, but a couple of times I have discovered that some files were linked to the "cloud" without my intentionally having done so, probably due to my clicking through endless "OK" boxes without reading carefully enough.
        • There are institutions, both government and private, that are serious about computer security. How do they make sure their secrets are not accidentally exfiltrated? Perhaps they have a set of "best practices" that could be useful to those of us with small budgets.

          • From what I've seen: classified data is kept on completely physically isolated networks (this may not always be true, but it was true in the cases I know about). Large companies with sensitive data using Windows use Windows Enterprise (which individuals can't reasonably buy), which allows computers to be managed so that users cannot install anything. This also allows control over what cloud services are used. Those companies usually require all company information to be on a company-provided and controlled...
            • Keeping important data on physically isolated networks requires discipline. You have to make sure nobody bridges the gap for his convenience, either by order from the top ("Give me full access to everything or you're fired") or by planting a wi-fi hot spot somewhere. I can see how the military, or a company run by veterans, could have such discipline, but I expect it is rare among the private sector.

              • I think if you are found to have loaded classified data onto a portable storage device, or set up a wireless network, being fired is the least of your worries. Those are very likely felonies - and the people working with classified materials understand the consequences. This is true for private sector defense contractors - like the big weapons companies.
      • by lsllll ( 830002 )
        Fuck. I'm a techie and even I can't tell what data I have on Google's cloud because I use an Android phone with a Google account.
    • G-suite services like Photos are already scanning local files on Android devices for advertising purposes. Scanning for "bad pictures" is the next step. (This was the same path they took in regards to scanning your emails.) After seeing the backlash Apple's program generated, they are not likely to publicize the fact that they are doing this.

      Besides that, the trajectory of how consumer tech works is increasingly to upload your data to "the cloud" by default. Even if you carefully opt-out from that at every

      • Rolling your own smartphone doesn't seem reasonable unless you have a very large IT budget. For those of us without such resources, we can avoid using smartphones, or not put any secrets on them. As far as Windows is concerned, using a distribution of GNU/Linux doesn't seem unreasonable. I am not into computer games, and I don't know of anything else I had to give up when I started using Fedora. My last application switch was from Quicken to GNUcash.

  • Before you get upset, you should remember that Gmail/Google Drive are free services to just about everyone. If you're paying for it then sure, be upset, but you bought in after knowing the ToS and the recent behavior of Alphabet/Google. Everyone that doesn't like this is free to go to other services... and there is no shortage of them.

    I do value my privacy so I use ProtonMail. It's not perfect but it's good. For storage I built a small NAS.

    • For storage I built a small NAS.

      Was it a Lil Nas X?

      Okay, I'll show myself out...

    • Account required to use your phone; phone data synced to cloud as part of OS update without explicit user consent or notification - both phone OS companies have done that.
      Users have to be technically vigilant to have a chance of stopping it, if it's possible at all.
      • If you're using Android or iOS then you've already failed at privacy. Also, it's 100% possible to live life without a smart phone.

  • How can you prosecute someone for possessing a picture? Isn't a picture "expression", and isn't expression protected by the First Amendment?

    The workaround to the First Amendment used by the child porn law is that a picture of child porn is considered evidence of a child having been molested. Law enforcement has a database of known child porn pictures, and if you are caught with one of them you can be prosecuted.

    However, what if I create a picture, purely from my imagination, that depicts a child being molested? That isn't evidence of anything, so if I am prosecuted for possessing it, can I use the fact that I created the picture from my imagination in my defense? Would it help if I demonstrated to the court my ability to create a picture of a child being molested from my imagination?

    A cartoon is even more problematic than a photo-realistic picture. It is probably easy for a skilled graphic artist to demonstrate to a court that he can draw a picture of a child being molested, and therefore such a drawing is not, by itself, evidence of anything.
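The "database of known pictures" check mentioned above can be sketched with exact digests. This is a toy with made-up entries; real deployments use perceptual hashes (PhotoDNA-style) precisely because an exact digest breaks on any re-save or resize:

```python
# Toy denylist lookup with exact SHA-256 digests. Illustrative only:
# the "known bad" entry is a hypothetical placeholder, not real data.
import hashlib

denylist = {
    hashlib.sha256(b"known bad file").hexdigest(),  # hypothetical entry
}

def is_flagged(data: bytes) -> bool:
    """True if this exact byte sequence is in the known-image database."""
    return hashlib.sha256(data).hexdigest() in denylist

print(is_flagged(b"known bad file"))   # True: byte-identical copy matches
print(is_flagged(b"known bad file "))  # False: one extra byte, no match
```

The second call shows the weakness: a single changed byte defeats exact matching, which is why the database approach only establishes possession of a *known* image and says nothing about novel, self-created material.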

    • by bill_mcgonigle ( 4333 ) * on Sunday December 26, 2021 @05:13PM (#62117559) Homepage Journal

      If some sick fuck has cartoon drawing of depraved shit, there's no victim, so no common law crime.

      Does it encourage him or give him an outlet that keeps him off the street?

      People used to have this discussion about violent video games but it's mostly been settled that they reduce overall crime.

      • I don't think people are arguing about depraved shit, it's the vagaries that are the problem.

        If I draw two stick figures having sex, at what point does some algorithm (would this be a TipperGorithm?) find it pedophilic and thus obscene and destroy my public reputation etc?
        When one stick person is only 2/3 the height of the other? Half? Is it possible to draw people having sex with, say, a little person at all? Or a fantasy pic of a slutty hobbit chick?

  • Experts for the age of cartoon characters must be paid a lot then.

  • by joe_frisch ( 1366229 ) on Sunday December 26, 2021 @07:34PM (#62117919)
    Drawings, stories, movies, graphic novels and especially history books talk about a wide range of horrible actions. Some of these are informative, some for entertainment. We allow these because we realized that writing a story about a serial killer is not the same thing as being a serial killer. There doesn't seem to be much doubt about that.

    There was an argument that porn that looked like child porn, even if the actors were all >18, might be used to camouflage real child porn. I don't agree with that argument, but it is not irrational.

    But what is the argument that a drawing of a child is hurting anyone? And if a drawing is illegal, why is that so different from a story? If someone draws a picture of a naked Romeo and Juliette, is that somehow far more damaging to society than the story itself?

    I'm not in favor of scanning of online files for any reason - the risks of false positives, of releasing private information etc, are far too high, even if the scans are looking for real and serious harm. But this?

    The argument not to store anything unencrypted in the cloud is good, but it's easy to get it wrong. Various cloud services will try to entice you to put files in the cloud, and a non-expert could get confused. At one point I discovered Microsoft was storing my desktop in the cloud without my knowingly having given permission (I must have missed an OK box somewhere). So good practice, but easy to get wrong.
    • by gweihir ( 88907 ) on Sunday December 26, 2021 @08:07PM (#62118007)

      Simple: This is not about protecting children at all. Once you accept that this is only and exclusively about inflicting some arbitrary moral ideas on everybody with as much violence as needed, you can see why drawings are treated the same as actual images and why they seem to care much more about images than about actually stopping child abuse. That is simply not their goal. That is just the lie used to facilitate their actual goals.

      • by djinn6 ( 1868030 ) on Sunday December 26, 2021 @09:47PM (#62118205)

        Their goal is to have a hammer they can use against anyone at any time. Even if the target is not prosecuted, once such information becomes public, they're likely to lose their job, membership in their community, friends and possibly family. I guarantee you, the CIA has enough zero-days to get a "bad" image onto your computer no matter how secure you think you are.

        • It's not a very broadly applicable hammer. Most people are not going to have even cartoon CP on their computers. Wouldn't copyrighted material, or some other more common violation, be a much more effective weapon? They could just plant it, but in that case they might as well plant real CP.
        • by narcc ( 412956 )

          Their goal is to have a hammer they can use against anyone at any time.

          Umm... Normal people don't have child pornography on their computer, cartoon or otherwise.

          I guarantee you, the CIA has enough zero-days to get a "bad" image onto your computer no matter how secure you think you are.

          Is that what you're going to tell the judge?

          • by Kokuyo ( 549451 )

            Amusing how you approach your parent poster with derision while immediately making his point.

          • There are plenty of cases where authorities were caught on camera throwing down drugs, even though it's rare for a camera to show just the right spot to make that possible. (i.e. most of the time throw down drugs are used, there is no video proof of that)

            It'd be oh so much easier to seize consumer electronics and add something incriminating prior to the 'official' forensics. No zero-day is needed, just local folks in the chain of custody. And it's so much safer! If there is a slip-up and evidence tampering...
  • Seriously, fuck this world
  • Google Scans Gmail and Drive for Cartoons of Child Sexual Abuse... so they can send them to their friends at CNN.
  • So if I am a lousy artist and make drawings in a similar style to the xkcd characters (or worse), label them as children, and have my stick figures depicting "sort of" stick-figure porn, is that child porn?

    After all, you got all sorts of art, with drawings that don't look like anything to the very realistic paintings.

    Does all it take to make it child porn is for the artist (or someone else) to say it depicts child abuse / child porn?

  • I understand real photos to be illegal, but I draw the line when it is a cartoon or fake digital representation, as it doesn't hurt anybody, and it might even prevent actual abuse. We aren't living in the dark ages anymore. Again, as long as real children aren't abused/used, I really don't give a rat's ass if people get off on cartoons/CGI; it's all fantasy, and people kill/abuse more than enough in video games etc., which are still perfectly legal (and should be, if it is up to me).
