Apple's Anti-Fraud Chief Said Company Was 'The Greatest Platform For Distributing Child Porn' (9to5mac.com)

An explanation for Apple's controversial decision to begin scanning iPhones for CSAM has been found in a 2020 statement by Apple's anti-fraud chief. Eric Friedman stated, in so many words, that "we are the greatest platform for distributing child porn." The revelation does, however, raise the question: How could Apple have known this if it wasn't scanning iCloud accounts...? 9to5Mac reports: The iMessage thread was spotted by the Verge as it works its way through the internal emails, messages, and other materials handed over by Apple as part of the discovery process in the Epic Games lawsuit. Ironically, Friedman actually suggests that Facebook does a better job of detecting it than Apple did: "The spotlight at Facebook etc. is all on trust and safety (fake accounts, etc). In privacy, they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc."

A fellow exec queries this, asking whether it can really be true: "Really? I mean, is there a lot of this in our ecosystem? I thought there were even more opportunities for bad actors on other file sharing systems." Friedman responds with the single word, "Yes." The document is unsurprisingly labeled "Highly confidential -- attorneys' eyes only."

The stunning revelation may well be explained by the fact that iCloud photo storage is on by default, even if it's just the paltry 5GB the company gives everyone as standard. This means the service may be the most-used cloud service for photos -- in contrast to competing ones where users have to opt in. Apple has said that it has been looking at the CSAM problem for some time, and was trying to figure out a privacy-protecting way to detect it. It may well be this specific conversation that led the company to prioritize these efforts.

This discussion has been archived. No new comments can be posted.

  • by iamnotx0r ( 7683968 ) on Friday August 20, 2021 @09:33PM (#61713527)
    Privacy was one way Apple was sort of unique. Illegal stuff is going to happen, but the legal desire for privacy should have been Apple's concern. A slippery slope they are going down. And anyone that gives me s^&$* about children can go and f^%@ off. I am talking about someone deciding when you get privacy. It is something we did not vote on. Companies are going to rule us.
    • by Tyler Watson ( 8553891 ) on Friday August 20, 2021 @09:42PM (#61713557)
      I am very curious as to the mechanism of distribution.

      Who buys, who sells, and how is it that this wasn't public before?

      Every time I hear child pornography is being disseminated on platform X, I research it and find no evidence it's true.

      It is at the point where I think there really is no child pornography on the internet. If you did find it, you would not be able to report it; but if you never found it, you'd be OK.
      • That is a question I have as well. How did this exec know Apple's platform has a lot of child pornography? Really, it should be incidental reporting, not drag-netting. If you take your phone to Apple for repair, and in that process you give them access to your phone and they stumble upon it, they should report you. They should not be systematically searching through your private data for it.
        • by Kitkoan ( 1719118 ) on Friday August 20, 2021 @10:41PM (#61713671)
          Most likely Apple has already been scanning iPhones for this kind of data.

          We already know that Apple included "NeuralHash" in iPhones before iOS 15.
          https://apple.slashdot.org/story/21/08/18/1755223/apples-neuralhash-algorithm-has-been-reverse-engineered

          And most people seem to be forgetting/ignoring that Apple has also admitted it's not just iCloud that is scanning your photos; iMessage is scanning images as well.
          https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life (Basically they are claiming it will only trigger if the user account is flagged as under 13, and it scans the image to see if it's explicit, not whether it matches a known hash. But it is clear that iMessage is scanning the pictures you send.)

          It wouldn't be a huge step to think that Apple decided to scan all of this ahead of time to see if it was a big enough issue to justify developing the exact tools they are now making public. And when they saw that yes, it's a huge issue, they made these tools to help fight it (before it went public via a third party as "iPhone, the phone of pedos worldwide").
          • Most likely Apple has already been scanning iPhones for this kind of data.

            Most likely, Apple has been involved in court cases. The folks who enjoy masturbation while looking at images of prepubescent children are sometimes caught, and law enforcement is curious regarding where they get and store their masturbatory material.

            So in the desire to avoid lengthier prison terms, and given the fact that pedophiles are not terribly popular in prison, they give up the goods on where they store and access their images.

            While Slashdotters freak about privacy and therefore support the purveyors of Kiddie pR0n, law enforcement has probably been in contact with Apple.

            • While Slashdotters freak about privacy and therefore support the purveyors of Kiddie pR0n, law enforcement has probably been in contact with Apple.

              'Has probably been'? Try 'is in daily contact with'. And what does this have to do with your stance? We know Apple fields thousands of LEO requests per year. What bearing should that have on privacy?

              Seriously - if it is such a concern to you, provide the kiddie diddlers with free space on your computer. Privacy demands it, amirite?

              Aaaaand you've lost us completely. What argument are you trying to make, exactly? That Apple are correct in opening up their platforms to rampant abuse by authorities around the globe with a horribly-broken scheme that will cripple the privacy of all their users because... kid touchers need storage? Or something

          • You're talking about future releases, not current functionality. iMessage WILL scan images. IF you turn it on AND the child is under 13.

            How did they know BEFORE that they're a platform for distribution? It's hard to say. They currently hold a backup key to your encrypted iCloud backups just in case you lose yours...but also so they can hand it over to law enforcement. It might be possible that many of the backups that have been decrypted for cops have contained this kind of media.

        • by raymorris ( 2726007 ) on Friday August 20, 2021 @11:33PM (#61713735) Journal

          > How did this exec know Apple's platform has a lot of child pornography? Really, it should be incidental reporting, not drag-netting. If you take your phone to Apple for repair, and in that process you give them access to your phone and they stumble upon it, they should report you.

          And when someone reports child porn on an iPhone, the FBI is going to want some information about whose phone it is. When they find some porn on the phone itself, the FBI is going to want to see what else the perpetrator has on iCloud and other Apple services. So they're going to contact Apple, either informally or with a warrant. *Someone* at Apple is going to know about it. So yeah, there will be an executive who knows, someone who is the authorized contact for the FBI, for example.

          Additionally, there are a couple of different teams at Apple responsible for different aspects of security. A few of those people are going to have accounts on the "dark web" hacker forums and such. They are going to hear things. I don't even specialize in dark web stuff myself, and just doing routine security work I've come across stuff I'd rather not know about. When your job is watching out for bad guys, sometimes you see some stuff related to bad guys.

          Add to that, some of the sickos are going to screw up and give themselves away. Doing things like accidentally marking an album as public. Apple is going to become aware when they're hosting public albums full of CP. That gives an indication. Kinda like when you want to know what Americans think about any particular question, you can poll 0.1% of Americans (or less) and apply statistics to learn something about what Americans in general think. Have a few CP sickos out themselves and you start to learn something about sickos.

          So yeah, there will be a few people at Apple who know.

          • "How did this exec know Apple's platform has a lot of child pornography? Really, it should be incidental reporting, not drag-netting. If you take your phone to Apple for repair, and in that process you give them access to your phone and they stumble upon it, they should report you."

            Unless, of course, you're the son of a presidential candidate, at which point half the electorate and 90% of the media will go to the mat insisting that whatever is on that hard drive was a Russian plot.

        • by kmoser ( 1469707 ) on Saturday August 21, 2021 @02:01AM (#61713901)

          The revelation does, however, raise the question: How could Apple have known this if it wasn't scanning iCloud accounts...?

          Apple just said the platform was great for distributing porn, not that they had any evidence of it having been used for that. It's like saying "my pickup truck is a great vehicle for transporting bricks." That doesn't mean I've actually done so.

          • What Eric Friedman in "B2B Account Management" at Apple really meant is: we will be perceived as the biggest platform for child porn. The guy is a salesman.

            The actual biggest platform would be some site on Tor or similar, which you've never heard of. There was a massive one that got busted a few months back. The pictures come from there, and sometimes end up on an iPhone when someone wants to jerk it in the bathroom.

        • by Reeses ( 5069 )

          How did this exec know Apple's platform has a lot of child pornography? Really, it should be incidental reporting, not drag-netting. If you take your phone to Apple for repair, and in that process you give them access to your phone and they stumble upon it, they should report you.

          At first guess, I'd imagine the FBI is going to Apple and telling them how many child porn pics they're coming across with iPhone-related info in the EXIF data. It's not like these folks are taking the effort to roll out the DSLR to take pics.
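          For illustration, the camera info is right there in the metadata. A minimal Python sketch (the filename is hypothetical, and this assumes the Pillow library) of pulling the make/model out of a photo's EXIF:

              # Read camera make/model from a photo's EXIF data (illustrative sketch only).
              from PIL import Image
              from PIL.ExifTags import TAGS

              exif = Image.open("photo.jpg").getexif()   # "photo.jpg" is a hypothetical file
              for tag_id, value in exif.items():
                  name = TAGS.get(tag_id, tag_id)
                  if name in ("Make", "Model", "Software"):
                      print(name, value)
              # A typical iPhone shot reports Make "Apple" and a Model like "iPhone 12".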

        • That is a question I have as well. How did this exec know Apple's platform has a lot of child pornography?

          Apple, like all the other companies, gets served warrants. They turn over the photos in the iCloud Photo Library. It's encrypted, but Apple currently has the key.

          This new thing, which does the checking before uploading, may well be a precursor to them being able to encrypt the iCloud library in a way that they can't read it.

          The laws are coming, especially in Europe. This on-device checking might let them both adhere to the law and user-encrypt the library.

        • That is a question I have as well. How did this exec know Apple's platform has a lot of child pornography? Really, it should be incidental reporting, not drag-netting. If you take your phone to Apple for repair, and in that process you give them access to your phone and they stumble upon it, they should report you. They should not be systematically searching through your private data for it.

          You can scan files for hash values without seeing the contents of those files. Google does it all the time.

          What are your thoughts about denying scanning for hash values of classified data? Forbidden?
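          In its simplest form that's just comparing file digests against a list of known-bad digests. A rough Python sketch (the blocklist entry and the "uploads" directory are made up; real CSAM matching uses perceptual hashes like PhotoDNA or NeuralHash rather than plain SHA-256):

              # Flag files whose SHA-256 digest is on a blocklist, without ever
              # interpreting the file contents. The blocklist entry is a placeholder.
              import hashlib
              from pathlib import Path

              BLOCKLIST = {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"}

              def sha256_of(path):
                  h = hashlib.sha256()
                  with open(path, "rb") as f:
                      for chunk in iter(lambda: f.read(1 << 20), b""):
                          h.update(chunk)
                  return h.hexdigest()

              for p in Path("uploads").glob("*"):          # hypothetical directory of files
                  if p.is_file() and sha256_of(p) in BLOCKLIST:
                      print("match:", p)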

        • by aurizon ( 122550 )

          Yes, the weevil problem, which is the lesser. Within the 5 GB limit there is plenty of room to save, send, and delete, as the traffickers have learned to delete from clouds etc. There have been many lawsuits where porn was found on various storage methods, by carelessness as well as police action. I am sure this method will fail. A simple app to trim any photo by 0.1% will change the hash and it will not trigger anything. So it will ensnare few. Newly taken pix will not be listed; volume traffickers know what to do.
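          That is certainly how a cryptographic hash behaves: any change at all yields an unrelated digest, as the little sketch below shows. The caveat is that NeuralHash is a perceptual hash designed to survive exactly this kind of small edit, so a trim or re-encode is not guaranteed to defeat it.

              # Dropping even a single byte gives a completely different SHA-256.
              import hashlib

              original = b"\xff\xd8\xff\xe0" + b"image bytes" * 100   # stand-in for a JPEG
              trimmed = original[:-1]                                 # "trim" one byte

              print(hashlib.sha256(original).hexdigest())
              print(hashlib.sha256(trimmed).hexdigest())   # shares nothing with the first digest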

      • >Who buys, who sells

        Almost no one does either of these things. Most of the "distribution" they talk about is pedos using peer to peer to get their CP fix.
      • It is at the point where I think there really is no child pornography on the internet. If you did find it, you would not be able to report it; but if you never found it, you'd be OK.

        You can literally go looking for normal porn with consenting adults and get CP.

        • That sounds sketchy as hell, CP is illegal in most jurisdictions and no player in the adult industry would want to be caught peddling that shit.
          • It happens on file sharing networks a whole lot, not so much on commercial sites, although that has also happened where users can upload directly to the public.

      • "It is at the point I think there really is no child pornography on the internet." You new here or something?
    • Privacy attracts them that need it.

      No mod points today, but this deserves one.
      Any platform offering good privacy is going to attract criminals. Should you do something about that? If you do, you are compromising that privacy; no two ways about it. Apple is trying, and it took exactly no time at all for security experts to poke holes in it and point out the many ways in which this was a bad idea.

      It's a disappointing move on Apple's part, but not more than that. What scares me more is the increasingly popular notion that there is no such thing as a "legal desire for privacy", that those who desire privacy have something bad to hide.

      • What scares me more is the increasingly popular notion that there is no such thing as a "legal desire for privacy", that those who desire privacy have something bad to hide.

        The sad reality may be that there truly is no "legal desire for privacy". When one worries about privacy, one's major concern is usually Big Brother. And hiding from Big Brother is by definition "illegal", as Big Brother wouldn't be Big Brother if they didn't decide the law.

      Privacy was one way Apple was sort of unique. Illegal stuff is going to happen, but the legal desire for privacy should have been Apple's concern. A slippery slope they are going down. And anyone that gives me s^&$* about children can go and f^%@ off. I am talking about someone deciding when you get privacy. It is something we did not vote on. Companies are going to rule us.

      Why don't you buck the trend, and set up your own server farm, and say "We respect your privacy, and don't monitor anything you place on our servers!" Your files, your way.

      You'll have a group flocking to you who are really, really interested in having no one but themselves and other like-minded people looking at their files.

      Won't someone think of the pedophiles?!

    • This is slashdot, you can say shit and fuck.

    • My take on this is that Mega (formerly MegaUpload) is probably the biggest platform for storing and distributing illegal child porn.

      Why? Because it's end-to-end encrypted and there's no way for LEA to know what kind of files people are storing on it. There's also no judge who'll convict its owners on the count of liability. Kim Dotcom and his investors are therefore safe from prosecution, no matter how much CP is being stored there.

      But the fact is, we DON'T really know how much CP is stored on Mega be
  • Weird flex but OK.

    • Weird "flexes" like this are done when the "flexer" doesnt want people to actually think about the subject in a context other than the one presented.

      They wouldnt want you to wonder why Apple users in particular are prone to pedophilia, nor would they want you to examine the demographics of iPhone users trying to piece it together.

      Everything to do with Pedophilia goes down exactly like this. Lets talk about he particulars of Epsteins death, instead of the passenger manifests of his lolita express.
      Lets t
      • Let's talk about Apple scanning your photos instead of a particular demographic of iPhone users clearly being massive fucking pedos.

        If I've learned anything, it's that variation in the human brain is dispersed throughout humanity without regard to class or race. What this means is that the real difference here is accessibility/opportunity. Crimes of opportunity are the most common kind of crime.

  • by Anonymous Coward

    Apple has said that it has been looking at the CSAM problem for some time, and was trying to figure out a privacy-protecting way to detect it.

    The problem is that that is completely nonsensical, which must have been apparent to them all along. You can't have a "privacy-protecting" way to scan everyone's files.

    It comes across as just having two competing marketing objectives - wanting to claim to respect privacy and wanting to claim to be fighting child porn - while not actually caring about the reality of either.

    • >> Apple has said that it has been looking at the CSAM problem for some time, and was trying to figure out a privacy-protecting way to detect it.

      > The problem is that that is completely nonsensical, which must have been apparent to them all along. You can't have a "privacy-protecting" way to scan everyone's files.

      It does seem just about impossible. It reminds me of a problem we had at work a couple months ago. Regulations require that we don't give developers access to a certain database. But of

      • by skegg ( 666571 )

        ASCII art ?

      • Wait until you learn about MIME encoded images

      • by Whibla ( 210729 )

        Discord. Text-only. Can't send CP pics if you can't send pics. :)

        Not to be pedantic, but Discord is definitely not text only. My friends and I regularly post pictures in private channels.

        • Yeah I misspoke when I said Discord. I have several apps installed. On one I was surprised to find I couldn't send pics.

          Anyway, the point being if you think outside the box, things are possible.

      • by mark-t ( 151149 )

        You are probably too busy congratulating yourself on what you perceive as your own brilliance to notice this, but you completely moved the goalposts of what the article was actually talking about.

        They were not talking about a general communication platform where a company could otherwise simply drop the ability for its users to communicate with photos and users could continue to use text; they were talking rather specifically about photos stored in the cloud. Leaving aside the issue that text can be used to

      • You're aware that any binary data can be encoded as text to send on platforms that only support text? The alt.binaries.* groups on Usenet have been doing this en masse for decades. So no, not having pics would be no impediment.
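        Right, uuencode and Base64 have been carrying binaries over text-only channels for decades. A tiny Python sketch of the round trip:

            # Any binary blob survives a text-only channel via Base64.
            import base64

            blob = bytes(range(256))                       # arbitrary binary data
            text = base64.b64encode(blob).decode("ascii")  # pasteable into any text field
            assert base64.b64decode(text) == blob          # receiver recovers the original bytes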
  • Yay! (Score:5, Funny)

    by fahrbot-bot ( 874524 ) on Friday August 20, 2021 @10:09PM (#61713617)

    "we [Apple] are the greatest platform for distributing child porn."

    Apple's #1 ... Apple's #1 ... -- Oh, wait.

  • Have a look at this video discussing the privacy implications of the CSAM scanning.
    https://www.youtube.com/watch?... [youtube.com]

  • by couchslug ( 175151 ) on Friday August 20, 2021 @10:56PM (#61713709)

    If you want privacy you do not look for someone else to give it to you, you do not WANT someone else to provide it, and you use hardware and software which protect your privacy.

    You cannot have internet privacy. You can want it but that's foolish. You can have very limited privacy against private actors at best.

    MSFT and Apple are not your friends. If you view any business as a friend that speaks poorly for your credulity unless you own and run the business as a one-person shop!

    Stallman was right as usual. Read what he wrote and if any questions, read it again. When Slashdot was still a tech site this was better understood.

    https://www.gnu.org/philosophy... [gnu.org]

    "Keep control of your computing, so it doesn't control you!
    by Richard Stallman
    First published in Der Spiegel Online

    The World Wide Web, developed by Tim Berners-Lee in 1990 as a system for publishing and viewing information, is slowly being transformed into a system of remote computing. It will store your data, and data about you, often limiting your access to it but allowing FBI access at any time. It will do your computing for you, but you cannot control what it does. It provides various tempting attractions, but you must resist them.

    In the 1980s, most people did not use computers; those who did, mostly used personal computers or timesharing services. Both allowed you to install software of your choice. Both allowed you full control over your data, though it is not clear what access the timesharing services gave to the FBI. In any case, the timesharing services mostly faded away by the 90s.

    This does not mean that these users had control of their computing. With software, either the users control the program (free software) or the program controls the users (proprietary or nonfree software). Those users were running proprietary software because that's all there was at the time. The users could not change it, or even tell what it really did.

    The abusiveness of proprietary software has intensified since then; nowadays, it is likely to spy on you, intentionally restrict you, and/or have back doors. (Windows is known to do all three; likewise the iPhone and the Kindle.) But even absent such abuse, it wasn't right for users to be controlled by their software.

    That's why I launched the free software movement in 1983. We decided to develop an operating system and applications that would be entirely free (libre, freie), so that the users would have control over them. I gave this system the name GNU. (You have probably heard people call it "Linux", but that's an error.) People who switch to this system, and insist on using only free software, are in a position to control their computing. We have liberated only a small part of cyberspace, as yet, but that is a foothold for freedom.

    Developments in the Web threaten to negate this achievement. The first problem was the use of invisible references to sites whose mission was surveillance (perhaps for advertising). Users who visited sites A, B, X and Z did not realize that those pages contained invisible references to iamwatchingyou.com, so each visit informed that site too, and it recorded permanently that this user had visited certain pages.

    JavaScript created a further problem. Initially used for harmless things such as unusual-looking menus, its capabilities have been extended to the point where it can do nontrivial computing. Services such as Google Docs install large JavaScript programs into the user's browser. Even though they run in your computer, you have no control over what they do there.

    Then there is the issue of storing your data in companies' servers. The largest such companies have little respect for users' privacy. For instance, if you hand your data to Facebook, companies pay Facebook (not you) for the use of it. They pay Facebook (not you) to run ads using your face.

    The timesharing companies of the 1980s had usually treat

  • by bubblyceiling ( 7940768 ) on Friday August 20, 2021 @10:56PM (#61713711)
    With the potential for abuse of the CSAM system being high, all Apple needs to do is put some safeguards in place, such as:

    1) Public process for adding or removing hashes. This will make it a lot harder for it to be abused
    2) Appeal process for false positives, so that people don't face undue legal consequences
    3) Compensation for any damages due to false positives

    I think that should cover it
    • 1) You can’t. It’s illegal to host the images used to generate the hashes unless you’re legally exempted. Only a few organizations are, and Apple is apparently requiring that matching hashes come from two such organizations that are independent of each other, making it much harder for someone to slip something in.

      2) False positives never make it to the authorities in the first place, so there’s no need for an appeal. Once 30ish positives are reported, Apple will have en

      • A quibble on point number 3: the false positive rate for AN ACCOUNT is one-in-a-trillion. It may be that you have a few false positives, but unless you reach the threshold required to reconstruct the key, nobody will ever be able to tell or see what they are.

        • Fair point. Truly so. And I realize as well after hearing some more on the subject that that’s on a per year basis, so it’s even worse.

          Even so, with over a billion Apple devices in use, let’s assume for the sake of argument that’s a billion accounts as well. With that, we can reasonably expect that it’ll take hundreds of years for the first account to reach the manual review step of this process on the basis of false positives. So, for practical purposes, it’s on the orde
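          The back-of-the-envelope arithmetic, for anyone checking (a sketch; the one-in-a-trillion per account per year figure is Apple's claim, and one billion accounts is just an assumption):

              # Expected time until any account is falsely flagged for manual review,
              # taking Apple's stated rate at face value. Both inputs are assumptions.
              accounts = 1_000_000_000                  # assume ~1 billion iCloud accounts
              rate_per_account_year = 1e-12             # Apple's claimed per-account false-flag rate

              flagged_per_year = accounts * rate_per_account_year   # 0.001 accounts per year
              print(1 / flagged_per_year)                           # ~1,000 years until the first one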

      • "2) False positives never make it to the authorities in the first place, so there’s no need for an appeal. Once 30ish positives are reported,"

        Swatting.

        After the first few dozen of these events, people are going to start thinking about their iPhone as a hand grenade under remote control.

  • Makes you feel kind of dirty owning an Apple device.
  • I have a question for everyone here. I need ideas. I need ideas, even ones that can't work.

    We have an impossible problem here.
    You can't both respect privacy AND stop child porn.
    Not with any technology we have today.

    Privacy is very important. Actually a good portion of my work day for the last 25 years has been devoted to keeping private things private.

    ALSO now that I've learned more about child sexual abuse and the child porn economy, I'm quite disturbed by the situation. I'm not happy with the solution Apple is lo

    • 1. Train a neural net to recognize child porn

      Allowing me to send adversarial images to you that to your eyes look like the image you expected, but to that AI look like 5 savages doing a rape train on a little girl.

      Here come the secret warrants. They are for your data. There's an excuse to do this now.
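      The core trick, in toy form: nudge every pixel a tiny amount in whichever direction raises the model's score, in the spirit of the fast-gradient-sign method. A numpy sketch against a made-up linear "classifier" (purely illustrative, not any real model):

          # Toy adversarial-perturbation sketch: an imperceptible per-pixel nudge pushes
          # a linear "classifier" strongly toward "flagged" without visibly changing the
          # image. The classifier weights here are random and purely illustrative.
          import numpy as np

          rng = np.random.default_rng(0)
          w = rng.normal(size=64 * 64)            # toy classifier weights
          x = rng.uniform(size=64 * 64)           # an innocuous image, flattened to [0, 1]

          def score(img):                         # > 0 means "flagged"
              return float(w @ img - w.sum() / 2)

          eps = 0.02                              # maximum per-pixel change
          x_adv = np.clip(x + eps * np.sign(w), 0, 1)   # gradient of score w.r.t. x is just w

          print(score(x), score(x_adv))           # the score jumps sharply toward "flagged"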

    • by Reeses ( 5069 )

      We have an impossible problem here.
      You can't both respect privacy AND stop child porn.
      Not with any technology we have today.

      You mean:

      We have an impossible problem here.
      You can't put a high megapixel camera in everyone's pocket and both respect privacy AND stop child porn.
      Not with any technology we have today.

    • The criminal's rebuttal:
      Step 1: Buy a dumb camera

  • Big difference between "child porn" and horny teenagers sending dick pics to each other.
    I mean, maybe not in the eyes of mentally ill lawmakers, but I think every sane person will agree there's a huge difference.
    • A teenager's dick pic is no more likely to trigger a hash collision than an adult dick pic or a normal photograph.

      The NN isn't trained to identify CP, it's trained to ignore image modification in a similarity search.
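      Same idea as older perceptual hashes: boil the image down to a tiny fingerprint that survives resizing and recompression, then compare fingerprints by Hamming distance. A crude average-hash sketch (NeuralHash uses a neural embedding instead, but the matching step is similar in spirit; the filenames are hypothetical):

          # Crude perceptual "average hash": near-duplicate images land within a few
          # bits of each other; unrelated photos do not. Uses Pillow.
          from PIL import Image

          def ahash(path, size=8):
              img = Image.open(path).convert("L").resize((size, size))
              pixels = list(img.getdata())
              avg = sum(pixels) / len(pixels)
              return [1 if p > avg else 0 for p in pixels]   # 64-bit fingerprint

          def hamming(a, b):
              return sum(x != y for x, y in zip(a, b))

          # A recompressed or lightly cropped copy usually differs in only a few bits.
          print(hamming(ahash("photo.jpg"), ahash("photo_recompressed.jpg")))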

  • How would he know unless he's seen it? If he's seen it, then by law he should be arrested.
  • The future is now
  • That was actually my first question when I read that line: How do you know that you are?

    Could it be that the privacy Apple is peddling is very imaginary?

  • Has he searched all other platforms for child porn, to the point where he can accurately measure this?

    Or should we worry about how well he knows where child porn is to be found? ;)

    • Presumably there is a little back and forth between NCMEC/FBI and Apple. They have case handlers and personal connections; that's exactly the kind of information which is likely to leak.

      These programs have obviously been running for a long time and the pretence that checking hashes is just starting now is a sham perpetrated by Apple.

  • Odd choice of words, given the subject. Maybe Apple needs to have someone other than their ad department making public statements.

  • This just illustrates that Apple is doubling down on their CSAM device scanning and could even expand it to files not being uploaded to iCloud.

    Apple's management is stuck and can't move forward or backwards. Customers will leave the platform and all their investments in becoming the "privacy minded" smartphone platform have become worthless. From now on customers will distrust Apple and start looking for alternatives.

    I suspect in the near future we'll see more and more Apple executives "explaining" th
  • How could Apple have known this if it wasn't scanning iCloud accounts

    Simple: Meta-analysis of records from arrests and trials listing Apple products as sources of evidentiary findings.
