
Wickr, Amazon's Encrypted Chat App, Has a Child Sex Abuse Problem (nbcnews.com) 150

Wickr Me, an encrypted messaging app owned by Amazon Web Services, has become a go-to destination for people to exchange images of child sexual abuse, according to court documents, online communities, law enforcement and anti-exploitation activists. From a report: It's not the only tech platform that needs to crack down on such illegal content, according to data gathered by the National Center for Missing & Exploited Children, or NCMEC. But Amazon is doing comparatively little to proactively address the problem, experts and law enforcement officials say, attracting people who want to trade such material because there is less risk of detection than in the brighter corners of the internet.

NBC News reviewed court documents from 72 state and federal child sexual abuse or child pornography prosecutions where the defendant allegedly used Wickr (as it's commonly known) from the last five years in the United States, United Kingdom and Australia, using a combination of private and public legal and news databases and search engines. Nearly every prosecution reviewed has resulted in a conviction aside from those still being adjudicated. Almost none of the criminal complaints reviewed note cooperation from Wickr itself at the time of filing, aside from limited instances where Wickr was legally compelled to provide information via a search warrant. Over 25 percent of the prosecutions stemmed from undercover operations conducted by law enforcement on Wickr and other tech platforms. These court cases only represent a small fraction of the problem, according to two law enforcement officers involved in investigating child exploitation cases, two experts studying child exploitation and two people who have seen firsthand how individuals frequently use Wickr and other platforms for criminal transactions on the dark web.

  • Wow (Score:5, Insightful)

    by rpnx ( 8338853 ) on Friday June 10, 2022 @02:04PM (#62610134)
    72 cases, this is clearly a rampant problem and this isn't just a convenient excuse to take away privacy protections. /sarcasm Fuck the government.
    • Re:Wow (Score:5, Insightful)

      by jwhyche ( 6192 ) on Friday June 10, 2022 @02:08PM (#62610138) Homepage

      Oh bullshit. Every time one of these services comes along someone finds some child abuse problem on it. What about the big one that has been proven, Facebook? Why don't they ever go after this one?

      • Re:Wow (Score:5, Insightful)

        by dfghjk ( 711126 ) on Friday June 10, 2022 @02:15PM (#62610160)

        or the Catholic Church?

        • Re:Wow (Score:4, Insightful)

          by dryeo ( 100693 ) on Friday June 10, 2022 @10:25PM (#62610792)

          Well, now that they have control of the American Supreme Court, maybe they'll strike down child abuse laws. Nothing in the Constitution about it, and when the Constitution was written, you could just buy a child to abuse.

          • Scary but plausible. Time and time again, it seems to be people on the far right who love to tell everybody else what they're doing wrong who wind up getting caught molesting children. There was a Republican Speaker of the House who even wound up going to jail for his long, distinguished career of raping children.

        • The Southern Baptists, also.

      • because Facebook and their AI and their datasets know where the bodies are buried. And US politicians have a great many skeletons in the closet.
      • What about the big one that has been proven, Facebook? Why don't they ever go after this one?

        Because Facebook actually has a whole department dedicated to scanning this content and reporting it. They did go after it, and with the exception of WhatsApp's encrypted messages, which so far have resisted the "scan all content against a database" approach (that we know of), they have already been forced into submission.

        Just because you don't see something, doesn't mean it isn't happening.

        But as a complete aside here's a cautionary tale: Despite an internal oversight department looking at all your shit the ma

    • by Anonymous Coward

      Unpopular opinion: I don't think the materials themselves need to be illegal. They're already proof of abuse, and that abuse ought to be cracked down upon like a ton of bricks.

      Aside: And then people so inclined need help... a buddy to keep an eye on them and keep their minds off of kiddie-fiddling. Much more effective than throwing them in jail. Preferably assigned before their first misstep, like on their own request, when they feel the urges and know they're wrong, but can't help themselves on their own. Not e

      • There are reasonable grounds to prohibit distribution, since a commercial market (now largely a myth, save for a few isolated cases that get shut down fast) would become a reality and encourage it. But simple possession... I'd agree on the basis that, while there's not a whole lot of evidence on the issue, there are good reasons to believe it provides an outlet for urges that makes someone less likely to offend, and that this effect is greater than the contention that it makes the behavior seem more acceptab
      • by fermion ( 181285 )
        Because such images are illegal it gives law enforcement probable cause. There is often no other way to detect abuse because the abusers are protected and are only caught over time.

        For instance, a son of a pastor was a known child molester, but the pastor thought exposure therapy was the best solution so made the son a youth pastor. In another case, a youth minister was charged with dozens of counts of sexual assault, but went to another state where he was hired as a youth minister, the hiring church saying he w

      • by AmiMoJo ( 196126 )

        If you ask the victims, some of them will tell you that every time one perv jacks off to a photo of them, that's continuing the abuse. Knowing it's out there and being traded must be horrific.

        There is also the argument that trading creates a market for it. I expect some of the people using Wickr are making the stuff, either encouraged by others or selling it to them.

        • Do adult rape victims not feel the same way? What about children that are beaten, tortured, or mutilated, but not in a sexual way or with their genitals visible? You can't even ask the ones that are killed, but I doubt they'd be happy.
          Yet media of all of those are freely available and not subject to laws banning them (and yes there's sick fucks into that). You know how most victims find out someone had their pics? Law enforcement tells them. And in a not insignificant number of cases, the images were like
          • by AmiMoJo ( 196126 )

            Yes, adult rape victims often feel the same way if their assault was filmed and monetized. See the recent "Girls Do Porn" case as an example, it wasn't just the initial abuse but the on-going distribution of the videos.

            Same thing with victims of terrorism, beheading videos, live streams of mass murders etc. I don't know specifically about US law, but in Europe the right to dignity and privacy extends beyond death, and to the families of the victims.

        • If you ask the victims, some of them will tell you that every time one perv jacks off to a photo of them, that's continuing the abuse. Knowing it's out there and being traded must be horrific.

          This logic is horrifically bad.
          If we outlawed everything that triggered a victim of a crime...

          Note: I'm not sure where I stand on CP, really. Obviously it comes from bad shit happening. That bad shit needs to fucking stop. People who do it need to be stopped.
          But I'm not sure that busting people who merely have their hands on the ill-gotten gains of the bad person helps fucking anything. You can argue that locking them up gets rid of potential physical offenders- but that's pretty Minority Report. You ca

    • Re:Wow (Score:5, Informative)

      by ranton ( 36917 ) on Friday June 10, 2022 @03:27PM (#62610308)

      The summary is crap, but if you read the article you would see there does seem to be a significant lack of monitoring on the Wickr platform. Facebook made almost 27 million reports of child sexual abuse material in 2021. Wickr made 15. I couldn't find statistics on usage of Wickr compared to Facebook, but I don't think Facebook has two million times more usage.

      And the methods don't require backdoors for encrypted data, just monitoring it before you encrypt it. That is what other platforms are doing.

      When even Reddit users have started banning posting Wickr handles because of the rampant use of the platform to share child sexual abuse material, there is likely a big problem.
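
      To make the "monitoring before you encrypt" point above concrete, here is a minimal sketch in Python of client-side scanning, assuming a simple hash lookup. Production systems such as PhotoDNA use perceptual fingerprints rather than SHA-256, and every name and value below is illustrative, not Wickr's, Meta's, or anyone else's actual API.

          import hashlib

          # Illustrative database of fingerprints of already-known illegal images,
          # shipped to the client as opaque hash values (placeholders here).
          KNOWN_HASHES = {"3f2a...", "9b1c..."}

          def scan_then_encrypt(image_bytes, encrypt):
              """Fingerprint outgoing media before it ever reaches the E2EE layer.
              On a match, refuse to send (a real client would file a report);
              otherwise hand the plaintext to the normal encryption pipeline."""
              fingerprint = hashlib.sha256(image_bytes).hexdigest()
              if fingerprint in KNOWN_HASHES:
                  return None
              return encrypt(image_bytes)  # encryption happens only after the check

      The privacy objection raised in the replies is exactly that this check runs on plaintext the sender believed only the recipient would ever see.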

      • by rpnx ( 8338853 )
        Oh cool, a platform that doesn't spy on the users. Good advertising strategy. Obviously an encrypted service is going to have fewer reports than Facebook, which reads all your messages. Spying on messages before they are encrypted defeats the point and is a 4th and 5th amendment violation.
        • by ranton ( 36917 )

          Spying on messages before they are encrypted [...] is a 4th and 5th amendment violation.

          The 4th and 5th amendment do not protect you from private companies reading texts you send to their platform.

          Spying on messages before they are encrypted defeats the point.

          Depends on the point of the encryption. If the point is to ensure no one can identify illegal behavior, then yes it does defeat the point. If the point is to make sure no one can read your messages in transit or if someone hacks Wickr's servers, there are methods to do this which allow Wickr to still monitor for illegal behavior.

          • The 4th and 5th amendment do not protect you from private companies reading texts you send to their platform.

            No, but the government acting on it being handed over should really be considered a violation.

            • by ranton ( 36917 )

              The 4th and 5th amendment do not protect you from private companies reading texts you send to their platform.

              No, but the government acting on it being handed over should really be considered a violation.

              A valid search warrant is all you need to provide governments the ability to gain access to and decrypt your data while adhering to the 4th amendment.

              The 5th amendment only applies if the government needs to compel the user to provide a password or decryption key. This is why end-to-end encryption is so contentious, because it requires the government to obtain these keys from the consumer. With standard encryption, they just need to provide a search warrant to the service provider, bypassing any 5th amendm

              • With it being a "private" key, I could argue that the key belongs to me and that it should have 5th amendment protections against them using a third party to retrieve it.

                • by ranton ( 36917 )

                  With it being a "private" key, I could argue that the key belongs to me and that it should have 5th amendment protections against them using a third party to retrieve it.

                  Then you haven't read much case law on this topic.

                  • by rpnx ( 8338853 )
                    Yup, if you hand the key to a third party, only the 4th amendment applies. Keys on your device are also protected by the 5th Amendment (depending on where you live), which doesn't have a warrant escape hatch.
                    • by rpnx ( 8338853 )
                      (supposing you encrypt the disk or key with a passphrase, obviously, if you store the key with no passphrase you're out of luck)
            • No, but the government acting on it being handed over should really be considered a violation.

              Facebook is voluntarily reporting the content. I can tell the government absolutely anything I want about you. That's not a 4th or 5th violation, that's just me being a dick of a friend who you shouldn't have trusted with your information.

      • Do you really feel a pressing need to be monitored on everything you do on the internet?
      • The thing which iMessage doesn't do either? Or WhatsApp?

        • by ranton ( 36917 )

          The thing which iMessage doesn't do either? Or WhatsApp?

          According to a Meta spokesperson, WhatsApp has implemented numerous features to proactively detect and prevent the spread of child exploitation material, including limiting how many people a viral image can be shared with at one time and using photo-matching technology on rules-violation reports submitted to the company by users and on non-encrypted photos found in profile or group avatars.

          • Doubling down huh. Are you astroturfing for an Amazon competitor or a government agency?

            "just monitoring it before you encrypt it" is clearly not the same as limiting the recipient count, monitoring reports and scanning profile pics.

            • by ranton ( 36917 )

              You missed the part about using photo-matching.

              I actually don't care much either way on the E2EE encryption debate, although I guess I lean a little on the side of banning it or heavily regulating its use for consumers. But your claim that WhatsApp isn't doing any more than Wickr, when you can see that isn't true just by reading the article, is just lazy.

              • Tripling down.

                Reports are unencrypted messages sent back to Meta by users, so they are not "just monitoring it before you encrypt it".

              • It's going to be a little hard to ban E2EE as long as there is

                pip install pycrypto

                and python programming tutorials for dummies.
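
                To make the comment's point concrete: the cryptographic primitives behind E2EE really are one package install away. A minimal sketch, assuming pycryptodome (the maintained, API-compatible fork of the abandoned pycrypto); it omits key exchange, identity verification, and forward secrecy, so it is not any real app's protocol.

                    # pip install pycryptodome   (maintained fork of pycrypto, same Crypto namespace)
                    from Crypto.Cipher import AES
                    from Crypto.Random import get_random_bytes

                    key = get_random_bytes(32)              # shared secret, exchanged out of band

                    # Sender: authenticated encryption of one message
                    cipher = AES.new(key, AES.MODE_GCM)
                    ciphertext, tag = cipher.encrypt_and_digest(b"meet at the usual place")
                    wire = cipher.nonce + tag + ciphertext  # the only thing that crosses the network

                    # Receiver: split the framing and verify before trusting the plaintext
                    nonce, tag, body = wire[:16], wire[16:32], wire[32:]
                    plaintext = AES.new(key, AES.MODE_GCM, nonce=nonce).decrypt_and_verify(body, tag)

                Banning E2EE apps doesn't ban these dozen lines, which is the argument being made above.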
                • by ranton ( 36917 )

                  Making something illegal doesn't make it impossible. Murder is illegal but people still find ways to do it.

                  For me, it all comes down to how much the illegal behavior would actually be reduced if E2EE was banned. Sophisticated criminal organizations would just build their own tools to communicate. But your average pedophile wouldn't be able to do this. So how much would the activity really be reduced? My guess is it would still be reduced significantly, but that is a very unresearched opinion.

                  • The thing I don't like about the concept of a ban is that it also bans all legitimate uses of end-to-end encryption, and that is overkill.
                    It's kind of like banning baseball bats or hammers because they can be used as deadly weapons.

                    So if I want to make my own P2P and encrypted file sharing and storage application to store a family's lifetime or genealogy of photos and videos, or to store your medical records, for example, that becomes illegal?

                    That doesn't work for me.
                    May as well just ban "if then else" if put
      • Are you saying that platforms promising end-to-end encryption should first upload your unencrypted message to some server? WTF? China and other authoritarian regimes love you. The rest of us...not so much. Seriously, "it's for the children" is not an excuse to violate everyone's privacy.
        • by ranton ( 36917 )

          Are you saying that platforms promising end-to-end encryption should first upload your unencrypted message to some server? WTF? China and other authoritarian regimes love you.

          You post as if the use of client-side scanning and even banning end-to-end encryption (E2EE) for general consumer use aren't highly debated topics right now. By just about every western government. You would have to include all governments in your list of authoritarian regimes if you think fighting against E2EE is an authoritarian view.

          • You would have to include all governments in your list of authoritarian regimes if you think fighting against E2EE is an authoritarian view.

            Authoritarianism is not a boolean, it is a spectrum, like everything else we talk about in politics. Opposing E2EE is absolutely an authoritarian view. There is really no question about that.

      • Cite for your number? I tried to look it up and found an even larger one: 55.6 million by Q3 2021. https://www.comparitech.com/bl... [comparitech.com]

        The numbers around 20-30 million seem to be talking about 2020.

  • In other words (Score:5, Insightful)

    by Opportunist ( 166417 ) on Friday June 10, 2022 @02:08PM (#62610136)

    It actually protects the privacy of the people using it.

    Yes, every tool like that will be abused. The question is what is more important to you.

    • Think of the children's privacy!

      • If you think of the children all the time, chances are good you're a pedo.

        • No, that would be "Think of the children's privates!" Those few letters make a *really* big difference.

    • Yup.

      Plus, it's not as if getting rid of the protections on Wickr will do anything to protect children. The monsters that do that will simply shift to another service or stand up their own encrypted Jabber servers (indeed, I'd be surprised if that's not already happening).

    • Re:In other words (Score:4, Interesting)

      by JaredOfEuropa ( 526365 ) on Friday June 10, 2022 @02:57PM (#62610272) Journal
      Kiddie porn is the Godwin of discussions on the right to privacy.
      • Not just privacy... Screeching out a "Won't somebody please think of the children?" or any similar equivalent is a skeezy way that people who can't actually argue and defend their point use to try to summarily declare victory and shut down any criticism or discussion on plenty of other topics all across the internet. It's definitely a tipoff to me that the poster is, at best, incapable of defending their position or, more likely, simply not operating in good faith at all; and I should discard them and thei

      • I'd rather side with a pedo than certain governments. Simply out of self interest. I'm too old to get fucked by a pedo, but you're never too old to get fucked by the government.

  • by kyoko21 ( 198413 ) on Friday June 10, 2022 @02:40PM (#62610226)

    What happened to the days when people just got porn through PGP and alt.binaries? These days everyone wants things in an instant and doesn't want to even work for it anymore.... lazy. I'm not condoning the act, but at least I can respect them from a technical perspective that they actually put in the work as opposed to just clicking on the "easy button" to sin.

  • by burni2 ( 1643061 ) on Friday June 10, 2022 @02:44PM (#62610232)

    Pretty strange for a problem.

  • by jopet ( 538074 ) on Friday June 10, 2022 @02:49PM (#62610248) Journal

    You can send USB sticks with images with those services without them doing anything! How has this gone undetected for such a long time!

    Also, dark alleys have a Child Sex Abuse Problem: you can meet there and exchange USB sticks or other media without anyone routinely checking what is going on!

  • Any truly encrypted messaging system would seem to be a "problem". People talking to each other face to face can discuss all manner of things without being tracked. The anonymous, peer-to-peer payment system known as "cash" is being used every single day to transact in illicit goods.

    A messaging app without the ability for attachments might skirt the association with trafficking in child porn. Then again maybe you could uuencode to send images.
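
    The uuencode aside refers to the old Usenet trick of pushing binary files through text-only channels. A minimal sketch of the same idea using base64 from the Python standard library (the stdlib uu module is deprecated); the file name is purely illustrative:

        import base64

        # Any binary file becomes plain ASCII that fits in an ordinary text message...
        with open("photo.jpg", "rb") as f:      # hypothetical file
            as_text = base64.b64encode(f.read()).decode("ascii")

        # ...and the recipient reverses it losslessly.
        original_bytes = base64.b64decode(as_text)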

    If I were the paranoid sort I'd think of these pronouncements as meaning either:
    The system in question isn't actually as private as described, so the pronouncements are intended to herd people towards apps that aren't so secure so they're easier to catch
    or
    The system is pretty darn secure, so either the reporting of child porn, or the actual sending of child porn is done in an attempt to force the system into being less secure.

    On a side note. Systems such as PhotoDNA would seem to encourage the creation of *new* images that haven't been fingerprinted. Wouldn't newly created images result in more child abuse than images from 30 years ago?
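
    On that side note about PhotoDNA: fingerprint systems match perceptual hashes of images already in a database, which is why novel material produces no hit. A rough sketch of the idea using the open-source imagehash library as a stand-in (PhotoDNA itself is proprietary and differs in detail); the file names and threshold are illustrative:

        # pip install pillow imagehash
        import imagehash
        from PIL import Image

        known = imagehash.phash(Image.open("known_reference.png"))   # hypothetical files
        candidate = imagehash.phash(Image.open("incoming.png"))

        # Perceptual hashes compare by Hamming distance, so a re-compressed or lightly
        # edited copy of a known image still matches, but a brand-new image will not.
        if known - candidate <= 8:   # illustrative threshold
            print("matches a known fingerprint")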
    • by rpnx ( 8338853 )
      That's actually a pretty compelling argument against automated scanning. If a human looks, then they can tell, but a machine can't. It seems like all CP regulations are designed to protect abusers and encourage child abuse, like making it illegal to have evidence of the crime. I would say this was intentional, but it's probably just stupid.
      • by gweihir ( 88907 )

        That's actually a pretty compelling argument against automated scanning. If a human looks, then they can tell, but a machine can't. It seems like all CP regulations are designed to protect abusers and encourage child abuse, like making it illegal to have evidence of the crime. I would say this was intentional, but it's probably just stupid.

        Usually, I would agree. But given that the "War on Privacy" is a really old one, I think this is completely intentional and the people behind it know perfectly well what they are doing.

    • by gweihir ( 88907 )

      Indeed, it is. The whole thing is really easy to understand once you realize that the ones pushing here have zero interest in protecting any children. It is all about preventing people from having a private means to exchange ideas, to organize or to simply rant with the government not being able to listen in.

      On a side note. Systems such as PhotoDNA would seem to encourage the creation of *new* images that haven't been fingerprinted. Wouldn't newly created images result in more child abuse than images from 30 years ago?

      Not necessarily if you assume that most child abuse does not get recorded, which is reasonable. But the narrative pushed essentially says that the pictures are the problem and the actual abuse is treate

  • Almost none of the criminal complaints reviewed note cooperation from Wickr itself at the time of filing, aside from limited instances where Wickr was legally compelled to provide information via a search warrant.

    That is stated as a complaint, but that is exactly how it is supposed to work. I don't want companies handing over private info "just because". This article also reeks of "think of the children". If you have genuine end-to-end encryption, the service cannot know what is being transmitted. That is also how it is supposed to work. Law enforcement has to do some old fashioned legwork. Tragic. /s

  • by Pinky's Brain ( 1158667 ) on Friday June 10, 2022 @04:43PM (#62610416)

    I have a hard time believing all the bizarre arguments used against Wickr are honest. Like above in Ranton's post, or the NCMEC vice president talking about Meta scanning Facebook somehow being a proactive way to prevent Whatsapp from being used for child abuse images (Facebook is clearly orthogonal to Whatsapp).

    So let's examine what makes Wickr really special ... no phone number metadata to track.

    Signal keeps phone numbers associated with accounts because they know it keeps them somewhat useful to government. Wickr betrayed government and gave people (among them the wrong people) a little too much privacy. Of course government can't come out and say that, because they don't want Signal and Telegram to change, so you get these silly arguments.

  • ... Wickr was legally compelled ...

    Translation: Somebody did something I don't like, and a third party only obeyed the law. How dare they be less outraged than me!

    I'm guessing these 'world owes me' types are the same people who talk about corporate "responsibility to the shareholders". They want 'bad stuff' to stop happening but won't demand their messages (or business entity) be inspected by the government for legality and wokeness.

  • by GotNoRice ( 7207988 ) on Friday June 10, 2022 @06:18PM (#62610574)
    This is a simple question about privacy. Should people be allowed to have private conversations on the internet? Yes or No. Once you create exceptions to privacy and/or expect services to monitor those conversations for potential illegal activity, they aren't really private anymore. "Think of the children" doesn't change the basic issue at stake here. Today, it's about "the children". In the future, it could be about anything. If Roe vs Wade gets struck down, for example, then are people going to get reported for talking about their illegal abortions on Wickr? Are you going to get reported for smoking a joint if you are in a state where it's still illegal? If you are a Chinese national, are you going to get reported for talking about Taiwan?
    • by gweihir ( 88907 )

      Indeed. But the enemies of privacy and individual freedoms cannot let it be that simple, because then it becomes obvious that they are the problem here. Hence they obfuscate the question and make it about something else and then claim that this other thing _must_ be solved at all cost.

      It is simply lying by misdirection about their true goals and sabotaging any discussion that could expose them. Because anybody that is not OK with fighting child abuse (pictures) in every way possible must be a child abuser o

  • Ford, maker of vehicles that are used in bank robberies and getaways, has a crime issue. /sarcasm
    phuck this article. bad people are going to do bad things. That doesn't mean you turn society into a fascist surveillance state... oops, too late.
