
Bot Generated Fake Nudes of Over 100,000 Women Without Their Knowledge, Says Report (forbes.com)

An anonymous reader quotes a report from Forbes: Some 104,852 women had their photos uploaded to a bot on the WhatsApp-like messaging app Telegram, which then used them to generate fake nudes of the women without their knowledge or consent, researchers revealed on Tuesday. These so-called "deepfake" images were created by an ecosystem of Telegram bots that generate fake nudes on request, according to a report released by Sensity, an intelligence firm that specializes in deepfakes.

The report found that users interacting with these bots were mainly creating fake nudes of women they know, from images taken from social media, which are then shared and traded on other Telegram channels. The Telegram channels the researchers examined had 101,080 members worldwide, with 70% coming from Russia and other Eastern European countries. A small number of the individuals targeted by the bot appear to be underage. According to the report, the bots received significant advertising on the Russian social media website VK. However, the Russian social platform's press team told Forbes that these communities or links were not promoted using VK's advertising tools, adding "VK doesn't tolerate such content or links... and blocks communities that distribute them."

  • Why am I not sexualized?

    I demand fake nudes of me too!

    Waaaa!

    • Because most foot fetishists prefer feet in sexy shoes.

    • by AmiMoJo ( 196126 )

      Post a link to your onlyfans or stop complaining.

    • by fazig ( 2909523 )
      For that to happen there must be someone who likes you or hates you enough in the first place.
    • Why am I not sexualized?

      That's a good question.

Can't we end this issue once and for all via a government program to create a fake nude of every woman in the country and post them on a web site? Give it a couple of years and nobody will care, and revenge porn will be a thing of the past.

  • Can anyone who has used this confirm if this is just basic face swapping tech?
    • by Z80a ( 971949 )

It's probably a bit more advanced than that, but it's still "neural network invents stuff"

I've seen a lot of not very advanced work around AI, neural networks, and deep whatever. Oftentimes it's not complicated at all; it just takes a large amount of processing power that we are now on the cusp of having even in every phone. Face swapping or face filtering in particular seems like something that has in the last few years become normal and available even on the cheapest phones and numerous websites. The terminology of the article in this case even seems improperly used. I've neve
      • I bet it's not much more than face-swapping. I mean, how are you going to tell whether it's a legitimately faked nude or just some random body cut&pasted below the head? Even more so since a cut&paste takes little effort while a deepfakes-style fake takes considerable effort. So it's presumably a bunch of random bodies with real heads whose actual bodies are nothing like the fakes shopped onto them.

Which leads to a second issue, 99% of the population aren't Instagram-model gym rats. Do you rea
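To make concrete how little effort the cut&paste route takes, here is a minimal sketch of that kind of composite using Pillow. The file names, sizes, and coordinates are hypothetical placeholders; a convincing swap would additionally need alignment, color matching, and blending, but the core operation really is one paste.

```python
# Minimal cut&paste composite with Pillow (pip install Pillow).
# File names, sizes, and coordinates are hypothetical placeholders.
from PIL import Image

head = Image.open("head_crop.png")            # a head cropped from one photo
target = Image.open("body_photo.png").copy()  # the photo to paste it onto

head = head.resize((120, 150))                # eyeball the scale to match
target.paste(head, (200, 40))                 # one paste call does the "swap"
target.save("composite.png")
```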

    • Yes...we need to "confirm" this. O:-)
      For science!

    • by AmiMoJo ( 196126 ) on Friday October 23, 2020 @09:43AM (#60639590) Homepage Journal

      It removes clothes and replaces them with AI generated bodies. The quality is extremely poor. Either the viewer is desperate or they are more interested in "revenge porn" type scenarios where the goal is to humiliate the victim.

I would think if the clothing in the non-nude image is form fitting, the AI generated bodies ought to be pretty good. The corpus of full frontal nudes is almost unlimited, and it ought not to be very hard to map that imagery onto the discovered clothed body topologies, especially if you have some decent corpus of nude and clothed people. Fortunately, the film and TV industry has generated a pretty big database of people visible both clothed and unclothed.

        I would think video deepfaking would be much, much harder.

    • by ceoyoyo ( 59147 )

      It's not face swapping. It's a generative model. It uses a picture as input and generates an image in response. Typically there's some random noise as well, so you could run it many times to get slightly different images.

      Basically it's an automated version of sending the picture to a not very ethical Photoshop artist.
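For anyone wondering what "picture plus noise in, image out" means in practice, here is a minimal PyTorch sketch of a noise-conditioned image-to-image generator. The tiny untrained network and all names are illustrative assumptions rather than the bot's actual model; the only behavior it demonstrates is the one described above: feeding the same input image with different random noise yields slightly different outputs.

```python
# A toy noise-conditioned image-to-image generator (pip install torch).
# Untrained and illustrative only: it shows the "image + noise in,
# image out" shape of such a model, not the actual bot.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    def __init__(self, noise_dim=16):
        super().__init__()
        self.noise_dim = noise_dim
        # Input: 3 RGB channels plus the noise tiled as extra channels.
        self.net = nn.Sequential(
            nn.Conv2d(3 + noise_dim, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
            nn.Tanh(),  # outputs an image in [-1, 1]
        )

    def forward(self, img, z):
        # Broadcast the noise vector across the image's spatial grid,
        # then concatenate it with the image along the channel axis.
        b, _, h, w = img.shape
        z_map = z.view(b, self.noise_dim, 1, 1).expand(b, self.noise_dim, h, w)
        return self.net(torch.cat([img, z_map], dim=1))

gen = TinyGenerator()
img = torch.rand(1, 3, 64, 64)           # stand-in for an uploaded photo
out1 = gen(img, torch.randn(1, 16))      # same photo, one noise draw...
out2 = gen(img, torch.randn(1, 16))      # ...another draw
print((out1 - out2).abs().mean())        # nonzero: the outputs differ
```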

  • Because I could see guys using it to create profile pics... my head on a more attractive body? Yes please

  • If yes, prosecute and convict.

    If you can't and you won't, writing a strongly-worded piece about it ain't gonna help anyone.

    • by AmiMoJo ( 196126 ) on Friday October 23, 2020 @09:47AM (#60639600) Homepage Journal

It's a bit of a grey area in many jurisdictions. On the one hand many are now criminalizing "upskirt"-type shots, but when it comes to fake images victims usually have to rely on harassment laws.

      Because that's mostly what this is about, harassing people with fake nudes or messages about their hate wanking. Sometimes on Twitter you see people posting fake nudes or polaroids covered in seamen.

      • "polaroids covered in seamen"

        Ahoy! I do remember Polaroid instant cameras being large and bulky, but nothing approaching carrier size. Hell, not even pontoon boat size.

        • Ahoy! I do remember Polaroid instant cameras being large and bulky, but nothing approaching carrier size. Hell, not even pontoon boat size.

There are new Polaroid-style film cameras out on the market now, along with film for them, by Fuji Instax [fujifilm.com]....

Film hasn't gone away and, in fact, is making a bit of a comeback... all formats: 35mm, instant, medium format.

Heck, they are making new film backs for the new Instax film for old cameras like the classic Hasselblad V System, which are still very popular

Writing a strongly worded letter may be the first step toward making it illegal.

    • by HiThere ( 15173 )

I've no idea whether it *is* illegal, but I think making it more than a civil case would be the wrong decision. Doing this kind of thing is going to get easier and easier as time goes by. Financial penalties are a reasonable compromise. Make doing it commercially illegal with an enforceable law, but just doing it, or even doing it and sharing it, should be a civil case. (Commercially should include hiring someone else to do it as well as being paid to do it.)

      Then we come to the problem of proving who did what, w

I'm not sure it should be illegal. This is basically just a fancy head swap. Who does it hurt if someone generates a fake nude picture of Michelle Obama, Melania Trump, or any other famous person? It's not an actual nude. I can see a time in the near future where you could have a porn movie that lets you swap the main character for any random person you wanted, whether it is your wife, a celebrity, or your neighbor's wife. The same could be done with feature length films at some point, where you could put your face on Iron Man if you wanted. It's coming and we should probably just accept it and move on with our lives.

I'm not sure it should be illegal. This is basically just a fancy head swap. Who does it hurt if someone generates a fake nude picture of Michelle Obama, Melania Trump, or any other famous person? It's not an actual nude. I can see a time in the near future where you could have a porn movie that lets you swap the main character for any random person you wanted, whether it is your wife, a celebrity, or your neighbor's wife. The same could be done with feature length films at some point, where you could put your face on Iron Man if you wanted. It's coming and we should probably just accept it and move on with our lives.

Civil cases place the burden of investigation and the initial cost of enforcement on the victim, and unless you can show monetary damage you have no case. I think it's reasonable to outlaw this the same way many jurisdictions have outlawed revenge porn, and I believe "think of the children" is a more valid concern than famous people.

I think it's reasonable to outlaw this the same way many jurisdictions have outlawed revenge porn, and I believe "think of the children" is a more valid concern than famous people.

We have laws against revenge porn because it is an invasion of privacy. There was no invasion of privacy here any more than if you put your neighbor's head on the body of Elmo. This doesn't really fall in the "think of the children" category either. We already have laws restricting real porn from minors. This should fall in the same category. I don't see how it should be treated any differently than normal porn or computer generated porn. Copyright laws apply, as does possibly using someone's likeness without permission.

            • I believe distributing someones likeness in a sexual context without their permission should be illegal, separate from copyright.

I think it's reasonable to outlaw this the same way many jurisdictions have outlawed revenge porn, and I believe "think of the children" is a more valid concern than famous people.

We have laws against revenge porn because it is an invasion of privacy. There was no invasion of privacy here any more than if you put your neighbor's head on the body of Elmo.

No invasion of privacy? The woman who is the subject of this is suddenly put in a sexual context against her will, and her likeness is used as pornographic material. I would say that if something is done to you, as a person, against your will, it falls under the definition of invasion of privacy regardless of whether these pictures are fake or not. From the PoV of the victim, it is probably not better that they are faked, since she didn't consent to making these pictures and could do nothing to stop it.

              You cannot

It's also pretty hard to outlaw someone just drawing a nude picture of someone well known. This is similar to the child porn problem where people create "child porn" from scratch: it doesn't include the implied crime of child abuse, because there is no actual child ever involved.

          At this point you're just quickly moving into thought crime territory, especially when it comes to fake nudity of adults.

          The actual illegality probably comes from misrepresenting the imagery as the person actually nude and aggre

It's hardly a new issue. I recall one case where a man cut the heads off kids in clothing catalogs, physically pasted them onto the adults in a porn mag, and was subsequently convicted and sentenced for production of CP, same as if he had recorded real kids. Which is, IMHO, stupid.
I recall one case where a man cut the heads off kids in clothing catalogs, physically pasted them onto the adults in a porn mag, and was subsequently convicted and sentenced for production of CP, same as if he had recorded real kids.

Last I heard: though claiming constitutionality, despite the First Amendment, because of the alleged harm to children victimized in the production of the material, the federal child porn laws nevertheless ban possession of pornographic cartoons and other drawings of generic child figures.

        • by marcle ( 1575627 )

          That would work fine in a more enlightened society. The problem is that when someone sees a fake nude body with, say, some celeb's picture, they assume it's the actual nude body. There's no easy way for the celeb to refute that -- what do they do, post a pic of their actual bod to prove it? So while it might not be actual revealing of somebody's bod, it still has potential for embarrassment and harm to reputation.

    • by ranton ( 36917 )

      If yes, prosecute and convict. If you can't and you won't, writing a strongly-worded piece about it ain't gonna help anyone.

Strongly worded pieces are often where laws originate. This is one way politicians know where their constituents stand on issues. There will probably eventually need to be new laws to govern the use of this technology, or at least new case law. The current technology is probably not very sophisticated, in the sense of actually matching the generated naked body closely to the actual individual's body, but it will get much better. Just imagine where the technology is likely to be 20-30 years from now. Our society needs

      • Strongly worded pieces are often where laws originate.

        No, substantial material interests are often where laws originate. Practically no laws originate in strongly worded pieces without a substantial monetary backing.

does anyone have a link to said bot? just asking so I don't click on it by mistake.
    • by EmoryM ( 2726097 )
Given the claims, you'd think one of the outlets would post something with the face blurred out so we could get some idea of the quality, but nothing comes up. Although I do quite enjoy photographs of nude women, the internet is brimming with that - I am honestly curious about the quality of the images generated by the bot. Hopefully someone here has the balls to just name the damn thing instead of treating it like Lord Voldemort.
  • Does the article have a link to these photos? Asking for a friend.

  • any samples available?? (asking for friend)

  • As a European ... (Score:5, Insightful)

    by nospam007 ( 722110 ) * on Friday October 23, 2020 @10:19AM (#60639750)

    ..if you ever went to a nude beach, you don't WANT this app.

  • Pics or it didn't happen.

  • In that particular case, the bot is a service, and the fact that you have to upload a picture and download the result can be attacked if you are not allowed to do so. I'm quite sure that will end up as an argument against end-to-end encryption BTW. Boo, people can send pictures of your kid to a bot and get fake child porn in return and no one will know.

But things will become interesting when bots like that become available locally. Take a picture, any picture, and see a nude version, entirely on your own computer.

I think you're overestimating the problem of self-satisfaction with this tech; as you say, people have been imagining all sorts of crazy things in their own heads for ages, and the tech is neutral as to what images are mixed and their context. The problem is distribution, and if it's a service then that's distribution. I believe few will speak out in favor of an exception for personal usage when up against parents talking about how their kids don't want to go to school because someone is distributing pictures

  • The bots are so good, they start to have fantasies, just like humans...
  • by hawk ( 1151 ) <hawk@eyry.org> on Friday October 23, 2020 @01:26PM (#60640550) Journal

    I am a lawyer, but this isn't legal advice. If you want that, pay my retainer first . . .

There *are* jurisdictions where the right to control one's likeness is legally protected. California comes to mind, with the right even surviving death (yeah, it's a Hollywood law).

Under such a law, the victim may have a legal right of action if her recognizable face was distributed.

    hawk, esq

  • So clickbait now includes security companies repurposing software to harvest your email address, which is what you need to offer up to read the actual report.

  • where are they
Imagine how much political heat and public discourse would be generated if you took men's faces, pasted them onto ugly, fat, unflattering bodies with a micro dick, and spread them around.

    I would guess suddenly a whole lot of people would have a big problem with the technology and its application.
