China Privacy Technology

Chinese Deepfake App Zao Sparks Privacy Row After Going Viral (theguardian.com) 32

A Chinese app that lets users convincingly swap their faces with film or TV characters has rapidly become one of the country's most downloaded apps, triggering a privacy row. From a report: Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has already prompted concerns elsewhere over potential misuse. Users provide a series of selfies in which they blink, move their mouths and make facial expressions, which the app uses to realistically morph the person's animated likeness onto movies, TV shows or other content.

The company was forced to issue a statement on Sunday pledging changes after critics attacked the app's privacy policy, which gave it "free, irrevocable, permanent, transferable, and relicenseable" rights to all user-generated content. There has been growing concern over deepfakes, which use artificial intelligence to appear genuine. Critics say the technology can be used to create bogus videos to manipulate elections, defame someone, or potentially cause unrest by spreading misinformation on a massive scale. "We understand the concerns about privacy. We've received the feedback, and will fix the issues that we didn't take into consideration, which will take some time," a statement released by Zao said. Zao is owned by Momo Inc, a Tinder-like dating service that is listed on the US Nasdaq.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by nehumanuscrede ( 624750 ) on Monday September 02, 2019 @03:26PM (#59149902)

    Sounds like a FANTASTIC way to populate a facial recognition database to me.

    Even more so when your targets voluntarily provide you with all the data you could ever want.

    • Exactly. Isn't that app that showed the "old" you the same thing?
      • Exactly. Isn't that app that showed the "old" you the same thing?

        But that was for the Russians. Now our Chinese friends are asking for our help.

    • Snapchat was cheaper to develop and operate, and equally good for populating a face database.
  • Narcissists everywhere will be eating this up.

  • by sheramil ( 921315 ) on Monday September 02, 2019 @05:11PM (#59150166)

    ... someone uses it to create a fake of Premier Li Keqiang, at which point the mass arrests will begin and the organ farms will be at capacity.

  • by meerling ( 1487879 ) on Monday September 02, 2019 @05:12PM (#59150172)
    Sometimes the malware is in the license agreement, and not the software.
    • Imagine someone respecting an agreement from a Chinese software company claiming ownership of derivative products. Not only do they feel entitled to other nations' IP, but now they want to own the user? It's as if Adobe started asking for payments on work created in Photoshop.

      This technology was bred from China's efforts to do facial recognition and oppress its people. They weed out dissent and non-compliance.

      Imagine respecting an agreement like this -- where a company expects to own you because you re

  • just my $.02 (Score:4, Interesting)

    by PinkyGigglebrain ( 730753 ) on Monday September 02, 2019 @06:15PM (#59150328)

    I think the worst thing about these "deep fakes" is that if someone posted REAL images of some public figure doing something really horrible, like say raping a 9 year old, no one would believe it. And there would always be "reasonable doubt" if it ever even got to trial.

    • by AmiMoJo ( 196126 )

      This might actually be a good thing. We don't want to be overly reliant on video evidence. It should come under scrutiny and require additional supporting evidence.

      • There's no problem relying on video evidence if it's reliable evidence. That is the whole issue here, that video evidence will no longer be reliable. Taking that tool away will not help get anyone closer to the truth. (What it will do is put conspiracy theorists into a frenzy.)

        In case you don't know how things actually happen in court: 9 times out of 10 the only non-circumstantial evidence that exists, besides video, is personal testimony. Which is, to put it lightly, of variable quality.

    • Just in time to save someone from that Pee tape.

    • If a video was ever enough in the past, it should not have been, especially for public figures who might be targeted by powerful people with a lot of money. Since nearly the dawn of video, it has been within the reach of people with money to hire close look-alike actors, use makeup, and exploit the fuzziness of footage made to look like surveillance video to create something more convincing than most of these deep-fakes. Nation-state propaganda machines have certainly been doing it for decades.

      I think

  • by Jarik C-Bol ( 894741 ) on Monday September 02, 2019 @06:21PM (#59150338)
    So far, every deepfake video I've seen has been deeply entrenched in the uncanny valley, setting off all the lizard-brain alarm bells that something is fundamentally wrong with the image in question. Maybe I've only been shown mediocre examples of the tech, but as far as I've seen, it pales in comparison to the hype surrounding it.
    • It would be nice if there were a website or even a YouTube video with examples of all the various deepfake software that's been out.

      The ones I've seen have all been bad porn ones and while they don't look "real", they do look like they're getting pretty close.

    • by AmiMoJo ( 196126 )

      This app is remarkably good and fast. It's not perfect, but consider that in the space of about a year we went from a desktop computer needing days to train a neural network on dozens of images, and then a few more days to process short video clips, to a phone doing it in under 10 seconds. That's remarkable.

      The really interesting thing here is that it's improving so fast. It's going to be real-time and perfect very soon.

    • by Kjella ( 173770 )

      That is only going to be a matter of time. I've been experimenting a little recently with StyleGAN [github.com] and it does an incredibly good job of "decomposing" a face into plausible artificial faces; they're not CGI so much as combinations of macro/micro properties of different faces. Then you have systems now that, instead of the single-transformation-matrix deepfakes you've seen so far, reverse engineer StyleGAN parameters to match actual faces. Once you have this you can start controlling them more
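      For the curious, here is a minimal sketch of that "reverse engineering" idea, commonly called GAN inversion: optimize a latent code until the generator's output matches a target photo. It assumes PyTorch, and ToyGenerator is a hypothetical stand-in for a real pretrained StyleGAN; real pipelines typically add perceptual losses on top of the plain pixel loss used here.

      ```python
      import torch
      import torch.nn as nn

      class ToyGenerator(nn.Module):
          """Hypothetical stand-in for a pretrained face generator (e.g. StyleGAN)."""
          def __init__(self, latent_dim=64, img_pixels=3 * 32 * 32):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Linear(latent_dim, 256), nn.ReLU(),
                  nn.Linear(256, img_pixels), nn.Tanh(),
              )

          def forward(self, w):
              # Map a latent code to a small RGB image in [-1, 1].
              return self.net(w).view(-1, 3, 32, 32)

      def invert(generator, target, latent_dim=64, steps=200, lr=0.05):
          """Optimize a latent code so the generated image approximates `target`."""
          w = torch.zeros(1, latent_dim, requires_grad=True)  # start from an "average" latent
          opt = torch.optim.Adam([w], lr=lr)                  # only the latent is optimized
          for _ in range(steps):
              opt.zero_grad()
              loss = torch.nn.functional.mse_loss(generator(w), target)  # plain pixel loss
              loss.backward()
              opt.step()
          return w.detach()

      if __name__ == "__main__":
          g = ToyGenerator()                         # in practice: a frozen, pretrained StyleGAN
          target = torch.rand(1, 3, 32, 32) * 2 - 1  # stand-in for a real, aligned face photo
          w_hat = invert(g, target)
          print("recovered latent code shape:", tuple(w_hat.shape))
      ```

      With a frozen, pretrained generator and a real photo in place of the toy stand-ins, the same loop is the step that recovers controllable parameters for an actual face.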

  • Computer-created actors are expensive and require great skill. John Candy died in the middle of a movie. Computers were used to generate his image and actions, and you could not spot the false John Candy. This new software could make it super easy, and we could have new Elvis movies endlessly, or John Wayne movies, or whatever. Clint Eastwood is departing this world, and many people would love new Clint Eastwood movies.
  • While I do not consider myself a conspiracy-theory person, it does seem as though there is a small subset of the population that makes a living out of being "outraged" by this or that. Then, in turn, the media makes money by happily headlining this fake outrage.
    I never got why anyone gives a shit about what the professionally outraged say.
    Like that WWI FPS game that included women and minorities in high numbers to appease the professionally outraged. I think it was BF 5, but I cannot recall exactly.
    So, r

    • You will need to explain how conspiracy theorists are equivalent to "the outraged." They're disconnected from reality, but from what I've seen they mostly have a calm demeanor:

      The frizzy-haired "Ancient Aliens" guy seemed to get excited, but never outraged.

      The guys talking about "Illuminati" want to sound enlightened, but they don't sound particularly emotional.

      Moon landing deniers for the most part sound like history nerds discussing history. (Without actually knowing the history or understanding the scie

  • Critics say the technology can be used to create bogus videos to manipulate elections, defame someone, or potentially cause unrest by spreading misinformation on a massive scale.

    We've always been able to fake evidence, manipulate elections, defame, cause unrest by spreading misinformation, etc. It's not a new problem. It's probably as old as people have lived in community.

    The only change is that there is now an easy-to-use tool for doing it, so the capability is no longer limited by cost and required expertise to those who are the most powerful. This threatens them with the loss of the capability. I say the "loss" of the capability because the eventual result should be that everyone simpl

