Chinese Deepfake App Zao Sparks Privacy Row After Going Viral (theguardian.com)
A Chinese app that lets users convincingly swap their faces with film or TV characters has rapidly become one of the country's most downloaded apps, triggering a privacy row. From a report: Released on Friday, the Zao app went viral as Chinese users seized on the chance to see themselves act out scenes from well-known movies using deepfake technology, which has already prompted concerns elsewhere over potential misuse. Users provide a series of selfies in which they blink, move their mouths and make facial expressions, which the app uses to realistically morph the person's animated likeness on to movies, TV shows or other content.
The company was forced to issue a statement on Sunday pledging changes after critics attacked the app's privacy policy, under which it claimed "free, irrevocable, permanent, transferable, and relicenseable" rights to all user-generated content. There has been growing concern over deepfakes, which use artificial intelligence to make fabricated video appear genuine. Critics say the technology can be used to create bogus videos to manipulate elections, defame someone, or potentially cause unrest by spreading misinformation on a massive scale. "We understand the concerns about privacy. We've received the feedback, and will fix the issues that we didn't take into consideration, which will take some time," a statement released by Zao said. Zao is owned by Momo Inc, a Tinder-like dating service that is listed on the US Nasdaq.
Beware of strangers bearing gifts (Score:5, Insightful)
Sounds like a FANTASTIC way to populate a facial recognition database to me.
Even more so when your targets voluntarily provide you with all the data you could ever want.
Re: (Score:1)
Re: (Score:1)
Exactly. Isn't that app that showed the "old" you the same?
But that was for the Russians. Now our Chinese friends are asking for our help.
Re:Beware of strangers bearing gifts (Score:5, Informative)
yup. privacy folks cry wolf every time something cool is released that deals with content already in the public forum.
Yes, of course, those crazy "privacy advocates". Every single time. Apart from the times when it's done by a company based in an authoritarian country with actual gulags and concentration camps and a history of actually killing their political opponents. But, apart from that, every. single. time. Well, except for the ones done by companies based in places with laws allowing the government to view any data held by private companies, basically making the companies an extension of the government, as in Fascism. But, with that small exception, every. single. time.
Re: (Score:1)
based in an authoritarian country with actual gulags and concentration camps and a history of actually killing their political opponents
Are you talking about China or the US?
The US has prison slavery that is similar to a gulag. Concentration camps for migrant kids. Murdering political opponents hasn't quite got started yet, but certain politicians aren't doing anything to discourage it either...
Re: Beware of strangers bearing gifts (Score:1)
Re: Beware of strangers bearing gifts (Score:1)
Awesome! (Score:2)
Narcissists everywhere will be eating this up.
Re: (Score:2)
Re: (Score:2)
How does that help? If you can fake a digital photo, it's trivial to create a film negative from it.
It's all fun and games until... (Score:4, Interesting)
... someone uses it to create a fake of Premier Li Keqiang, at which point the mass arrests will begin and the organ farms will be at capacity.
Just goes to show... (Score:5, Insightful)
Re: (Score:2)
Imagine someone respecting an agreement from a Chinese software company claiming ownership of derivative products. Not only do they feel entitled to other nations' IP, but now they want to own the user? It's as if Adobe started asking for payments on work created in Photoshop.
This technology was bred from the efforts of China to do facial recognition and oppress its people. They weed out dissent and non-compliance.
Imagine respecting an agreement like this -- where a company expects to own you because you re
just my $.02 (Score:4, Interesting)
I think the worst thing about these "deep fakes" is that if someone posted REAL images of some public figure doing something really horrible, like say raping a 9-year-old, no one would believe it. And there would always be "reasonable doubt" if it ever even got to trial.
Re: (Score:2)
This might actually be a good thing. We don't want to be overly reliant on video evidence. It should come under scrutiny and require there to be additional supporting evidence.
Re: (Score:2)
There's no problem relying on video evidence if it's reliable evidence. That is the whole issue here, that video evidence will no longer be reliable. Taking that tool away will not help get anyone closer to the truth. (What it will do is put conspiracy theorists into a frenzy.)
In case you don't know how things actually happen in court: 9 times out of 10 the only non-circumstantial evidence that exists, besides video, is personal testimony. Which is, to put it lightly, of variable quality.
Re: (Score:2)
Just in time to save someone from that Pee tape.
Re: (Score:2)
If a video was ever enough in the past, it should not have been - especially for public figures who might be targeted by powerful people with a lot of money. Since near the dawn of video, it has been within the reach of people with money to hire actors that are close look-alikes and use makeup and the fuzziness of something made to look like a surveillance video to create a more convincing video than most of these deep-fakes. Nation-state propaganda machines have certainly been doing it for decades.
I think
Still wedged in the uncanny valley (Score:3)
Re: (Score:2)
It would be nice if there was a web site or even a youtube video with examples of all the various deepfake software that's been out.
The ones I've seen have all been bad porn ones and while they don't look "real", they do look like they're getting pretty close.
Re: (Score:2)
This app is quite remarkably good and fast. Not perfect but considering that we went from it taking a desktop computer days of training a neural network on dozens of images and then a few days to process short video clips to having your phone do it in under 10 seconds, and all in the space of a year, it's remarkable.
The really interesting thing here is that it's improving so fast. It's going to be real-time and perfect very soon.
Re: (Score:2)
That is only going to be a matter of time. I've been experimenting a little recently with StyleGAN [github.com] and it does an incredibly good job at "decomposing" a face into plausible artificial faces; they're not CGI, but combinations of macro/micro properties of different faces. Then you have systems now that, instead of doing the single-transformation-matrix deepfake you've seen so far, start to reverse engineer StyleGAN parameters to match actual faces. Once you have this you can start controlling them more
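The "combining macro/micro properties of different faces" idea the parent describes is StyleGAN's style mixing: each face is driven by per-layer latent vectors, and swapping the coarse layers of one face into another transfers pose and face shape while keeping fine detail. A toy numpy sketch of just that mixing step (the layer count, latent size, and `style_mix` helper are placeholders for illustration, not StyleGAN's actual API):

```python
import numpy as np

# Hypothetical per-layer latents: 14 layers, 512-dim each (placeholder shapes).
# Coarse layers (low indices) control pose/shape; fine layers control texture.
NUM_LAYERS, LATENT_DIM = 14, 512

rng = np.random.default_rng(0)
w_a = rng.standard_normal((NUM_LAYERS, LATENT_DIM))  # latents for face A
w_b = rng.standard_normal((NUM_LAYERS, LATENT_DIM))  # latents for face B

def style_mix(w_coarse, w_fine, crossover):
    """Take layers [0, crossover) from w_coarse, the rest from w_fine."""
    mixed = w_fine.copy()
    mixed[:crossover] = w_coarse[:crossover]
    return mixed

# Mixed face: A's coarse structure, B's fine detail.
w_mixed = style_mix(w_a, w_b, crossover=4)
```

In the real model these mixed latents would be fed to the generator; the sketch only shows how the per-layer swap itself works.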
A Blessing (Score:1)
professionally outraged (Score:2)
While I do not consider myself to be a conspiracy theory person, it does seem as though there is a small subset of the population who makes a living out of being "outraged" by this or that. Then in turn, the media makes money by happily headlining this fake outrage.
I never got why anyone gives a shit about what the professionally outraged say.
Like that WWI FPS game that included women and minorities in high numbers to appease the professionally outraged. I think it was BF 5, but I cannot recall exactly.
So, r
Re: (Score:2)
You will need to explain how conspiracy theorists are equivalent to "the outraged." They're disconnected from reality, but from what I've seen they mostly have a calm demeanor:
The frizzy-haired "Ancient Aliens" guy seemed to get excited, but never outraged.
The guys talking about "Illuminati" want to sound enlightened, but they don't sound particularly emotional.
Moon landing deniers for the most part sound like history nerds discussing history. (Without actually knowing the history or understanding the scie
deepfakes are a good thing (Score:2)
Critics say the technology can be used to create bogus videos to manipulate elections, defame someone, or potentially cause unrest by spreading misinformation on a massive scale.
We've always been able to fake evidence, manipulate elections, defame, cause unrest by spreading misinformation, etc. It's not a new problem. It's probably as old as people have lived in community.
The only change is that there is now an easy-to-use tool for doing it, so the capability is no longer limited by cost and expertise to those who are the most powerful. This threatens them with the loss of that capability. I say "loss" because the eventual result should be that everyone simpl
re (Score:1)