Google, YouTube and Venmo Send Cease-and-Desist Letters To Facial Recognition App That Helps Law Enforcement (cbsnews.com)
Google, YouTube and Venmo have sent cease-and-desist letters to Clearview AI, a facial recognition app that scrapes images from websites and social media platforms, CBS News has learned. The tech companies join Twitter, which sent a similar letter in January, in trying to block the app from taking pictures from their platforms. From the report: Clearview AI can identify a person by comparing their picture to its database of three billion images from the internet, and the results are 99.6% accurate, CEO Hoan Ton-That told CBS News correspondent Errol Barnett. The app is available only to law enforcement, for use in identifying criminals, Ton-That said. "You have to remember that this is only used for investigations after the fact. This is not a 24/7 surveillance system," he said. But YouTube, which is owned by Google, as well as Venmo and Twitter say the company is violating their policies. [...] In addition to demanding that Clearview AI stop scraping content from Twitter, the social media platform demanded that the app delete all data already collected from Twitter, according to an excerpt of the cease-and-desist letter given to CBS News. Update: LinkedIn is joining the party.
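For readers wondering what "comparing their picture to its database of three billion images" means mechanically, here is a minimal, hypothetical sketch of embedding-based nearest-neighbor matching. Clearview's actual pipeline is not public; the embeddings, the helper function, and the toy database below are illustrative assumptions only.

```python
# Hypothetical sketch of embedding-based face matching (NOT Clearview's
# unpublished system). A face is reduced to a numeric vector ("embedding");
# identification is a nearest-neighbor search over a database of vectors.
import numpy as np

def cosine_similarity(query: np.ndarray, database: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and every database row."""
    q = query / np.linalg.norm(query)
    d = database / np.linalg.norm(database, axis=1, keepdims=True)
    return d @ q

# Toy database: 1,000 faces, each represented by a 128-dimensional embedding.
rng = np.random.default_rng(0)
database = rng.normal(size=(1_000, 128))

# A query embedding (in reality produced by a face-recognition model);
# here it is simply a noisy copy of entry 42.
query = database[42] + rng.normal(scale=0.05, size=128)

scores = cosine_similarity(query, database)
best = int(np.argmax(scores))
print(f"best match: index {best}, similarity {scores[best]:.3f}")
```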
Even the DMCA can't stop law enforcement (Score:2)
Even the DMCA can't stop law enforcement
Re: Even the DMCA can't stop law enforcement (Score:2)
It doesn't work against anonymous entities, but named ones are another story.
Good luck with that (Score:2)
Re: (Score:2)
Sure, TOS, sure (Score:1)
YouTube is a site where anyone anywhere can upload anything, from instructions on how to build a fishing pole to reviews of baking grease to walkthroughs of 30-year-old video games. But they're somehow not a public venue. Nope.
it's a technology (Score:4, Insightful)
Keep trying to un-invent facial recognition. People like me will keep telling you technologies can't be un-invented.
You have to focus on reforming the government power structures you don't trust. Or eliminate them if you can't reform them -- and then replace them with a new, redesigned agency if you still need their work done.
Prohibiting them from using a universally available technology isn't going to work. If they can't be trusted, they'll use it anyway.
Re: (Score:2)
It's all about money. Do you honestly think those companies give 2 shits about us and are doing this to help protect us?
They clearly see how lucrative this can be, and Clearview probably won't sell to them. So now we get to see how children behave when they don't get what they want.
Re: (Score:3)
You can't uninvent technology, but you can regulate its implementation.
- We learnt how to split the atom, but we've limited our use of nuclear power and weapons. :-)
- We learnt how to clone, but we've limited cloning of human beings.
- We invented video calling, but preferred text messages instead.
We invented facial recognition, but we can and should limit its implementation if we perceive a risk to society.
This whole "you can't stop technology" line is a technological determinist narrative that Silicon Valley has been pushing.
Re: (Score:2)
Attempts to regulate or prohibit facial recognition aren't going to work. Use of it is undetectable. And anyone can set it up anywhere on the Internet, out of reach of your regulations but still available for all to use.
Pushing on this rope is a futile distraction from solving the real problems.
99.6 Percent (Score:2)
With 3 billion choices and 99.6% accuracy, that's roughly 12,000,000 mismatch errors across the 3,000,000,000 people in the DB, and of course a 100% mismatch rate for anyone who isn't in it at all.
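A quick back-of-the-envelope check of that arithmetic, assuming "99.6% accurate" can be read as a 0.4% per-lookup error rate (the company has never actually defined the figure):

```python
# Back-of-the-envelope: 0.4% error over a 3-billion-image database.
# Assumes "99.6% accurate" means 0.4% of lookups return the wrong person,
# which is one generous reading of an undefined marketing number.
database_size = 3_000_000_000
error_rate = 1 - 0.996                      # 0.4%

expected_mismatches = database_size * error_rate
print(f"{expected_mismatches:,.0f} expected mismatches")   # -> 12,000,000
```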
Re: (Score:2)
The poor accuracy makes it useless: one chance to be right, many more chances to be wrong.
The same thing happened with DNA evidence: millions-to-one p-values in a population of billions, and incorrect convictions were made as a result.
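The DNA comparison is the classic base-rate (or "prosecutor's") fallacy: an impressive-sounding match probability still produces many innocent hits when the candidate pool is enormous. A rough illustration with made-up numbers:

```python
# Base-rate illustration with made-up numbers: a "1 in 10 million" random
# match probability sounds damning, yet in a population of billions you
# still expect hundreds of innocent people to match by chance alone.
random_match_probability = 1 / 10_000_000   # hypothetical lab figure
population = 3_000_000_000

expected_random_matches = population * random_match_probability
print(f"innocent people expected to match by chance: {expected_random_matches:.0f}")  # ~300
```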
That's rich ... (Score:1)
They want in. (Score:2)
They most likely tried to buy Clearview, and failed miserably. Now they're going to do everything they can to make THEM miserable, then either buy them out, or release their own service.
Re: (Score:2)
If you don't want (Score:2)
people using your picture, then maybe you shouldn't put it on the internet.
Re:If you don't want (Score:5, Interesting)
Sorry, but putting a photo online doesn't mean you're releasing it into the public domain for anyone to use as they like. Photos are still copyrighted and using them - especially in a commercial product - is still copyright infringement. To give an example, if I post a vacation photo to Instagram, that doesn't give a vacation planning company the right to use my photo in their advertisement.
Re: (Score:3)
This company isn't infringing any copyright. All they are doing is correlating publicly facing data.
Re: If you don't want (Score:5, Insightful)
They downloaded the images from Twitter, Google, etc. and then proceeded to use the copyright-protected images for commercial purposes, for which they have no authorization from the copyright holders (Twitter, Google, etc., on the other hand, did have the copyright holders' permission). Where I live (the UK) that's an open-and-shut breach of copyright law, and because it is for commercial gain they can be hit with big fines.
Basically they are making unauthorized copies of copyright-protected material, which is a no-no. Doing it for commercial gain makes it way worse. In the USA they are liable for trillions of dollars in damages.
Re: (Score:2)
Sounds like a grey area. Is creating a data set from the pictures copyright infringement? Now you're using derivative data to compare samples against.
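Whether the derived data counts as a "copy" is exactly the grey area: what a matching system typically stores is not the photo itself but a short numeric vector computed from it. A minimal sketch using the open-source face_recognition library, chosen purely for illustration (there is no public information on what Clearview actually stores); the image filenames are hypothetical.

```python
# Illustration of "derivative data" using the open-source face_recognition
# library: a photo is reduced to a 128-number vector, and matching compares
# vectors rather than images. This is NOT Clearview's system, just a public
# library used to make the grey area concrete. Filenames are hypothetical.
import face_recognition

known_image = face_recognition.load_image_file("alice_profile.jpg")        # hypothetical file
unknown_image = face_recognition.load_image_file("crime_scene_still.jpg")  # hypothetical file

known_encoding = face_recognition.face_encodings(known_image)[0]    # 128-d vector
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]

# True/False: is the distance between the two vectors within the default tolerance?
match = face_recognition.compare_faces([known_encoding], unknown_encoding)[0]
print("same person?", match)
```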
Re: (Score:2)
Laws don't work by vague analogy. They have text. It tends to have a very specific meaning. (And when it doesn't, it's to take away your rights, not to grant you new ones.)
Re: (Score:2)
I won't. But how do I stop people who don't have my permission, but got my picture one way or another?
AI to identify porn actors/actresses: done (Score:2)
Pornhub advertised in 2017 that it was using technology to index videos based on the actors in them. This should EASILY be usable to identify the people who appear in such non-consensual content and automatically block it. https://www.techspot.com/news/71365-pornhub-using-artificial-intelligence-identify-porn-stars.html. The fact that they use "fingerprinting" of videos instead of this advertised functionality shows they are relying on the lowest-bar techniques to remove content, even though higher-tech techniques clearly exist.
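For contrast, "fingerprinting" usually means something like perceptual hashing: it matches a specific file or frame, not a face, so a re-shot or re-edited video slips past it. A rough sketch using the imagehash library, offered as an assumption for illustration (Pornhub has not published how its fingerprinting works); the frame filenames are hypothetical.

```python
# Rough sketch of perceptual-hash "fingerprinting" with the imagehash
# library (an assumption for illustration; the real system is not public).
# A perceptual hash matches near-identical frames or files, so it cannot
# find the same *person* in new footage the way face indexing could.
from PIL import Image
import imagehash

original = imagehash.phash(Image.open("uploaded_frame.png"))    # hypothetical frame
reupload = imagehash.phash(Image.open("reuploaded_frame.png"))  # hypothetical frame

# Small Hamming distance means the frames are almost certainly the same image.
distance = original - reupload
print(f"hash distance: {distance} (<= 5 usually indicates a duplicate)")
```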
Re: IT SHOULD/MUST BE MADE "FAIR USE"!!! (Score:2)
Except it isn't
Shoe on the other foot? (Score:1)
Not taking sides on the specifics, but didn't a similar situation occur when news publishers complained about browsers linking to their articles?
Public data only (Score:2)
Easy fix (Score:1)
If you're dumb enough to put your photo (Score:2)
on a social media platform, then you deserve whatever someone else might do with that photo. There's nothing else to discuss.
Gestapo upset they can't break the law (Score:2)
I has the sads.
Poor bloody LinkedIn (Score:1)
For reals, Google? (Score:1)