AI Google Privacy The Courts Youtube Your Rights Online

Google, YouTube and Venmo Send Cease-and-Desist Letters To Facial Recognition App That Helps Law Enforcement (cbsnews.com) 54

Google, YouTube and Venmo have sent cease-and-desist letters to Clearview AI, a facial recognition app that scrapes images from websites and social media platforms, CBS News has learned. The tech companies join Twitter, which sent a similar letter in January, in trying to block the app from taking pictures from their platforms. From the report: Clearview AI can identify a person by comparing their picture to its database of three billion images from the internet, and the results are 99.6% accurate, CEO Hoan Ton-That told CBS News correspondent Errol Barnett. The app is available only to law enforcement, to be used to identify criminals, Ton-That said. "You have to remember that this is only used for investigations after the fact. This is not a 24/7 surveillance system," he said. But YouTube, which is owned by Google, as well as Venmo and Twitter say the company is violating their policies. [...] In addition to demanding that Clearview AI stop scraping content from Twitter, the social media platform demanded that the app delete all data already collected from Twitter, according to an excerpt of the cease-and-desist letter given to CBS News. Update: LinkedIn is joining the party.
  • Even if they were successful, it would do nothing to prevent some third party from scraping all of this data that the companies and the users of their platforms make public and selling the whole pile to Clearview AI or some other company doing the same thing. People need to realize that they shouldn't be so cavalier in slapping their data up all over the internet and leaving it open to the rest of the world. There might be a bit more of an argument if the data was only being shared privately and not with every o
    • It is also very easy for Twitter to run some of their images through Google Image Search and determine if anyone is using them, regardless of where they were directly sourced from.
  • Youtube is a site where anyone anywhere can upload anything from instructions on how to build a fishing pole to reviews of baking grease to walkthroughs of 30 year old video games. But they're somehow not a public venue. Nope.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday February 06, 2020 @11:12AM (#59697836)
    Comment removed based on user account deletion
  • it's a technology (Score:4, Insightful)

    by Kohath ( 38547 ) on Thursday February 06, 2020 @11:19AM (#59697862)

    Keep trying to un-invent facial recognition. People like me will keep telling you technologies can't be un-invented.

    You have to focus on reforming the government power structures you don't trust. Or eliminate them if you can't reform them -- and then replace them with a new, redesigned agency if you still need their work done.

    Prohibiting them from using a universally available technology isn't going to work. If they can't be trusted, they'll use it anyway.

    • It's all about money. Do you honestly think those companies give 2 shits about us and are doing this to help protect us?
      They clearly see how lucrative this can be, and Clearview probably won't sell to them. So now we get to see how children behave when they don't get what they want.

    • You can't uninvent technology, but you can regulate its implementation.

      - We learnt how to split the atom, but we've limited our use of nuclear power and weapons.
      - We learnt how to clone, but we've limited cloning of human beings.
      - We invented video calling, but preferred text messages instead :-)

      We invented facial recognition, but we can and should limit its implementation if we perceive a risk to society.

      This whole "you can't stop technology" is a technological determinist narrative that Silicon Valley has b

      • by Kohath ( 38547 )

        Attempts to regulate or prohibit facial recognition aren't going to work. Use of it is undetectable. And anyone can set it up anywhere on the Internet, out of reach of your regulations but still available for all to use.

        Pushing on this rope is a futile distraction from solving the real problems.

  • With 3 billion choices and 99.6% accuracy, that's 12,000,000 mismatch errors across the (at most) 3,000,000,000 people in the DB, and of course a 100% mismatch rate for those who are not.

    • Comment removed based on user account deletion
        The poor accuracy makes it useless. One chance to be right, many more chances to be wrong.

        The same thing happened with DNA evidence: millions-to-one p-values applied to a population of billions, and incorrect convictions were made as a result.
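The base-rate arithmetic behind this sub-thread is worth making explicit. A minimal sketch, using the 3-billion-image and 99.6%-accuracy figures quoted in the summary:

```python
# Back-of-envelope check of the mismatch figure quoted above.
# Assumed inputs: 3 billion database images, 99.6% per-comparison accuracy.
db_size = 3_000_000_000
accuracy = 0.996

error_rate = 1 - accuracy            # 0.4% of comparisons go wrong
expected_false_matches = db_size * error_rate

print(f"{expected_false_matches:,.0f} expected false matches")
# → 12,000,000 expected false matches
```

Even a very high per-comparison accuracy still yields millions of wrong candidates at this scale, which is the point the DNA-evidence comparison is making.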

  • Google, YouTube, Facebook, etc have built their multi-billion dollar businesses on collecting personal images and profiles of their own users and others, based on implied consent, which they then sell to whomever for whatever purpose. Yet when another company sells services based on data mined from their sites, they want it 'to cease and desist'. It's all a bit rich.
  • They most likely tried to buy Clearview, and failed miserably. Now they're going to do everything they can to make THEM miserable, then either buy them out, or release their own service.

  • people using your picture, then maybe you shouldn't put it on the internet.

    • Re:If you don't want (Score:5, Interesting)

      by Jason Levine ( 196982 ) on Thursday February 06, 2020 @11:38AM (#59697948) Homepage

      Sorry, but putting a photo online doesn't mean you're releasing it into the public domain for anyone to use as they like. Photos are still copyrighted and using them - especially in a commercial product - is still copyright infringement. To give an example, if I post a vacation photo to Instagram, that doesn't give a vacation planning company the right to use my photo in their advertisement.

      • This company isn't infringing any copyright. All they are doing is correlating publicly facing data.

        • by jabuzz ( 182671 ) on Thursday February 06, 2020 @02:06PM (#59698714) Homepage

          They downloaded the images from Twitter, Google, etc. and then proceeded to use the copyright-protected images for commercial purposes, without any authorization from the copyright holders (Twitter/Google etc., on the other hand, did have the copyright holders' permission). Where I live (the UK) that's an open-and-shut breach of copyright law, and because it is for commercial gain they can be hit with big fines.

          Basically they are making unauthorized copies of copyright-protected material, which is a no-no. Doing it for commercial gain makes it way worse. In the USA they are liable for trillions of dollars in damages.

          • Sounds like a grey area. Is creating a data set from the pictures copyright infringement? Now you're using derivative data to compare samples against.

      • by Kohath ( 38547 )

        Laws don't work by vague analogy. They have text. It tends to have a very specific meaning. (And when it doesn't, it's to take away your rights, not to grant you new ones.)

      • That analogy doesn't work, as in that case you have a direct copy. For the app, they are very likely just breaking the image down into essential usable elements (ratios between various bits, colors, etc). This causes two problems with the argument: first, it is systematic information gathered from the pic, not a creative expression; second, it is effectively an interpretation of the pic, not a copy of it. The only legal barrier here is the TOS of the sites, but that's a pretty weak barrier.
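The "interpretation, not a copy" point can be illustrated with a toy sketch. This is a hypothetical illustration, not Clearview's actual algorithm: a matching template reduces a photo to a few derived numbers (ratios between facial landmarks), from which the original image cannot be reconstructed.

```python
# Toy illustration (NOT any real face-recognition pipeline): a matching
# "template" is a handful of derived measurements, not a copy of the photo.
def toy_template(eye_left, eye_right, nose, mouth):
    """Derive two dimensionless ratios from (x, y) landmark coordinates."""
    eye_dist = abs(eye_right[0] - eye_left[0])      # horizontal eye spacing
    eye_to_nose = abs(nose[1] - eye_left[1])        # vertical eye-to-nose gap
    nose_to_mouth = abs(mouth[1] - nose[1])         # vertical nose-to-mouth gap
    # Nothing in this pair of ratios can reconstruct the original picture.
    return (eye_to_nose / eye_dist, nose_to_mouth / eye_dist)

print(toy_template((30, 40), (70, 40), (50, 60), (50, 75)))
# → (0.5, 0.375)
```

Whether reducing a copyrighted photo to such derivative measurements still counts as making a copy is exactly the grey area the thread is debating.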
    • Sounds great until somebody else puts your picture on the internet, namely Facebook, without your permission.
    • by memnock ( 466995 )

      I won't. But how do I stop people who don't have my permission, but got my picture one way or another?

  • Pornhub advertised in 2017 that it was using technology to index videos based on the actors in the video. This should EASILY be usable to identify the people that are involved in such unwilling content, and automatically block it. https://www.techspot.com/news/71365-pornhub-using-artificial-intelligence-identify-porn-stars.html. The fact that they are using "fingerprinting" of videos instead of this advertised functionality shows they are using the lowest-bar techniques to remove content, but high-tech te

  • Not taking sides on the specifics, but didn't a similar situation occur when news publishers complained when browsers linked to articles?

  • "Google can pull in information from all different websites," he said. "So if it's public and it's out there and could be inside Google search engine, it can be inside ours as well"
  • Google could easily morph some of the pictures to make matching fall off... give users this option when files are uploaded, as a privacy measure... a small increase in the distance between the eyes, add a few shadows, move the nose slightly up, etc etc... I'm sure you can add random stuff to pics where the AI won't match shit
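The perturbation idea in the comment above can be sketched in miniature. This is a hypothetical illustration only: real face matchers are fairly robust to naive random noise, so actually defeating one would require targeted adversarial perturbations rather than the simple jitter shown here.

```python
import random

def jitter_pixels(pixels, strength=8, seed=0):
    """Add small random noise to each 8-bit channel value, clamped to [0, 255]."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [max(0, min(255, p + rng.randint(-strength, strength)))
            for p in pixels]

row = [128, 200, 64, 255, 0]   # stand-in for one row of image data
print(jitter_pixels(row))      # each value moves by at most ±8
```

The jitter is imperceptible to a human viewer at low strength; whether it meaningfully degrades a given matcher is an empirical question.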
  • on a social media platform, then you deserve whatever someone else might do with that photo. There's nothing else to discuss.

  • by Anonymous Coward
    Given that they're one of the biggest offenders for scraping public data: they scraped my name and IETF-related email address from public message lists, set up a LinkedIn profile "for me" and proceeded to regularly spam me with encouragement to finish my profile and engage with the professional community.
  • Wait... you mean Google, who has a search dedicated specifically for searching images scraped off the Internet, including reverse searches for the origin of images, is unhappy that someone else is scraping images off the Internet? They can go EABOD.
