Privacy The Courts AI Software Technology

Clearview AI Is Struggling To Address Complaints As Its Legal Issues Mount (buzzfeednews.com)

An anonymous reader quotes a report from BuzzFeed News: Clearview AI, the facial recognition company that claims to have amassed a database of more than 3 billion photos scraped from Facebook, YouTube, and millions of other websites, is scrambling to deal with calls for bans from advocacy groups and legal threats. These troubles come after news reports exposed its questionable data practices and misleading statements about working with law enforcement. Following stories published in the New York Times and BuzzFeed News, the Manhattan-based startup received cease-and-desist letters from Twitter and the New Jersey attorney general. It was also sued in Illinois in a case seeking class-action status.

Despite its legal woes, Clearview continues to contradict itself, according to documents obtained by BuzzFeed News that are inconsistent with what the company has told the public. In one example, the company, whose code of conduct states that law enforcement should only use its software for criminal investigations, encouraged officers to use it on their friends and family members.

In the aftermath of revelations about its technology, Clearview has tried to clean up its image by posting informational webpages, creating a blog, and trotting out surrogates for media interviews, including one in which an investor claimed Clearview was working with "over a thousand independent law enforcement agencies." Previously, Clearview had stated that the number was around 600. Clearview has also tried to allay concerns that its technology could be abused or used outside the scope of police investigations. In a code of conduct that the company published on its site earlier this month, it said its users should "only use the Services for law enforcement or security purposes that are authorized by their employer and conducted pursuant to their employment." It bolstered that idea with a blog post on Jan. 23, which stated, "While many people have advised us that a public version would be more profitable, we have rejected the idea."
"Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only," the post stated.

But in a November email, a company representative encouraged a police officer to use the software on himself and his acquaintances. "Have you tried taking a selfie with Clearview yet?" the email read. "It's the best way to quickly see the power of Clearview in real time. Try your friends or family. Or a celebrity like Joe Montana or George Clooney. Your Clearview account has unlimited searches. So feel free to run wild with your searches."

Comments Filter:
  • That's nice (Score:5, Insightful)

    by JustAnotherOldGuy ( 4145623 ) on Tuesday January 28, 2020 @09:00PM (#59666292) Journal

    That's nice, but it's really too late now, isn't it?

    They got all your pics and there's no way to un-take them. Now they're in a database that will be used for who knows what? Identity theft, fraud, impersonation, etc etc etc.

    It makes me glad that I didn't post pics of myself anywhere. :)

    I won't say "you did this to yourself", but you kinda did.

    • [Too late] they got all your pics and there's no way to un-take them

      But you can flood them with near matches. Use the 30-odd Snapchat filters to stuff the WebTubes with your image as a chorus girl, a pimp, a baby, a rapper, a geezer, a super-hero, a tiger, a miser, a rabbit, etc., then recombine each combo to be a baby rapper chorus girl rabbit-tiger, etc. That's more than a billion combos. Searching for you will then confuse the bots to high heck. Finally, a practical use for that silly party toy.
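
      For what it's worth, the "more than a billion" figure checks out as back-of-the-envelope arithmetic: treat each of roughly 30 filters as something you either layer on or skip, so every subset is a distinct composite. A quick sketch (the filter count is just the parent post's assumption):

      # rough check of the "billion combos" claim:
      # assume ~30 filters that can each be applied or skipped when layering a look
      filters = 30
      combos = 2 ** filters              # every subset of filters is a distinct composite
      print(f"{combos:,} composites")    # 1,073,741,824 -- just over a billion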

      • But you can flood them with near matches.

        I like the idea, but I suspect it'll only help train the facial recognition algorithms to be more precise.

        It's just giving them lots more samples to work on so as to get better at detecting that it's you.
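
        In machine-learning terms, those filtered copies look a lot like ordinary data augmentation, which is exactly how recognizers are trained to shrug off distortions. A rough sketch, assuming torchvision is installed (the image here is a blank stand-in, not real data):

        from PIL import Image
        import torchvision.transforms as T

        # distortions in the spirit of face filters: colour shifts, warps, flips
        augment = T.Compose([
            T.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),
            T.RandomHorizontalFlip(),
        ])

        face = Image.new("RGB", (128, 128))            # stand-in for a scraped photo
        variants = [augment(face) for _ in range(32)]  # 32 "filtered" training samples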

        • by gl4ss ( 559668 )

          it's giving them more samples but is it getting better at detecting that it's you or a rapper?

          anyway, they either can or they can't. on top of that, people could flood them with gdpr requests to get the data associated with you and ask them to delete it.

          there's also hundreds of thousands of fake facebook accounts that have just plain taken photos from elsewhere. their dataset is tainted from the beginning. furthermore I don't quite believe they have the algorithm going so well that it can detect a face from the whole 3 billion set anyhow.

          • I don't quite believe they have the algorithm going so well that it can detect a face from the whole 3 billion set anyhow.

            Yeah, you're totally right - no computer will ever be able to search a dataset of 3 billion items. Never been done, never will be done, it's impossible. No database could ever even hold so many rows, never! And lucky for us the technology will never, ever get better or faster, so rest easy, my friend.

            I hear that we're never going to go to the Moon, either. I mean, the very idea is crazy, amirite? Although scientists do say that "someday in the future" we'll be able to make a phone call to the other side of the world.
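
            Snark aside, the search itself is mundane: face matching is a nearest-neighbour lookup over fixed-size embedding vectors, a plain dot product already handles hundreds of thousands of them on a laptop, and production systems shard approximate-nearest-neighbour indexes to reach billions. A minimal sketch with random stand-in vectors (NumPy assumed; nothing here is Clearview's actual pipeline):

            import numpy as np

            rng = np.random.default_rng(0)
            db = rng.standard_normal((200_000, 128)).astype(np.float32)  # stand-in for stored face embeddings
            db /= np.linalg.norm(db, axis=1, keepdims=True)              # normalise once, up front

            query = db[42] + 0.05 * rng.standard_normal(128).astype(np.float32)  # a noisy "selfie" of person 42
            query /= np.linalg.norm(query)

            scores = db @ query                      # cosine similarity against every stored vector
            top5 = np.argsort(scores)[-5:][::-1]     # best candidate matches
            print(top5, scores[top5])                # index 42 should lead the list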

    • by AmiMoJo ( 196126 )

      Well, you can hit them with a GDPR request, or better still, if you are in the US, lobby your politicians to confiscate and destroy their database.

    • It makes me glad that I didn't post pics of myself anywhere. :) I won't say "you did this to yourself", but you kinda did.

      If only it were that simple. People take pictures of others all the time and upload them without even thinking about it, whether it's for work-related events, parties, or whatever. I kept a low profile on Facebook back when I used it, but that didn't stop family members from uploading photos and tagging me in them.

    • I've actually run my face through Clearview. What came up was Flickr pictures posted by others and a sense of violation.

      What I want to know is what the tech journalists are apparently too dumb to ask: do pictures sent to Clearview as queries get absorbed as part of the training data too?

  • Just file bankruptcy and work for the CIA under a fat contract to manage it all. They are the real users of that data anyhow.

  • would be to combine DNA databases with image databases.

    All the major DNA players like 23andMe or AncestryDNA probably would happily sell the data (it would be harder for Clearview to just steal it than Twitter or BookFace images).

  • Evil company (Score:4, Insightful)

    by registrations_suck ( 1075251 ) on Tuesday January 28, 2020 @10:09PM (#59666444)

    Needs to be put down, with all assets forfeited and all facial images deleted.

  • by AK Marc ( 707885 ) on Tuesday January 28, 2020 @10:20PM (#59666472)
    If the AI worked, wouldn't it have predicted this?
  • by misnohmer ( 1636461 ) on Tuesday January 28, 2020 @11:09PM (#59666610)

    Unless search engines are banned, there isn't much illegal happening. What they do is no different from what Google's search engine does for text information: scrape and index publicly posted information. Law enforcement agencies have been using Google, and people have been googling themselves or their friends, for a long time now - nothing new here.
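
    For the "scrape and index" comparison, the mechanics really are the same whether the documents are text or photo captions: fetch what is already public, tokenise it, and build an inverted index from term to page. A toy sketch with made-up URLs and text (nothing here is Google's or Clearview's actual code):

    from collections import defaultdict

    pages = {                                   # stand-ins for already-public pages
        "https://example.com/a": "public post about a reunion in Chicago",
        "https://example.com/b": "public photo caption tagging a friend at a party",
    }

    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)                # inverted index: term -> pages containing it

    print(sorted(index["public"]))              # every page mentioning "public"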

  • So if they scraped my selfie which I hold the copyright on and are using it for commercial purposes . . . ?
    • So if they scraped my selfie which I hold the copyright on and are using it for commercial purposes . . . ?

      You might have a case. You probably agreed to an EULA on the service you uploaded the photo to that grants them perpetual, royalty-free usage of that photo, but that wouldn't apply to a scraper. If they actually did scrape their collection in violation of EULAs, rather than buying access to all those photos, there's a class action with 3 billion members waiting.

      Odds are they paid for access though. You don't scrape 3 billion photos from anywhere without the site operators noticing. At which point how legal
