AI Businesses Privacy

Clearview AI's Facial Recognition Tool Coming To Apps, Schools (gizmodo.com)

An anonymous reader quotes a report from Reuters: Clearview AI is expanding sales of its facial recognition software from mainly serving the police to private companies, it told Reuters, inviting scrutiny of how the startup capitalizes on billions of photos it scrapes from social media profiles. [...] Clearview primarily helps police identify people through social media images, but that business is under threat due to regulatory investigations. A settlement with the American Civil Liberties Union bans Clearview from providing the social-media capability to corporate clients.

Instead of online photo comparisons, the new private-sector offering [called "Clearview Consent"] matches people to ID photos and other data that clients collect with subjects' permission. It is meant to verify identities for access to physical or digital spaces. Vaale, a Colombian app-based lending startup, said it was adopting Clearview to match selfies to user-uploaded ID photos. [...] Clearview AI CEO Hoan Ton-That said a U.S. company selling visitor management systems to schools had signed up as well. He said a customer's photo database is stored for as long as the customer wishes and is not shared with others, nor used to train Clearview's AI. But the face-matching that Clearview is selling to companies was trained on social media photos. The company said the diverse collection of public images reduces racial bias and other weaknesses that affect rival systems constrained by smaller datasets.
The company outlined its path forward in a press release Wednesday.

"Today, FRT is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payment," Clearview AI CEO Hoan Ton-That said in a statement. "Now, we are offering companies who use facial recognition as part of a consent-based workflow access to Clearview AI's superior, industry-leading FRT algorithm, bringing an increased level of security and protection to the marketplace."

He added: "Using facial recognition as a preventative measure means fewer crimes and fewer victims. Ultimately, Clearview Consent is all about making everyday consumers feel more secure in a world that is rife with crime and fraud."
This discussion has been archived. No new comments can be posted.

  • Title correction: (Score:4, Insightful)

    by Sebby ( 238625 ) on Wednesday May 25, 2022 @05:13PM (#62566266)

    Privacy Rapists' Facial Recognition Tool Coming To Apps, Schools

    There FTFY.

    • But didn't you hear? It helps with racial bias, which means they're going to get all the poors, whether you're Black, white, Latino, or Asian. They want to herd ALL of the poors so the betters will know no Fight Club stuff is happening. 1984 and Brave New World, etc., weren't supposed to be training materials.
    • by AmiMoJo ( 196126 )

      They were actually just fined this week in the UK for privacy violations, with a requirement that they delete all data relating to UK citizens.

  • The only criminal (Score:4, Insightful)

    by NotEmmanuelGoldstein ( 6423622 ) on Wednesday May 25, 2022 @06:10PM (#62566356)

    ... part of a consent-based workflow ...

    How much consent is there, when the alternative is no home, no travel and no shopping? Soon, the only criminal will be your government.

    ... everyday consumers feel more secure ...

    A government-mandated identity prevents identity theft, and geo-fencing allows identification of probable criminals after the fact. None of this prevents realtors from submitting fake loan applications, US colleges running bait-and-switch student loans, or price bubbles in cryptocurrency and financial derivatives (options on options). In short, greed, and the corporations inciting it, can still destroy the economy and people's futures.

    • What? You thought that the constant surveillance was supposed to catch important criminals? Hah! No, peon. It's to keep you in line and on tap for them to milk when they feel like it.

      Good grief. Next the peons will be claiming that government is supposed to work for and protect them and not us. Hahahahaha!
  • If a technology exists that makes the tireless job of cops and corporations a nonzero bit easier, it will be exploited.

    Attempts to hamstring this exploitation will be met with roadblocks and work-arounds.

  • Every comment in any article about the TSA was about how the TSA was going to expand to malls, trains, and schools. A few of the six-digit-UID commenters said no, that is never going to happen.

    Folks, Clearview is the privatized version of the TSA. And it's in your malls, schools, trains, and stadiums, just as low-UID Slashdotters predicted. The chilling part is that we have absolutely no say over this.

  • Much of your cellphone (and possibly web) activity is tracked, and automatic license plate readers detect where you drive. There's a clear credit and debit card record of everything you buy. Very soon now, facial recognition will track almost every building you enter and where you walk down the block.

  • So they're going to spend a metric butt-ton of cash for a system that's little more than techno-voodoo because b.s. reasons.

    I love tech, and facial recognition is rather cool, but don't fool yourself, despite what marketing tries to tell you, it utterly sucks outside the lab. Actually it's not that great in the labs yet either, but nobody is buying it for a laboratory.

    False positives, false negatives, and a system whose parameters are either so tight that they increase one or so loose that they increase the other (see the threshold sketch after this thread).
    • ... And we haven't even brought up the massive biases due to faulty AI creating race-based issues.

      Didn't you RTFA? They claimed that because they scraped every photo they could from Facebook, it's not racially biased AI... /sarcasm
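
An aside on the thread above: the false-positive/false-negative tension the commenter describes comes down to where the match threshold is set. The following is a minimal, hypothetical sketch in Python (toy NumPy score distributions standing in for real face-embedding comparisons; nothing here is Clearview's actual algorithm, data, or API) showing how moving a single similarity threshold trades false rejects against false accepts.

```python
# Hypothetical illustration only: toy similarity-score distributions stand in
# for real face-embedding comparisons. Not Clearview's algorithm or data.
import numpy as np

def error_rates(genuine_scores, impostor_scores, threshold):
    """False reject rate = genuine (same-person) pairs scoring below the
    threshold; false accept rate = impostor pairs scoring at or above it."""
    frr = float(np.mean(genuine_scores < threshold))
    far = float(np.mean(impostor_scores >= threshold))
    return frr, far

rng = np.random.default_rng(0)
# Toy assumption: same-person comparisons score higher on average (0.80)
# than different-person comparisons (0.45), with some overlap.
genuine = rng.normal(0.80, 0.08, 10_000)
impostor = rng.normal(0.45, 0.12, 10_000)

for threshold in (0.55, 0.65, 0.75):
    frr, far = error_rates(genuine, impostor, threshold)
    print(f"threshold={threshold:.2f}  false_reject={frr:.3f}  false_accept={far:.3f}")
```

Tightening the threshold drives false accepts toward zero while pushing up false rejects, and loosening it does the reverse; the overlap between the two score distributions sets the floor on how good both numbers can be at once.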

  • by rantrantrant ( 4753443 ) on Thursday May 26, 2022 @04:30AM (#62567176)
    Now imagine if schools & districts spent that money on teachers, teacher professional development programmes (increasing the quality of teaching & the time teachers stay in the job - experienced teachers usually get better results), books, repairing buildings, & clearing out all the old unused laptops, tablets, smartboards, etc. We've known for decades that EdTech has little to no impact on learning outcomes & some uses have negative impacts. The OECD found a general inverse correlation between ICT use & academic outcomes across all the countries it studied. Yes, there is potential for academic benefits with some particular applications of ICT, but that's not what is happening in our education systems. Adding expensive surveillance systems to education budgets when they desperately need better teaching is just madness & probably corrupt.
  • "Today, FRT is used to unlock your phone, verify your identity, board an airplane, access a building, and even for payment"

    ... but used by what percentage of people, relatively speaking? And how many of them actually choose to use it?

    I don't use FRT to unlock my phone or verify my identity, ever - I don't trust the technology, and philosophically I don't want to rely heavily on something I can't easily replace if it's compromised (i.e., it's a lot easier to get a new plastic card with a different EMV chip than a new phone, or a new face). I can't answer the "board an airplane" question, but that's its own relatively uncommon subset.

    • Anytime I've been asked for any biometric data, I always respond with "no thank you," even when it makes no sense based upon what they just said. I've had airlines, hospitals, and hairdressers asking me to use biometrics to "make things easier."
      How is facial recognition easier for boarding a plane than scanning a barcode as you walk onto the jetway?
      Why would I give a hospital my palm print to identify me when they still scan the barcode around my wrist?
      Wtf does my hairdresser need any biometrics for, ever?

