
FTC Settlement With Ever Orders Data and AIs Deleted After Facial Recognition Pivot

The maker of a defunct cloud photo storage app that pivoted to selling facial recognition services has been ordered to delete user data and any algorithms trained on it, under the terms of an FTC settlement. TechCrunch reports: The regulator investigated complaints the Ever app -- which gained earlier notoriety for using dark patterns to spam users' contacts -- had applied facial recognition to users' photographs without properly informing them what it was doing with their selfies. Under the proposed settlement, Ever must delete photos and videos of users who deactivated their accounts and also delete all face embeddings (i.e. data related to facial features which can be used for facial recognition purposes) that it derived from photos of users who did not give express consent to such a use. Moreover, it must delete any facial recognition models or algorithms developed with users' photos or videos.
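
A note on terminology: a face embedding is a numeric vector derived from a photo, and deleting the photo does not delete the vector. Below is a minimal sketch of how such embeddings are produced, using the open-source face_recognition library; the library choice and file name are illustrative assumptions, since Ever's actual pipeline is not public.

    # Illustrative sketch only: Ever's real pipeline is not public, and the
    # open-source face_recognition library used here is an assumption.
    import face_recognition

    # Load a user photo (hypothetical file name) and locate any faces in it.
    image = face_recognition.load_image_file("user_photo.jpg")
    locations = face_recognition.face_locations(image)

    # Each embedding is a 128-dimensional vector summarizing facial features.
    # Deleting the source photo does not delete these derived vectors, which
    # is why the settlement orders embeddings and models deleted separately.
    embeddings = face_recognition.face_encodings(image, locations)
    print([len(vec) for vec in embeddings])  # each vector has length 128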

This full suite of deletion requirements -- not just the data but anything derived from or trained on it -- is causing great excitement in legal and tech policy circles, with experts suggesting it could have implications for other facial recognition software trained on data that wasn't lawfully processed. Or, to put it another way, tech giants that surreptitiously harvest data to train AIs could find their algorithms in hot water with the US regulator.

  • The reason they got in trouble isn't that they used user data to train their facial recognition AI; it's that they did it without burying a clause in their EULA allowing it.

    Does anyone think that, even now, someone who read the Facebook EULA wouldn't find some vague clause authorizing Facebook to do whatever it wants with their data for "business purposes"?

    This kind of thing will only affect small companies with bad lawyers.
    • PROTIP: Contracts do not override the law.
      I know corporations disagree. But we don't care.

      • PROTIP: Contracts do not override the law. I know corporations disagree. But we don't care.

        To the parent's point about smaller/weaker corporations: larger corporations disagree and don't care for a reason. They already know breaking the law is worth it for them, because of powerful lawyers and lobbyists.

        This action does nothing but usher in the era of mega-corps running it all. Smaller businesses will crumble under the rule of law and regulation. Mega-corps will simply buy their regulation. Greed N. Corruption can be read like a book.

      • Yeah, but law is generally barred from being retroactive -- you can't make any action become criminal "ex post facto". So the contracts suffice for anything prior to us putting laws on the books about facial recognition. If we try to criminalize it today, we'll run into all sorts of arguments about depriving people of property/means without just compensation, and I honestly can't begin to guess how those cases would play out. It would take some serious legal jockeying to ban facial recognition at this point.

    • In Europe the GDPR (and the privacy laws that came before it) won't allow you to use data for purposes that you didn't collect it for in the first place. This is called "purpose limitation". It forces organisations to be clear about what goal they collect data for, precisely to avoid a "grab now, expand later" mentality.

      I'm not sure if you can circumvent this by putting "we may expand use cases later" in your privacy policy. I suspect you'll have to be specific. But IANAL.

  • If you break the law, you get slapped.
    I know this is a new concept to you.
    Enjoy!

  • by rossdee ( 243626 ) on Wednesday January 13, 2021 @08:27PM (#60940818)

    Incomprehensible to me.

    • by Luthair ( 847766 )
      Yea, I was going to ask if it was just me.
    • Companies naming themselves after common nouns and verbs seems to be an epidemic these days. Use some imagination instead of the copy-and-paste drivel that has crept into company names and, sadly, into programming.
    • FTC Settlement With Ever Orders Data and AIs Deleted After Facial Recognition Pivot

      FTC, we know what that is
      Settlement, we know what that is too
      Ever must be a company because FTC
      We know what an order is
      We know what data and AIs are
      We know what deletion is
      We know what facial recognition is
      We know that pivoting is changing direction and/or policy in this sort of context

      What word did you find confusing?

    • by AmiMoJo ( 196126 )

      It's because they gave their stupid app the name "Ever", and then in a fit of irony shut it down after a short time.

      Some quote marks would have helped. This is an important story and very good news: you can't bait-and-switch people into handing you a massive free database of their photos to train your (presumably shit) AI.

  • This is fine and dandy, but how in the hell is the FTC going to validate compliance? I seriously doubt that just wiping Ever's servers is going to do any good. I'm thinking that the algorithms Ever created with stolen data are going to be around for-Ever.
    • by tlhIngan ( 30335 )

      Basically the order just makes the data and the AI toxic for use.

      It's effectively a big kiss of death for them because you can't prove a negative - how do they prove they deleted the data?

      If they develop a new facial recognition product, the onus is on them to prove that none of the data they were ordered to destroy was used in that product, which is basically impossible. They will need to keep extremely good logs on this; a sketch of what such provenance logging might look like follows below.

      In fact, about the only way is to literally burn the company to the ground by destroying everything.
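
      To make the parent's point about logging concrete, here is a hypothetical sketch of training-data provenance logging: fingerprint every input file before a training run, so a later audit can show which data a model did, or did not, see. The file names, paths, and manifest layout are illustrative assumptions, not any real FTC requirement.

        # Hypothetical provenance log: hash every training input up front so a
        # later audit can attest which data a given model run used. Paths and
        # layout here are illustrative assumptions, not any real FTC standard.
        import hashlib
        import json
        from pathlib import Path

        def fingerprint(path: Path) -> str:
            """SHA-256 of a file's contents, hashed in chunks to bound memory."""
            digest = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(8192), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def write_manifest(data_dir: str, run_id: str) -> None:
            """Record the hash of every input before the training run starts."""
            manifest = {
                "run_id": run_id,
                "inputs": {p.name: fingerprint(p)
                           for p in sorted(Path(data_dir).glob("*.jpg"))},
            }
            Path(f"{run_id}_manifest.json").write_text(json.dumps(manifest, indent=2))

        write_manifest("training_data", "run_2021_01")  # hypothetical paths

      An auditor could then compare a run's manifest against hashes of the data ordered destroyed; no overlap is still not proof, but it beats having no records at all.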

  • These AIs become inscrutable black boxes to us meatbags quite rapidly. How would anyone determine whether illegally/unethically collected data was used for training, unless there is some sort of "evil bit"? Hey, that sounds like an idea, I should write an RFC...

    But seriously, is there any way to reverse-engineer an AI's training set? If not, then the only way I can see that data ever truly going away is when the line(s) of AI that were trained on it are allowed to "go extinct", as they cease to be useful.
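
    One partial answer to the question above: researchers probe this with "membership inference" attacks, which exploit the fact that models tend to show lower loss on examples they memorized during training. A toy sketch in PyTorch follows; the model interface and the fixed threshold are illustrative assumptions, and real attacks calibrate the threshold using shadow models.

      # Toy "membership inference" sketch: training members tend to get lower
      # loss than unseen examples. The threshold value is an illustrative
      # assumption; real attacks calibrate it with shadow models.
      import torch
      import torch.nn.functional as F

      def likely_training_member(model: torch.nn.Module,
                                 example: torch.Tensor,
                                 label: int,
                                 threshold: float = 0.5) -> bool:
          """Flag an example whose loss is suspiciously low for this model."""
          model.eval()
          with torch.no_grad():
              logits = model(example.unsqueeze(0))   # add a batch dimension
              loss = F.cross_entropy(logits, torch.tensor([label]))
          return loss.item() < threshold

    The result is probabilistic at best, which is the point: without good records, you only ever get statistical hints that illicit data went into a model, never proof.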
