Millions of People Uploaded Photos To the Ever App. Then the Company Used Them To Develop Facial Recognition Tools. (nbcnews.com)

An anonymous reader shares a report: "Make memories": That's the slogan on the website for the photo storage app Ever, accompanied by a cursive logo and an example album titled "Weekend with Grandpa." Everything about Ever's branding is warm and fuzzy, about sharing your "best moments" while freeing up space on your phone. What isn't obvious on Ever's website or app -- except for a brief reference that was added to the privacy policy after NBC News reached out to the company in April -- is that the photos people share are used to train the company's facial recognition system, and that Ever then offers to sell that technology to private companies, law enforcement and the military. In other words, what began in 2013 as another cloud storage app has pivoted toward a far more lucrative business known as Ever AI -- without telling the app's millions of users.
  • Lately there seem to be lots of these companies, all with millions of users, which I’ve never even heard of. Guess maybe I’m getting old and out of the loop.

    • Too bad you can't change your username.

      • Actually it's worse than that: his UID suggests he joined circa 2002, meaning he owned a 9 year old el-cheapo station wagon at the time that he was proud enough of to make his username :)

        • Actually it's worse than that: his UID suggests he joined circa 2002, meaning he owned a 9 year old el-cheapo station wagon at the time that he was proud enough of to make his username :)

          Don’t know if “proud” is the right word, but I was definitely looking for an unusual username and did indeed own a 1993 Ford Escort GL Wagon. Actually I still owned it up until the tail end of last year.

          It was an el cheapo car, for sure, but it was quite reliable.

    • "Popular" comes from "people", and 90% of the population is hopelessly stupid. Anything wildly popular is suspicious to the thinking man. Not instantly flocking to the latest fad and realizing later that you didn't really miss a thing is called being wise.

      • ...and 90% of the population is hopelessly stupid.

Which is why I was about to say, "caveat emptor," but that really doesn't apply to this situation, by any stretch.

    • Lately there seem to be lots of these companies, all with millions of users, which I’ve never even heard of. Guess maybe I’m getting old and out of the loop.

      Yeah, me too. And I wouldn't have it any other way. The more out of the loop I am, the happier and more peaceful my life seems to be.

  • I tried to find 'Ever' the company to check out their terms and conditions, but 'ever' is a fairly common word.

    Oh well, since I can't find the company, I might as well check a word frequency list. Wow! 'Ever' is the 319th most common English word [duboislc.org]! More popular than that company.
  • by 110010001000 ( 697113 ) on Thursday May 09, 2019 @04:01PM (#58565888) Homepage Journal

    So what? They aren't your photos. You gave them to the company. For free.

    • So what? They aren't your photos. You gave them to the company. For free.

      Yep, and people just can't seem to grasp that no matter how many times stuff like this happens.

  • by Anonymous Coward

    Your PC and phone are spying on you, maybe your TV, too, and your car; all that information is used against you by corporations, by political parties, by our government and by the governments of hostile foreign countries.

    Nothing good is coming out of Silicon Valley anymore. Greed and more greed and we won't fully understand the damage we're doing for another generation or two.

  • by misnohmer ( 1636461 ) on Thursday May 09, 2019 @04:07PM (#58565916)

    Any time you give your data to another company to store or process, you should foresee that the company may use the data in ways you would not prefer. Damages are usually limited to up to a full refund for the services paid, which for free services means you can claim a whopping $0.00.

    • by Anonymous Coward

      I always tell this to people, but they never listen. "How am I going to share things with my friends and family if not on social media?" As if we never found ways to do that before social media was even a thing.

    • And in these cases it's other people's data (their faces).

      I do think there's a reasonable expectation of privacy at Aunt Mildred's birthday party.

      Sure, a few years ago a few people in the photo-processing chain would see a picture of you at the party, but that's it outside of family and friends. But now your stupid cousin Ralph uploads a picture to some cloud app and all of a sudden thousands of strangers are seeing you at the party and using your face to train neural nets for killer robots.

      That's one hel

  • When you expect someone to keep your data for free without any compensation. When you tell someone gossip but tell them it is secret.
  • "Grandpa" is the name of my dog [petpaw.com.au]. Enjoy your training data.

  • by theCat ( 36907 )

    And what are 23andMe and Ancestry and whatnot really doing with our DNA?

  • Reading the article and the link to terms of service, Ever is not doing anything they said they wouldn't.

    Whether it's Jennifer Lawrence's hacked nude selfies or this, the lesson is the same: if you don't want somebody to use your images, don't store them in the cloud.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Sounds like you're missing the fact that they added it to the terms of service only after a news agency questioned them about it. In other words, their terms of service misled people for years. The takeaway is that not even terms of service can be trusted, and companies will lie about anything and everything with no punishment and no recourse for you.

      So smartphones are unusable and any desktop software needs to be OSS and compiled by you. You also need to write your own browser, set up a DNS server,

  • Just a shakedown (Score:5, Insightful)

    by Gavagai80 ( 1275204 ) on Thursday May 09, 2019 @04:32PM (#58566040) Homepage

    This is like getting a big 20th century photo album out of your closet, handing it to someone and asking them to re-organize the photos for you... and then complaining when you find out they actually looked at the photos and remember some of them. Perhaps they even *gasp* learned things about the world from your vacation photos that help them. Time to demand they either pay up or go into the chair for a memory wipe!

    If the company were redistributing the photos to someone else, there'd be a problem. They aren't, so there's no copyright law involved even if you didn't give them a license. The faux-outrage is presumably just somebody's attempt to cash in: a company made money, so now everybody wants a cut. Lawyers are hoping to turn the right to privately use things people have given you in ways that don't affect them at all into some new form of intellectual property that makes copyrights and trademarks seem permissive.

    • by Anonymous Coward

      Let's save our outrage for the companies that are actually selling our data to third parties or using our data to target us with ads. This is about as close to doing nothing with your data as you could hope for when using a free service.

    • by noodler ( 724788 )

      This is like getting a big 20th century photo album out of your closet, handing it to someone asking them to re-organize the photos for you... and then complaining when you find out they actually looked at the photos and remember some of them.

      Not really. It's more like trusting a company with your photos, but then the company makes a copy of your photos (of everyone's photos, actually) and uses it to make money for themselves without telling you or anybody else. Meanwhile, that use leads to the development of a product that you don't get paid for and that has a distinct risk of being used against you.

  • by the_skywise ( 189793 ) on Thursday May 09, 2019 @04:44PM (#58566080)
    How did you think they were paying for and maintaining all that wonderful storage? That they were some sort of benevolent non-profit charity? Making it up in volume? (see the bank that only makes change skit)
    The automatic assumption for anything you store in the cloud is that it will be indexed AND looked at by other eyes (both real and technological), even if the EULA doesn't point that out. You have a reasonable expectation of privacy for things stored in the cloud (that your data won't be resold or leaked to other users), but not that the data won't be compiled and used in other ways as part of statistical analysis. So long as no private data was leaked or sold to other parties, I don't think the company is in the wrong here making algorithms.
    And if you're freaked out about this - what do you think the credit card companies are doing with your financial records?
  • by fred911 ( 83970 ) on Thursday May 09, 2019 @05:00PM (#58566168) Journal

    Just because a service doesn't require you to pay doesn't mean the TOS doesn't explain the deal. Apparently people, for some unknown reason, provided biometric data to some company and gave them permission to use that data.

    I don't hear anyone saying the company broke the deal.

    See title

  • by Anonymous Coward

    Back in the day, when you dropped off your camera film for the 2-hour (or one-day) film processing, there were people (i.e., the machine operator) looking at the photos you took. Yup, if you took pictures of your naked girlfriend, they saw it. And you know what else? If your naked girlfriend was hot enough, an extra bunch of prints were made "for future reference". He, he.

    • by Pyramid ( 57001 )

      And if the owner caught you doing that, they'd fire you on the spot. ...and keep the photos for themselves.

  • by Anonymous Coward

    Just about everything you do these days is used to train neural nets n such. Games can learn strategy, stores learn shopping habits, politicians learn key hot topics n voting habits, /. learns trolling, etc, etc. In a surreal sense, we're not just optimizing the NNs, but the entire human race, and the data providers (us) will ultimately lose out in both the short and long term.

  • All large companies have face recognition software and use it on user photos. They also sell the technology to law enforcement.
  • So Ever appropriated other people's private data (they didn't agree to provide it for AI training) to create a commercial product. Seems like they owe all those people a major cut of their revenue stream. Say, $2 disbursed among the users for every $1 pocketed by the company. A lawsuit seems in order.

  • by Anonymous Coward

    And sometimes you're the product even when you do pay. We would all do well to remember that.

  • This is why (Score:4, Informative)

    by Gonoff ( 88518 ) on Friday May 10, 2019 @03:17AM (#58567726)

    You need the GDPR. It simply outlaws any change in the use of my data without my being asked first. They would need to ask every single person whose data is used.

    But don't worry. Your rulers are even less keen than your politicians for this to happen...

  • by daveime ( 1253762 ) on Friday May 10, 2019 @06:16AM (#58568046)
    Skynet gains the ability to recognize some twat grinning inanely over a spaghetti carbonara.
  • you are the product.
