Privacy Databases

Vending Machine Error Reveals Secret Face Image Database of College Students (arstechnica.com) 100

Ashley Belanger reports via Ars Technica: Canada-based University of Waterloo is racing to remove M&M-branded smart vending machines from campus after outraged students discovered the machines were covertly collecting facial-recognition data without their consent. The scandal started when a student using the alias SquidKid47 posted an image on Reddit showing a campus vending machine error message, "Invenda.Vending.FacialRecognitionApp.exe," displayed after the machine failed to launch a facial recognition application that nobody expected to be part of the process of using a vending machine. "Hey, so why do the stupid M&M machines have facial recognition?" SquidKid47 pondered. The Reddit post sparked an investigation from a fourth-year student named River Stanley, who was writing for a university publication called MathNEWS. [...]

MathNEWS' investigation tracked down responses from companies responsible for smart vending machines on the University of Waterloo's campus. Adaria Vending Services told MathNEWS that "what's most important to understand is that the machines do not take or store any photos or images, and an individual person cannot be identified using the technology in the machines. The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface -- never taking or storing images of customers." According to Adaria and Invenda, students shouldn't worry about data privacy because the vending machines are "fully compliant" with the world's toughest data privacy law, the European Union's General Data Protection Regulation (GDPR). "These machines are fully GDPR compliant and are in use in many facilities across North America," Adaria's statement said. "At the University of Waterloo, Adaria manages last mile fulfillment services -- we handle restocking and logistics for the snack vending machines. Adaria does not collect any data about its users and does not have any access to identify users of these M&M vending machines." [...]

But University of Waterloo students like Stanley now question Invenda's "commitment to transparency" in North American markets, especially since the company is seemingly openly violating Canadian privacy law, Stanley told CTV News. On Reddit, while some students joked that SquidKid47's face "crashed" the machine, others asked if "any pre-law students wanna start up a class-action lawsuit?" One commenter summed up students' frustration by typing in all caps, "I HATE THESE MACHINES! I HATE THESE MACHINES! I HATE THESE MACHINES!"


Comments Filter:
  • Proof? (Score:2, Interesting)

    by bjoast ( 1310293 )
    Is there any proof at all that this "secret database" of face images actually exists?
    • Re:Proof? (Score:5, Interesting)

      by Njovich ( 553857 ) on Saturday February 24, 2024 @06:38AM (#64265042)

      Who is talking about a database of images? Apparently they collect 'demographics' (age/gender/etc), and unless they have found some new amazing tech, they are now lying about 'not taking pictures'. They say they are GDPR compliant, but not being forthcoming about what data they collect does not bode well for them in interactions with authorities.

      The whole thing is an advertising platform. They want to display ads on their vending machines. If they have data on impressions and demographics, they make more money.

      So, a lot of outdoor advertisers use cameras to detect impressions and demographics. Some of them got into trouble with GDPR. The problem is that if they have timestamped info (even if just locally) along with this demographic data plus the location of the device, plus the sales data that they collect too... in a public space or a private space that did not inform people of this data collection... they are going to have a tough time explaining how that is not PII and where people consented to this collection.

      The first thing to explain is why they have a hidden camera on their device and how they manage to get data from it without using images.

      • Re:Proof? (Score:5, Insightful)

        by bjoast ( 1310293 ) on Saturday February 24, 2024 @07:07AM (#64265068)

        Who is talking about a database of images?

        Only the featured story. It's in the title. You should check it out.

        • by Njovich ( 553857 )

          Oh, I just read the article and not the title. It's amazing that Ars Technica mentions that in the title but nowhere in the article. Can't wait for so-called 'journalists' to be replaced by AI.

          • by znrt ( 2424692 )

            What does journalism have to do with Ars Technica?

            Read the student's publication; it has some actual informative rigor. There isn't a "face" database, just a database with data derived from faces. Kinda makes sense, but of course all this is going by the providers' statements; I don't think anyone has properly audited these machines (yet).

            https://mathnews.uwaterloo.ca/... [uwaterloo.ca]

            • >Read the student's publication; it has some actual informative rigor.

              Having been published in MathNews, it could have been silly... but for anything serious I'd pretty much have expected them to take the machine apart and reverse-engineer the chips.

              You do NOT mess with young IT geeks trying to prove to themselves and the world how clever they are, especially when they have a lot of time on their hands.

          • by Anonymous Coward

            Oh, I just read the article and not the title. It's amazing that Ars Technica mentions that in the title but nowhere in the article. Can't wait for so-called 'journalists' to be replaced by AI.

            Journalists write the article. Editors write the title. That leads to these annoying problems where the editor tries to sex up the title without fully comprehending the article. So your complaint is with journalism, but not specifically 'journalists'.

          • by Askmum ( 1038780 )
            You read the article first? You read all articles? Or how do you determine which article to read? Most people do that by reading the title. That's what clickbait is for!
      • >

        So, a lot of outdoor advertisers use cameras to detect impressions and demographics. Some of them got into trouble with GDPR. The problem is that if they have timestamped info (even if just locally) along with this demographic data plus the location of the device, plus the sales data that they collect too... in a public space or a private space that did not inform people of this data collection... they are going to have a tough time explaining how that is not PII and where people consented to this collection.

        I don't think GDPR ever even enters their minds. These greedy bozos just see a 'business opportunity' and feel entitled to conduct unrestrained gathering of data on the general public for the purpose of enriching themselves. GDPR is something that jumps up and bites them in the balls because, in their fit of greed, they failed to think more than one step ahead. The original poster got modded down, but that does not make him/her wrong. The level and sophistication of data scraping and monitoring of the general public has become staggering.

        • >

          So, a lot of outdoor advertisers use cameras to detect impressions and demographics. Some of them got into trouble with GDPR. The problem is that if they have timestamped info (even if just locally) along with this demographic data plus the location of the device, plus the sales data that they collect too... in a public space or a private space that did not inform people of this data collection... they are going to have a tough time explaining how that is not PII and where people consented to this collection.

          I don't think GDPR ever even enters their minds. These greedy bozos just see a 'business opportunity' and feel entitled to conduct unrestrained gathering of data on the general public for the purpose of enriching themselves.

          It's even worse when you consider that the largest companies knowingly perform highly illegal capture of data, having already calculated that getting caught and paying “fines” STILL makes the illegal activity worth it, which basically encourages bad behavior. Every time.

          When the moral compass is that far fucked, it won’t be long before the activity becomes grossly criminal for profit. They can get away with anything. Society’s apathy is deadly.

      • They claim they don't collect data. Perhaps they are collecting metadata about the users. Metadata isn't data, so that's fine.
      • Re:Proof? (Score:4, Interesting)

        by JBMcB ( 73720 ) on Saturday February 24, 2024 @10:11AM (#64265298)

        Who is talking about a database of images? Apparently they collect 'demographics' (age/gender/etc), and unless they have found some new amazing tech, they are now lying about 'not taking pictures'.

        The amazing new technology is OpenCV and it's been around for quite a while:

        https://python.plainenglish.io... [plainenglish.io]

        And, no, it doesn't need to store pictures to analyze them.
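
        For illustration, a minimal sketch of the kind of processing being described: detecting faces in live frames that exist only in memory and are never written to disk. It assumes the opencv-python package and its bundled Haar cascade; it is emphatically not Invenda's actual code.

        import cv2

        # Pretrained face detector that ships with OpenCV
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        cap = cv2.VideoCapture(0)  # default camera
        while True:
            ok, frame = cap.read()  # the frame lives only in RAM
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces):
                pass  # e.g. wake the purchasing interface
            # nothing is saved; the frame is overwritten on the next iteration
        cap.release()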

        • by Njovich ( 553857 )

          A bit of a strawman, as I didn't use the word 'store' anywhere. You are arguing against something I didn't say. They explicitly differentiate between taking and storing images and say they do neither. I have used OpenCV many times, and as far as I'm aware you need to take a picture to analyze it with software like OpenCV. Have you managed to analyze pictures with OpenCV without taking the image data?

          • I think there's a fair distinction between "taking a photo/picture" and processing a video stream in real-time.

            • by Njovich ( 553857 )

              I actually said image, not photo. They explicitly distinguish between images and photos too: "what's most important to understand is that the machines do not take or store any photos or images"

              You can argue that a video feed like the one OpenCV processes is not a sequence of images, I guess? I would disagree, but you could. It would still make their statement misleading at the least.

      • They say they are GDPR compliant, but not being forthcoming about what data they collect does not bode well for them in interactions with authorities.

        And they say that while standing in Canada (outside of GDPR jurisdiction). It's a poor excuse. Every kind of biometric data needs explicit consent under the GDPR. My colleagues at work had to design a consent form and collect paper signatures just to test their new smartwatch invention (it collects some sort of signal from the skin), even though there is no way to recognize a person from the data. It's still biometrics. For any collection of data, one needs to explain the purpose, the duration of the database, the name of the data controller, and so on.

        • Every kind of biometric data needs explicit consent under the GDPR.

          No, it depends on what you do with it. If you analyse real-time data for non-biometric related things it doesn't qualify under the GDPR. If someone is capturing and storing facial images, or assigning metadata with timestamp and storing it, that would be a GDPR violation. If someone has a camera and the image goes through a facial recognition software purely to identify if it is a face, and never stores any information then that isn't "biometric data".

          You don't need to collect and store data in order to trigger the purchasing interface.

      • by gweihir ( 88907 )

        Age/gender/etc. is already illegal to collect under the GDPR without informed consent. They would have to cordon these machines off and post warning signs.

        • by Bongo ( 13261 )

          Consent isn't even a good basis to use here, as it must be freely given. What if you just want to buy a drink without giving your PII away?

          • by gweihir ( 88907 )

            Indeed. I am not quite sure about the legal situation, but since the PII is not required for selling the goods, they may be required to offer a sale without it as an alternative. I.e., they can ask for the face, but if the person declines, they would still have to sell without recording anything, and at the same price.

        • Age/gender/etc. is already illegal to collect under the GDPR without informed consent. They would have to cordon these machines off and post warning signs.

          No they wouldn't. These machines don't collect any data. That is the GP not realising TFA is talking about two different things. The machine that was collecting data in shopping malls was never claimed to be GDPR compliant.

          • by gweihir ( 88907 )

            So the other machine does recognition of "a face" without collecting or processing data? That would be quite a feat and well beyond what modern IT can do.

            There is a way out, though: the machine would not be GDPR compliant, but a careful formal GDPR impact evaluation could find that there was no significant privacy impact. That would mean that while the machine is not compliant, it would not need to be brought into compliance. Of course, a court could then find at any time that this evaluation is not enough and order changes anyway.

      • Who is talking about a database of images? Apparently they collect 'demographics' (age/gender/etc), and unless they have found some new amazing tech, they are now lying about...

        If they can accurately identify someone’s gender without even hearing a single pronoun, then yeah. I’d say they’re hiding some amazing tech.

        Sure, you could do that, but you wouldn’t dare try.

      • The answer seems simple. Q: Are these machines connected to the Internet?
        If Yes, then EVIL ensues.
        If No, then More Questions.
        • by madbrain ( 11432 )

          The data could be stored locally, retrieved later by the vending machine servicing company, and then used nefariously.

    • The article mixes a known violation with unknowns about these vending machines. That said, the vending machine provider does admit to storing estimated age and gender statistics.
    • by sjames ( 1099 )

      The explanation that they just have it recognize the presence of any face rather than a particular face MIGHT be true, but there have been so many blatant lies by so many companies (and police departments) around this sort of thing, nobody is willing to trust that claim.

      In the same sense that it is possible that the guy on the street corner actually just needs some cash and is willing to give you a good deal on a quality watch he has in order to get it.

      One reason why companies intending to behave honestly are no longer believed.

    • Missing the point. (Score:5, Informative)

      by Gravis Zero ( 934156 ) on Saturday February 24, 2024 @10:28AM (#64265312)

      The explicit purpose of facial recognition is so that you do NOT have to keep images. Instead, you process the image into a smaller set of data that can identify individuals. Processing this data further requires sophisticated software that a company doesn't want exposed (other companies could buy a machine and copy it), so it's going to be done via their "cloud services". This means the machine is sending the facial recognition data, without the consent of the user, back to the company, which is a very clear violation of the GDPR.
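
      To make "a smaller set of data that can identify individuals" concrete, here is a hedged sketch using the open-source face_recognition package; Invenda's actual pipeline is not public, and the file name is hypothetical.

      import face_recognition

      image = face_recognition.load_image_file("shopper.jpg")  # hypothetical input
      # Each encoding is a 128-number vector derived from one detected face.
      encodings = face_recognition.face_encodings(image)

      for vec in encodings:
          # The photo can now be discarded: this compact vector is what would
          # be stored or uploaded, and it suffices to match the same face later.
          print(len(vec), vec[:5])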

      Their own site reads that owners have a lot of control of how these machines operate: https://www.invendagroup.com/i... [invendagroup.com]

      No more wasting resources on machine visits. Connect all your devices under one platform and rest easy knowing what's happening with them 24/7. Enjoy the convenience of executing remote commands in real time, with an abundance of data at your disposal. You'll never run a tighter fleet.

      What really needs to be done here is a full autopsy of the vending machines' command set and their communications to the Invenda "service cloud".

    • by Himmy32 ( 650060 )

      Searching the vendor's site, their main page advertises these features:

      Your business decisions should never feel like a shot in the dark. Optimize your sales and operations thanks to comprehensive, real-time data about consumer behavior patterns, transactions and machine health.

      Drive brand awareness and encourage sales by running advertisements directly at the POS. Get additional revenue by integrating vending machine screens with programmatic advertising platforms.

      Our IVM is a slick device with a large 49-inch touchscreen certain to attract consumers. The machine comes with a “brain” – Invenda OS – and is connected to the Invenda Cloud, which allows you to manage it remotely and gather valuable environmental, consumer and transactional data. The device can be branded according to your requirements to further enhance your brand presence.

      Energy-saving and revenue-increasing, this model employs the best of AI, IoT, and cloud technology to provide you with sales opportunities and ease of management never before seen in the automated retail industry.

    • by kriston ( 7886 )

      The vendor says they don't store any data and only use the information to activate the screen.

      Why does Arse Technica say it has a database?

  • by ls671 ( 1122017 ) on Saturday February 24, 2024 @06:13AM (#64265030) Homepage

    I still wear my almost full-face covid mask when I go out, perfect alibi to defeat facial recognition /s

    • I suggest you stay inside. That would be even more effective at defeating facial recognition.
    • by test321 ( 8891681 ) on Saturday February 24, 2024 @08:38AM (#64265162)

      They collect demographics. The covid mask gets you binned into the "person with a covid face mask" population, and you get served health-related advertisements. You still get reliably recognized as a category, they still show you ads that are relevant for your expected concerns, and they get paid for the ads. They win.

      I personally buy drinks and sweets from the machines that only take coins (which are not connected; I refuse to install an app to pay for a chocolate bar). One option with the connected machines is to poison the well: wear pictures taken from National Geographic print editions, or a tablet showing faces of random ages and ethnicities. Let's show them how successful the new M&M flavour is with Maasai warriors in tribal face paint and Inuit elders in caribou-skin parkas.

      • Any information you give narrows the field. This includes the [x] Do Not Track checkbox.

        The only way to poison the well is to start presenting randomized information. Today you've got that mask on, tomorrow you don't, and on Thursday you wear big ears.
      • Why so much effort? Slap a piece of tape over the camera.

  • University of Waterloo might throw the M&M machines out if the source code wasn't written in Fortran...
  • by Nrrqshrr ( 1879148 ) on Saturday February 24, 2024 @07:29AM (#64265080)

    .exe, huh? Wouldn't have that problem if it were on Linux.

  • And if I encounter a vending machine with any high tech built in, I will look for a camera lens, and if I find one I will stick a blob of chewed gum to it. Some colored fingernail polish will work too; something difficult to clean off. How about epoxy glue?
    • by Misagon ( 1135 )

      If the lens is made of plastic then superglue would make the plastic fog up in addition to sticking to it.

  • If, as the company claims, no images are being stored or used and it's only there to detect motion, could someone put on a Halloween mask and still have the machine work? There's still a face to use for motion detection, just not a human face.

    If someone at the University could please try this out and let us know, that would be great.

  • ... though of course it's just the manufacturer's word for it.

    I mean it's possible that it just 1. takes an image every X seconds, 2. recognizes if there is a face in it or not, and then 3. discards it.

    It's also possible that it decides that John Doe likes peanuts vs. plain, and keeps a massive database somewhere.
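
    To illustrate that second possibility, here is a sketch of the kind of record that would turn "anonymous demographics" into identifying data; every field name here is hypothetical and not taken from Invenda.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class Impression:
        machine_id: str          # fixed machine -> known place
        timestamp: datetime      # known time
        estimated_age: int       # demographics inferred by the camera
        estimated_gender: str
        purchase: Optional[str]  # sales data joined in

    # Place + time + demographics + purchase can single out one person
    # on a small campus, which is why consent matters.
    log = [Impression("UW-SLC-03", datetime(2024, 2, 24, 12, 5), 21, "male",
                      "Peanut M&Ms")]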

  • So their vending machines run Windows...

    • That's quite an assumption.

      I just made a "note to self" to always name my executables something inappropriate.

      globalThermoNuclearWar.exe for the s/w that controls the lights.

      • Step 1...

      • by jmccue ( 834797 )

        Well, I think that is a very fair assumption :) If it were on Linux, I doubt you would see that error even if such an error occurred.

        Now, based upon that error, it looks like the images are processed locally and the results are stored on something like a flash drive. Maybe that flash drive is replaced and brought to the office when the machine is replenished.

  • by iAmWaySmarterThanYou ( 10095012 ) on Saturday February 24, 2024 @09:58AM (#64265286)

    Eom.

  • It recognizes whether it's a face or not, nothing more or less. Absolutely nothing to see here.
  • by gweihir ( 88907 ) on Saturday February 24, 2024 @11:05AM (#64265358)

    Obviously, such a system can _never_ be GDPR compliant. Facial data is among the most critical data with respect to the GDPR (biometrics). That means it must not, under any circumstances, be recorded, stored or processed without explicit, informed (!) consent, and that consent must be given in advance. Oh, and that consent can be withdrawn; if the person recorded is underage, it can be withdrawn at any time, without a reason and without limit, and then all data pertaining to that person must be irretrievably destroyed. Hence any claim of "GDPR compliance" for this abomination is simply a direct lie.

    Incidentally, if such a machine ever makes it to Germany, it may well get classified as a "covert surveillance device" and then mere possession becomes illegal. Camouflaged microphones or cameras are illegal to possess in Germany.

    • by bsolar ( 1176767 )

      Obviously, such a system can _never_ be GDPR compliant. Facial data is among the most critical data with respect to the GDPR (biometrics). That means it must not, under any circumstances, be recorded, stored or processed without explicit, informed (!) consent, and that consent must be given in advance.

      This is the definition of "biometric data" according to the GDPR (emphasis mine):

      'biometric data' means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which *allow or confirm the unique identification* of that natural person, such as facial images or dactyloscopic data;

      The functionality they claim is that the machine can identify when a face is before the machine and a customer interaction is to be expected, not that the machine can identify which individual is in front of the machine based on their face. Only the latter requires biometric data.

      Now of course it's possible they are not being truthful, but that's another matter.

      • by gweihir ( 88907 )

        Well, they would have to blur the image enough to qualify as anonymization under the GDPR. Note that there is no requirement to actually do the identification. As soon as it is possible with the data collected, it is PII.
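
        As a hedged sketch of that blurring idea (assuming OpenCV; the file names are hypothetical), any detected face region could be destroyed before a frame is ever retained:

        import cv2

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        frame = cv2.imread("frame.jpg")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            # Heavy Gaussian blur over the face region before anything is kept
            frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
        cv2.imwrite("frame_anonymized.jpg", frame)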
         

        • by bsolar ( 1176767 )

          Well, they would have to blur the image enough to qualify as anonymization under the GDPR. Note that there is no requirement to actually do the identification. As soon as it is possible with the data collected, it is PII.

          Merely recognizing that "a face" is before the machine does not require any data collection of any kind, let alone biometric data.

          Identifying a specific individual from a face does require the biometric data for the face of that individual to be stored somewhere, but that's not the capability the manufacturer claims the machine has.

          • by gweihir ( 88907 )

            Yes, it does. Just making a picture already qualifies. Also, data collection is a requirement for data processing. And data processing is subject to even stricter regulation. That the data gets deleted immediately after processing does not reduce the requirements for its collection. Or are you suggesting that recognizing "a face" does not require any data or processing of data? Unless that data is fundamentally unsuitable for recognizing individuals, it is PII under the GDPR.

            • by bsolar ( 1176767 )

              Yes, it does. Just making a picture already qualifies. Also, data collection is a requirement for data processing. And data processing is subject to even stricter regulation. That the data gets deleted immediately after processing does not reduce the requirements for its collection. Or are you suggesting that recognizing "a face" does not require any data or processing of data? Unless that data is fundamentally unsuitable for recognizing individuals, it is PII under the GDPR.

              Maybe I'm misunderstanding your argument. You previously stated:

              Facial data is among the most critical data with respect to the GDPR (biometrics)

              This is correct, but does not apply to the machine in the article if it functions as the manufacturer stated. The GDPR makes that very clear in the definition of "biometrics" I quoted above. If there is no "specific technical processing" being used to extract data able to identify an individual, there is no "biometrics data" at play. Explicitly, from the GDPR:

              The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person.

              What I stand corrected about is that the photo itself is "personal data", and that merely collecting it is already regulated, even without any biometric processing.

              • by gweihir ( 88907 )

                Maybe. "Facial data" is, for example, a picture of a face, but can also be a biometric data-set for a face. The potential of turning the first into the second is enough to make the first PII. Hence while your quoted definition is correct, it misses the part where data that can be turned into biometric data has the same restrictions and protection and it does not matter whether the machine can do it, it is enough if "anybody" can do it. For simplicity, precursor data to biometric data is often referred to as

                • by bsolar ( 1176767 )

                  Hence while your quoted definition is correct, it misses the part where data that can be turned into biometric data has the same restrictions and protection, and it does not matter whether the machine can do it; it is enough if "anybody" can do it. For simplicity, precursor data to biometric data is often referred to as "biometric data" when talking about the GDPR, even if that is not strictly correct.

                  The GDPR is crystal clear in the part I quoted above, which I will quote again with emphasis mine:

                  The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data *only when processed through a specific technical means allowing the unique identification or authentication of a natural person*.

                  So a photo of a person is "personal data" in any case, but falls under "special categories of personal data" only if it's actually processed to extract biometric data.

                  Processing a photo of a person without extracting biometric data is explicitly allowed without it falling under the more restrictive "special categories" rules.

                  • by gweihir ( 88907 )

                    You still do not get it. You quote things, but you miss things. Your new quote applies to processing limitations, not to limitations on getting data. It also partially defines biometric data, not PII. Data becomes PII, for example, because it _could_ be processed into biometric data. It becomes biometric data if that processing happens. The difference is that if you process or plan to process PII into biometric data, you need to state that when getting informed consent. If you did not give that information when obtaining consent, the consent does not cover that processing.

                    • by bsolar ( 1176767 )

                      >Seriously, get a clue. I am doing GDPR _audits_ in a regulated environment. And I will now stop responding, because you are not arguing in good faith. What you are doing is that you selected your opinion and then you ignore everything that disagrees with it. That is not rational.

                      I don't care about your credentials, only about your arguments. I did admit when I was wrong before and stated so and I would have no issue in admitting to be wrong again if convinced of that.

                      I think your original argument was simply incorrect. Your new argument I disagree with but I can see being convinced otherwise.

                      That said, if you don't wish to argue further, it's of course fine.

    • >Obviously, such a system can _never_ be GDPR compliant.

      That depends upon what's really happening here.

      As near as I can tell, we have the name of a program, a statement from the company, and a ton of conjecture, upon which is piled more conjecture.

      If the thing is actually building a database and storing multitudes of faces, then, yes, there is no way it could be compliant of much of anything, or even non-nefarious.

      But if it's simply a (supposedly) bright idea to recognize that there *is* a person in front of the machine, then it's a different matter.

  • by packrat0x ( 798359 ) on Saturday February 24, 2024 @11:15AM (#64265376)

    Wait until the prices change, based on the customer's "image".

  • by liquidpele ( 6360126 ) on Saturday February 24, 2024 @11:47AM (#64265442)
    This is so stupid. These students probably use their credit card on the machine too but don't worry about that, and carry a smartphone but don't worry about that, etc. This is so dumb I can only imagine it's like 3 students interviewed and it's just clickbait BS by the news orgs.
  • Jersey Jack Pinball may be in trouble too, but default settings may allow the player to disable it in-game. But what about attract mode?

  • by Murdoch5 ( 1563847 ) on Saturday February 24, 2024 @12:36PM (#64265500) Homepage
    Instead of making a baseless claim, why not dump all the firmware and software stacks, license them open source, and provide build keys? Then provide a way for a user to verify that the build on the vending machine matches the privacy-respecting build, and you're home free. Why trust the claim that the vending machine is behaving? And why does the vending machine have to be "smart"? You can make it dumb and still do some kind of on/off recognition.
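
    As a sketch of the verification step proposed above (file names are hypothetical; this assumes a reproducible open-source build exists to compare against):

    import hashlib

    def sha256_of(path: str) -> str:
        # Hash the file in chunks so large firmware images don't fill RAM
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    dumped = sha256_of("firmware_dumped_from_machine.bin")
    published = sha256_of("firmware_built_from_source.bin")
    print("verified" if dumped == published else "MISMATCH: machine runs unknown code")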

    I would demand the machines be confiscated and have all internal storage of any kind removed and destroyed by a NIST 800-88 compliant service, then get a warrant to search for and destroy any and all biometric storage systems at the manufacturer. If they're not storing any data in violation of PIPEDA, then they have absolutely nothing to worry about and will, with glee, open up and welcome the search.
    • You want all internal storage of any kind removed and destroyed?

      So the inventory of product in the machine, the price of a purchase, the menu of available items, all of that should be destroyed?

      What's left? A nice black screen and a pile of stuff behind it that can't be sold?

      • Everything destroyed; otherwise how are you going to make sure any stored information is gone? How are you going to prove that no biometric data has remained in the machines? If they're taking any care in the collection, it's not stored as pictures; it's stored heavily encoded. That means the menu data might have the biometric data encoded into its metadata. This is a serious and major privacy concern, and if U of W is being serious about it, they have to take intense measures to protect student privacy.
  • ... to see who is eating all the green M&Ms.

  • dont.name.controversial_software.descriptively.exe

  • ...vending machines DON'T need facial recognition. ...nor do they need AI, btw.

    Vending machines have existed for about 100 years (someone look it up).
    I wonder how they did it?
  • With tape. If that doesn't work, use spray paint.
  • It is the same with Microsoft Purview. M$ claims that it is GDPR (or in Germany, DSGVO) compliant. This is a blatant lie, as it is not possible: in order to receive a message that is encrypted using Purview, you are required to log in to the MS infrastructure, otherwise you cannot get the message. It is the same here. This vending machine CAN NOT be GDPR compliant. They are lying! GDPR compliance requires the user to be informed about every step in which any personal data is processed. As facial biometric data is among the most sensitive personal data, informed consent would be required up front.
  • ... Image Database of College Students

    Where's this database of images the headline declared exists? This is political click-bait: people should worry much more about their phone recording their lives (usually with the help of the owner) and their credit card recording their lives (well, the parts of it that cost money).

    If students really care about privacy and political machinations, they should worry about the absence of privacy laws, the lack of truth in US advertising and US campaign money buying votes.

  • If you don't like it, don't use them.
    I spent my student and work life making my lunch myself
    and being in control of my nutrition.

  • The headline says a database from the vending machine FR scanner has been revealed, but TFA says nothing about any database whatsoever (from the vending machines)?

  • Rather than speculating on things like "But where is the database of images? How would they even use it?", instead ask why they would spend real money to implement facial recognition in a vending machine. The statements in the article make it sound like they somehow want to be responsive to whether an actual person is present, which sounds good, but that couldn't possibly save more than a couple bucks a year in power usage versus running full-time, or a couple bucks in capital costs versus installing a "Start transaction" button.

  • Immediately unplug those machines so no network foolery can happen.
    The company can "donate" a couple to non-destructive testing by the EFF.
    I would expect they have the expertise to plumb the memory in these systems to find hidden stored images or suspicious gigabytes hashed to be unreadable.
