EU Privacy

EU Reportedly Wants Strict Limits For 'Indiscriminate' Facial Recognition (cnet.com)

The European Commission is reportedly considering sweeping reforms to facial recognition regulation to protect citizens from public surveillance, the Financial Times reported on Thursday. From a report: Quoting an EU official, the newspaper said new legislation could limit "the indiscriminate use of facial recognition technology." European citizens would be given powers to "know when [facial recognition] data is used." The Commission didn't comment directly on the plans, but a spokesman pointed to a high-level expert group that was set up in June to consider the need for new regulation on tracking and profiling, including facial recognition.

Discussions around regulating facial recognition technology follow the introduction of a number of public trials around Europe, some of which have been conducted without people knowing they were taking place. The UK's data protection watchdog is investigating the use of the technology to monitor crowds around London's King's Cross. Just this week, Sweden's national data protection authority imposed a fine of almost 200,000 kronor ($20,700) on a school that trialed the tech to monitor the daily attendance of students. According to the EU's General Data Protection Regulation (GDPR), which was introduced last year, this use of the technology breached student privacy rights.


Comments Filter:
  • This is a great start, now all they need to do is put a strict limit on what constitutes 'murder and/or undue injury or damage' with regards to testing self driving.
    • This is a great start, now all they need to do is put a strict limit on what constitutes 'murder and/or undue injury or damage' with regards to testing self driving.

      That's easy enough, make the car makers financially responsible for any damage caused by the self driving vehicles they make. If aircraft manufacturers have to pay up every time some software bug in their autopilot or fly-by-wire system sends a passenger liner crashing into the ground then car manufacturers can pay up too when they fuck up.

      • What about when they kill someone? They're certifying their cars safe by putting them on the road. So not predicting a situation would be a premeditated offense.
        • You luddites crack me up. While you were scrawling that, a bunch of people got run over or killed in crashes with human drivers, and yet you are so worried about the big bad corporations and their autonomous cars that will save hundreds of thousands of lives someday.
          • "Some day" being the key phrase there, as in there may be unicorns some day. A key requirement to improve on safety is for 98% of the driving population to be able to afford them for one thing, don't see how that will happen. Yes the price of technology comes down, but how cheap will a powerful lidar backed up by radar backed up by cameras ever be?
            • Various people have overpromised self-driving cars as if they're already here or only 5 years away. It's probably 10 or 15 years away for really good, completely hands-off self-driving cars, maybe 20 years for it to be standard, but that isn't that long, and there will be deaths and missteps along the way.
          • You luddites crack me up. While you were scrawling that, a bunch of people got run over or killed in crashes with human drivers, and yet you are so worried about the big bad corporations and their autonomous cars that will save hundreds of thousands of lives someday.

            And those human drivers will be held to account for that. But please do explain to us: why should the owner of a self-driving car be punished for the failure of a car manufacturer to create a properly working self-driving car? It's not like he's driving the car; the AI is, and the driver did not write it, did not design the sensors, and did not train the AI either. When an aircraft falls out of the sky due to a fly-by-wire system or autopilot failure, the manufacturer is liable. Why should that not apply to car manufacturers?

          • When I run over someone, I end up in court. Care to tell me why this shouldn't be the case if it's a company whose product runs over someone?

            • There aren't going to be any changes... Just more idiots thinking that car manufacturers have their back... Just like they think banks have their back.

              Regardless of whether a car drives itself or not, the person in charge of the vehicle is responsible for what it does. If you want to blame the manufacturer, there needs to be a clear indication that it was a defect that was unknown and difficult to counter (a.k.a. MCAS).

              It's the same with aircraft. If a pilot puts in the wrong cargo weight and the autopilot c
        • What about when they kill someone? They're certifying their cars safe by putting them on the road. So not predicting a situation would be a premeditated offense.

          Premeditation is when you plan a crime out in advance and then implement that plan. A car manufacturer who produces a badly written self-driving AI is not planning the murder of his customers. He is basically just doing the same thing as Boeing did when they outsourced control software development to unqualified engineers, two airliners crashed into the ground and 338 people died. At worst that is involuntary manslaughter due to criminal negligence. I'm pretty sure whatever their reasons were these Boeing e

  • 20 years ago the DMCA made Slashdot angry. In 2019, governments can tell you what you're allowed or not allowed to do on your own computer, and Slashdot sighs with relief. Did we change, did the world change, or both? I've changed in plenty of ways, but not on basic civil liberties, such as being allowed to program my computers however I want to.

    How offensive this turns out to be, I suppose, depends on what they really mean by "indiscriminate." But if that includes my computer in my house looking out my front door,

    • If I want my system to make sense of who is who, that's my business. It's my agent, thinking my extended thoughts, helping to make up for my feeble brain.

      I think this will be more of an issue as surveillance tech that was once only in the realm of governments and large corporations gets cheaper and available to the average user. At some point, it will be possible for someone to cheaply scan faces with a wearable device and ID them, or have their dashcam capture and recognize license plates, as examples. While each individual could only capture a small set of the population, as the tech becomes more commonplace the aggregate data could provide a rather complete picture.

      • At some point, it will be possible for someone to cheaply scan faces with a wearable device and ID them

        Totally! Whose dream isn't that someone faintly familiar walks up and you see their name floating over their head? Right now we're stuck with the awkward thoughts of "Shit, he knows my name but I don't remember his! I should have done that say-their-name-back-to-them trick when we met... whenever that was. Damn, at least where do I know this guy from?"

        That's not dystopian. A horror show can be built up fro

    • Yeah, how unreasonable. I should be able to take pictures of everyone on the planet and stalk them all over the place if I want to.
    • Talk about straw men. You think it would be more acceptable to a privacy-enforcing country if you put a film camera in the same place? "On a computer" is not the problem here.
    • I've changed in plenty of ways, but not on basic civil liberties, such as being allowed to program my computers however I want to.

      So you're telling us you've created your own facial recognition software?

      • So you're telling us you've created your own facial recognition software?

        No, but I reserve the right. I don't have a gun either (they strike me as useless toys) but if you tell me I can't have one, I object. And if you insist I not have one, then I'm getting one (unless I think you're trying to trick me into getting one as a marketing campaign, heh). And I think having my computers recognize faces would be much less useless than that. Recognizing faces has lots of great applications, both dystopian and comp

    • Governments can tell me what I am allowed to do on my computer.

      I can, though, decide to ignore it and considering that they are usually not able to follow up with a bite to their bark, we're back at square one.

      Once, I fought the government. Until I noticed that it's less hassle to just ignore it.

  • Facial recognition is not a good idea, and it is easily defeated.
