
EU Votes To Ban AI In Biometric Surveillance, Require Disclosure From AI Systems

European Union officials have voted in favor of stricter regulations on artificial intelligence, including a ban on AI use in biometric surveillance and a requirement for AI systems like OpenAI's ChatGPT to disclose when content is generated by AI. Ars Technica reports: On Wednesday, European Union officials voted to implement stricter proposed regulations concerning AI, according to Reuters. The updated draft of the "AI Act" law includes a ban on the use of AI in biometric surveillance and requires systems like OpenAI's ChatGPT to reveal when content has been generated by AI. While the draft is still non-binding, it gives a strong indication of how EU regulators are thinking about AI. The new changes to the European Commission's proposed law -- which have not yet been finalized -- intend to shield EU citizens from potential threats linked to machine learning technology.

The new draft of the AI Act includes a provision that would ban companies from scraping biometric data (such as user photos) from social media for facial recognition training purposes. News of firms like Clearview AI using this practice to create facial recognition systems drew severe criticism from privacy advocates in 2020. However, Reuters reports that this rule might be a source of contention with some EU countries that oppose a blanket ban on AI in biometric surveillance. The new EU draft also imposes disclosure and transparency measures on generative AI. Image synthesis services like Midjourney would be required to disclose AI-generated content to help people identify synthesized images. The bill would also require that generative AI companies provide summaries of copyrighted material scraped and utilized in the training of each system. While the publishing industry backs this proposal, according to The New York Times, tech developers argue against its technical feasibility.

Additionally, creators of generative AI systems would be required to implement safeguards to prevent the generation of illegal content, and companies working on "high-risk applications" must assess their potential impact on fundamental rights and the environment. The current draft of the EU law designates AI systems that could influence voters and elections as "high-risk." It also classifies systems used by social media platforms with over 45 million users under the same category, thus encompassing platforms like Meta and Twitter. [...] Experts say that after considerable debate over the new rules among EU member nations, a final version of the AI Act isn't expected until later this year.
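The draft, as described, doesn't spell out how a disclosure of AI-generated content would actually be carried. Purely as an illustrative sketch of the idea (nothing here is mandated by the Act, and the field names are made up), a generation service could attach a machine-readable marker to its output, for example as PNG text chunks via Python's Pillow library:

    from PIL import Image, PngImagePlugin

    def save_with_disclosure(img, path, generator):
        # Attach hypothetical disclosure fields as PNG text chunks.
        info = PngImagePlugin.PngInfo()
        info.add_text("ai_generated", "true")   # made-up flag, not any mandated standard
        info.add_text("generator", generator)   # e.g. the model or service name
        img.save(path, "PNG", pnginfo=info)

    def read_disclosure(path):
        # Return whatever text chunks the file carries (empty dict if none).
        return dict(Image.open(path).text)

    synthetic = Image.new("RGB", (64, 64), "gray")   # stand-in for a generated image
    save_with_disclosure(synthetic, "out.png", "example-image-model")
    print(read_disclosure("out.png"))   # {'ai_generated': 'true', 'generator': 'example-image-model'}

Whether metadata like this (which is trivially stripped) would satisfy the transparency requirement is exactly the kind of detail the final text and national implementations would have to settle.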


  • Artificial Dumbness (Score:5, Interesting)

    by stooo ( 2202012 ) on Friday June 16, 2023 @02:17AM (#63607088) Homepage

    So we can just rebrand the tech as "Artificial Dumbness" and use it anyway?
    AD is a better fit anyway.

    • by AmiMoJo ( 196126 ) on Friday June 16, 2023 @04:46AM (#63607226) Homepage Journal

      You can read the act here: https://artificialintelligence... [artificial...enceact.eu]

      You will see that it's not written like a typical law, with specific legal definitions and rules. That's because the way it works in the EU is that individual member states have to craft their own laws, and then the EU monitors them to check that the laws work the way they were intended and achieve the intended goals.

      As such, it lays out what the goals are and the terms on which success will be judged.

      Needless to say, such an obvious loophole as simply not calling it AI would be considered a failure, and the country that implemented it that way would need to fix the problem.

    • by ranton ( 36917 )

      They do define what they mean by AI:

      (a) Machine learning approaches, including supervised, unsupervised and reinforcement learning, using a wide variety of methods including deep learning;
      (b) Logic- and knowledge-based approaches, including knowledge representation, inductive (logic) programming, knowledge bases, inference and deductive engines, (symbolic) reasoning and expert systems;
      (c) Statistical approaches, Bayesian estimation, search and optimization methods.

      I think this is pretty dumb, but they do a
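      For a sense of how broad that definition is, consider a toy sketch (purely illustrative, assuming Python with scikit-learn, and not taken from the Act): a four-point straight-line fit already qualifies as "supervised machine learning" under item (a) and as a "statistical approach" under item (c).

          from sklearn.linear_model import LinearRegression

          # Trivial supervised learning: fit y = 2x from four labelled examples.
          X = [[1.0], [2.0], [3.0], [4.0]]   # training inputs
          y = [2.0, 4.0, 6.0, 8.0]           # training targets

          model = LinearRegression().fit(X, y)   # "training" falls under the quoted definition
          print(model.predict([[5.0]]))          # -> [10.]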

    • Preventing AI systems from being trained on publicly available data is pointless. Every picture on every website and every publicly accessible social network has been or will be analyzed for training purposes; in any case, you cannot stop AI from watching TV shows and other mass media. As for the EU "protecting" itself from AI: go ahead, prevent legal public AI products (didn't France just put 130 million USD into a startup with no business plan?). People will be using AI no matter what the EU decides.

  • I have a robot whom I take with me to see an exhibition. It helps him learn to paint.
    But, when he paints something, he needs to disclose copyrighted sources. Does it apply to every picture he ever used to learn?

    This scenario shows why the proposal is wrong on so many levels. When a human learns, he is not a slave to the copyright owners of the material he used to teach himself. Machine learning should likewise not be chained to copyright holders (and thus forced to use only free materials to avoid paying hefty royalties).

    • This could go down much like how humans build on each others' creations:

      Say an art lover visits countless museums, reads countless books on various art styles, does some work of their own, and creates a unique artwork. We call that "inspiration". This is fine.

      Now suppose the new work has an uncanny resemblance to a specific existing work. Basically a 1:1 copy with some tweaks. We call this "stealing" or similar terms. For some purposes this is fine (for example if you 'stole' a basic concept, for parody, research etc), for other purposes (eg. selling the work for profit) it is not. In case of conflict, a court of law will decide, using intent & other 'yardsticks' as guideline.

      • Now suppose the new work has an uncanny resemblance to a specific existing work. Basically a 1:1 copy with some tweaks. We call this "stealing" or similar terms. For some purposes this is fine (for example if you 'stole' a basic concept, for parody, research etc), for other purposes (eg. selling the work for profit) it is not. In case of conflict, a court of law will decide, using intent & other 'yardsticks' as guideline.

        You can't steal an idea. That's why the standard for copyright violation is recognizable [literally copied] elements. A reimagining of the same artwork has to be so similar as to be nearly indistinguishable before it's in violation, basically an obviously deliberate copy.

        • > You can't steal an idea.

          *** start nit picking

          The GP wrote "We call this "stealing" or similar terms."

          Which seems to me like a reasonable statement. Even if he had only written "We call this "stealing"", I'm going to read the quotes around stealing as meaning stealing on some peculiar definition of the word, and therefore it follows that one can 'steal' an idea.

          end nit picking ***

          • The GP wrote "We call this "stealing" or similar terms."
            Which seems to me like a reasonable statement.

            It might be anywhere but here, where people are expected to know the difference between copyright infringement and theft. This is something we settled long before you discovered Slashdot.

            • > people are expected to know the difference between copyright infringement and theft

              I'm not disagreeing. I think it's pretty clear he knows that when he wrote -- we call this "stealing" or similar terms -- and similarly when he wrote -- if you 'stole' a basic concept.

              And I like it when someone plays with language the way he did. He didn't use some accepted or legal terminology but his comment doesn't need that to make sense. Hence my nit picking too.

  • Infringing copyrights is indeed cheaper and easier than complying with the law. LAION even contains child porn; having a clean dataset is clearly not a priority.
    • Infringing copyrights is indeed cheaper and easier than complying with the law.

      Are you sure? Given the bizarre fines per song that have been the result of download cases, some judge should "set an example" and charge the same fines per byte / instance of training data if it is illegally harvested.

      • by fazig ( 2909523 )
        If the organization that does the infringing is more powerful than the music copyright mafia, then infringing might be cheaper for them.
        If you're just an average Joe infringing on the music copyright mafia, then you're pretty much screwed.

        Here where I live in Germany, we have the GEMA mafia. They have an effective monopoly on the music market in Germany: if you want your music to be on the radio, you have to become a member of that pyramid scheme, where most of the money goes to the higher ups an
      • I don't expect random artists living on the other side of the planet, whose whole portfolio was used in an AI model, to be able to sue a big US corp. That's why it must be regulated, with an external audit to check the dataset used and to protect the rights of small artists. And those rights are more important than having an open-source image generator, imo.
  • Red Flag (Score:4, Funny)

    by flyingfsck ( 986395 ) on Friday June 16, 2023 @05:54AM (#63607318)
    It is strange that the EU doesn’t require a man with a red flag to walk in front of the AI.
  • More legislation to respect and protect the average person, passed by the E.U. Something the U.S. *should* be leading on. I guess Europe has fewer billionaires and less powerful corporations to stop things like this.
    • I guess Europe has fewer billionaires and less powerful corporations to stop things like this.

      Well, yes, this may be a factor. But it may also be that we don't cheer (or criticise) people just for being rich. They are just people, after all.

    • by ranton ( 36917 )

      More legislation to respect and protect the average person, passed by the E.U. Something the U.S. *should* be leading on.

      I hope the US does lead in quality legislation regarding AI in society, but this EU legislation is a dumpster fire. I don't want the US to lead in that.

  • Who cares if there's some computer trained to make video surveillance work better? Besides the manufacturer, their competitors and customers that is. Am I to be terrified that a fingerprint scanner will work too well?

    Or are they just worried a computer will point out that the EU is rife with fatal flaws, and people will listen to it instead of the crackpots in charge?

