Artists Claim 'Big' Win In Copyright Suit Fighting AI Image Generators (arstechnica.com) 53

Ars Technica's Ashley Belanger reports: Artists defending a class-action lawsuit are claiming a major win this week in their fight to stop the most sophisticated AI image generators from copying billions of artworks to train AI models and replicate their styles without compensating artists. In an order on Monday, US district judge William Orrick denied key parts of motions to dismiss from Stability AI, Midjourney, Runway AI, and DeviantArt. The court will now allow artists to proceed with discovery on claims that AI image generators relying on Stable Diffusion violate both the Copyright Act and the Lanham Act, which protects artists from commercial misuse of their names and unique styles.

"We won BIG," an artist plaintiff, Karla Ortiz, wrote on X (formerly Twitter), celebrating the order. "Not only do we proceed on our copyright claims," but "this order also means companies who utilize" Stable Diffusion models and LAION-like datasets that scrape artists' works for AI training without permission "could now be liable for copyright infringement violations, amongst other violations." Lawyers for the artists, Joseph Saveri and Matthew Butterick, told Ars that artists suing "consider the Court's order a significant step forward for the case," as "the Court allowed Plaintiffs' core copyright-infringement claims against all four defendants to proceed."


Comments Filter:
  • This is not a big win in any way, shape, or form. It merely means their case wasn't summarily dismissed.

    It's fine to have your day in court, but I wouldn't be running around claiming it's a big win. There's still very much a case to lose.

    • Given that this is a second attempt, after the first one was sent back to the "artists", the claim going forward is, indeed, a win, especially if they wrote it themselves. Maybe they got a subscription for a better legal AI this time around instead of using the free chatgpt.

    • by GrahamJ ( 241784 )

      I didn't die today so I win life!

    • The reason they publish this as a big win is twofold:
      1. It is the first time such a case isn't dismissed out of hand.
      2. They won't win, but if they make enough noise they may get a handout.

  • The only people who can claim copyright are those dudes in the Lascaux caves 17,000 years ago.

  • It was literally a ruling to allow the argument, not a WIN FOR the argument. This is what happens when you let morons try to be lawyers.
  • Copyright? (Score:4, Interesting)

    by Hoi Polloi ( 522990 ) on Wednesday August 14, 2024 @10:40PM (#64707398) Journal

    How can you claim copyright over someone or something learning? Do I need to pay someone because I looked at their work and learned from it?

    • Re: (Score:3, Informative)

      by Brain-Fu ( 1274756 )

      A computer is not a person. So, someone looking at art is a different kind of action than training an AI model on it. The equivalence you seem to be drawing is false, and is especially false according to the law.

      So, that's precisely how you can claim copyright infringement when a computer model is trained on art and then replicates it, even if you can't claim copyright infringement when a person looks at art and learns from it.

      I think you already know this and are trying to use a logical fallacy to justif

      • "and is especially false according to the law."

        case dismissed then, because some random dude on Slashdot says so! Logical fallacy my arse.

      • by Anonymous Coward

        If a fucking corporation is a person, so is a computer. There isn't anything in the law explicitly denying this.

        Oh wait, they are suing CORPORATIONS, AKA PEOPLE which are doing the learning.

      • A computer is not a person. So, someone looking at art is a different kind of action than training an AI model on it.

        I agree they are different.

        The equivalence you seem to be drawing is false, and is especially false according to the law.

        What is the relevance of the differences according to the law?

        So, that's precisely how you can claim copyright infringement

        You've only established that there are differences between a person and a machine learning. You have not established the relevance of these differences with respect to applicable copyright law.

        when a computer model is trained on art and then replicates it, even if you can't claim copyright infringement when a person looks at art and learns from it.

        If a person looks at a copyrighted work and produces an unauthorized derivative of that work, they have violated copyright. There is no copyright interest in transformative works, regardless of what produces that work. Copyright la

      • Re:Copyright? (Score:4, Insightful)

        by Tora ( 65882 ) on Thursday August 15, 2024 @08:06AM (#64707970)

        Except it doesn't "replicate" it — that argument shows how much you don't understand what is happening in generative AI.

        • It's very simple to show that art created by generative AI isn't being derived from copyrighted inputs - remove them from the training database and show that the output doesn't change.
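A minimal sketch of the comparison described in the comment above, assuming two hypothetical Stable Diffusion checkpoints already exist (one trained with a given set of works, one retrained without them). The retraining step itself is not shown, and the checkpoint names, prompt, and pixel-level metric are illustrative assumptions, not a real experiment:

```python
# Sketch of the ablation test described above: generate the same prompt with
# the same seed from two checkpoints -- one hypothetically trained with a
# given artist's works, one trained without them -- and compare the outputs.
# The checkpoint identifiers below are placeholders, not real published models.
import numpy as np
import torch
from diffusers import StableDiffusionPipeline

PROMPT = "a castle on a cliff at sunset, oil painting"
SEED = 1234

def generate(checkpoint: str) -> np.ndarray:
    """Generate one image from the given checkpoint with a fixed seed."""
    pipe = StableDiffusionPipeline.from_pretrained(checkpoint)
    generator = torch.Generator().manual_seed(SEED)
    image = pipe(PROMPT, generator=generator).images[0]  # PIL image
    return np.asarray(image, dtype=np.float32) / 255.0

# Placeholder names for the "with" and "without" models.
full = generate("example/model-trained-on-full-dataset")
ablated = generate("example/model-trained-without-artist-works")

# Pixel-wise difference is a crude proxy; a perceptual metric such as LPIPS
# would say more about whether the outputs meaningfully diverge.
mse = float(np.mean((full - ablated) ** 2))
print(f"Mean squared difference between the two outputs: {mse:.6f}")
```

In practice the expensive part is the retraining itself, which is a large reason this kind of ablation argument is rarely settled empirically.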

      • Comment removed based on user account deletion
        • training an AI model on it...not copyright infringement

          This is likely legally incorrect. Some courts have held that the mere act of loading a program into memory is making a copy for copyright purposes. If someone is not entitled to copy a program, then he commits a copyright offense, subject to related legal penalties, each and every time the program is loaded. There is no reason to think that the same reasoning won't apply to all forms of copyrighted material within these courts.

          There is every reason to believe that the owners of these machine learning models

    • Re:Copyright? (Score:5, Informative)

      by Misagon ( 1135 ) on Wednesday August 14, 2024 @11:08PM (#64707428)

      Because training an AI is not the same thing as learning.
      See the news posted yesterday [slashdot.org] about training in language contexts. It is the same for images.

      Human learning uses reasoning: past experiences are used to interpret and reason about new information and turn it into knowledge. A human who learns exercises herself to gain deeper understanding.

      Training an AI is inserting data into a statistical model. Then you can get statistical data out of it. An AI does not have knowledge, reasoning or understanding about anything in any way similar to how humans do.
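A toy illustration of the "insert data into a statistical model, get statistical data out" framing above, using a one-dimensional kernel density estimate; this is a deliberately simplified stand-in for the commenter's description, not how diffusion models are actually trained:

```python
# "Insert data into a statistical model, then get statistical data out":
# fit a kernel density estimate to observed samples and draw new samples
# from the fitted model.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# "Training data": points drawn from two clusters.
data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])

# "Training": fit the statistical model to the data.
model = gaussian_kde(data)

# "Generation": draw new points that follow the fitted distribution.
samples = model.resample(5).ravel()
print("generated samples:", np.round(samples, 2))
```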

      • Because training an AI is not the same thing as learning.

        And learning is not a defined factor in copyright law. Whether AI can actually "learn" or not is irrelevant to this case; the process is what matters. Seeing, committing the non-original work to memory, and creating a derivation is what this case will ultimately hang on.

      • Because training an AI is not the same thing as learning.
        See the news posted yesterday about training in language contexts. It is the same for images.

        Human learning uses reasoning: past experiences are used to interpret and reason about new information and turn it into knowledge. A human who learns exercises herself to gain deeper understanding.

        Training an AI is inserting data into a statistical model. Then you can get statistical data out of it. An AI does not have knowledge, reasoning or understanding about anything in any way similar to how humans do.

        What is the relevance of these remarks WRT copyright law? I think everyone here at least understands AI != human. They don't work the same way and they each have different properties. So what?

        If you ask an AI to place the Eiffel Tower on a little island surrounded by a clear lake, you'll see the shadow the tower casts on the surrounding landscape. You'll see realistic global illumination and a specular reflection of the tower in the water. You'll see realistic plant life and landforms. Nobody pr

    • "Do I need to pay someone because I looked at their work" - I mean, that's the basis for going to the movies, the museum, a concert or buying music...you know, leveraging copyright and all

    • by Luckyo ( 1726890 )

      It's the dream of the copyright industry. They have always wanted to lock down the learning process, as that would be the ultimate rent-seeking win. Every person learns almost everything they know from other people. Imagine being granted the right to force payments for every material used in every learning event.

      This is also why no copyright law allows for this sort of abuse. Because it would end human progress as we understand it.

    • The machines aren't learning in any meaningful sense of the word. They're not intelligent. The only honest thing about AI is that it's artificial.

    • It's not the "learning" you have to worry about. It's 1) the unlicensed copying of material before the learning takes place and 2) the massive unlicensed redistribution of the unlicensed material in whole or in part when the chatbot talks with massive numbers of people. And if those people are paying you for access to your chatbot, then you're reselling the unlicensed material in whole or in part which is usually a bigger crime.

      A good way to protect yourself as an AI company is to train your model and nev

    • by jools33 ( 252092 )

      There is a difference between looking at something with your eyes and so memorising a work of art, and taking countless high-megapixel images as input, feeding them into a large data model, and training on all the pixels in all of those images. You can claim copyright infringement at the point where an artwork is accessed without permission and its data is essentially digitally read and then saved. It has been shown that models trained in this way can produce as output sections of the origi

    • In order for the computer to "learn", it first has to make a copy of the artwork; transferring an image file from one computer to another is making a copy in a "fixed, tangible form". A person looking at a piece of art does not make a copy.
    • Nice try, but an AI is neither "someone", nor is it learning.
  • At the end of this case, the Judge will list out what is allowed and what is not allowed. Possibly a requirement to combine # or more artistic styles to create a 'new' thing that does not infringe on any single artist.

    The AI companies might lose this original case, but they will then design their next AI around the legal requirements the Judge sets forth.

    So the next time they build an AI it will technically be legal while still screwing over the artists.

    Long term, artists are in major trouble. It has

  • by Powercntrl ( 458442 ) on Wednesday August 14, 2024 @11:18PM (#64707442) Homepage

    Copyright was only intended to apply to the specific instance of a completed work, not an idea or a style. If I want to write a story about teenagers with extraordinary abilities going to a secret school where they learn how to hone their special skills, the only court I should run afoul of is the court of public opinion (for lack of originality).

    The fact that AI content generation functions at all is due to the fact that most creative endeavors really are just derivative of earlier works.

    • The generated art isn't the copyright infringement. The copies that are made in the process of training the algorithm would be the infringement.
  • "this order also means companies who utilize" Stable Diffusion models and LAION-like datasets that scrape artists' works for AI training without permission "could now be liable for copyright infringement violations, amongst other violations."

    now i can get rich generating a bazillion stable diffusion images in some particular style and signed by me, polluting the web with them and suing the next ai scraper that comes along!

  • As others have noted, the importance of the outcome is exaggerated to give it interest. You could probably fill a newspaper every day with stories about the outcome of some motion to dismiss - I suspect most are dismissed since it means the case has no merit. Actually, given the state of most newspapers, you probably could fill every newspaper. It's only a big win because the outcome was not certain. The debate over the legal status of AI is not going to get settled for a very long while.
  • This is a complex topic so I'll summarize:

    1. Copyright terms have been extended so many times that the tradeoff between good to society and incentive to artists is now more like "who cares about art, and may the grandkids of the grandkids make money and fuck incentive to artists."

    2. Artists are screwed by everyone. Even YT influencers are screwed by YT. You can't get rich being an artist, and yes Mr. Beast is an outlier and good on him because he does good for others.

    3. Society loses because informatio

  • The problem is, countries like China won't care what these courts say, and will continue to allow their AI models to train on everything. In the end this will just hinder American / Western AI, and the Chinese will dominate AI going forward (which will NOT be a good thing for these artists who think that they just "won").
  • If I go to art school, study the art of the masters, study every painting, get my degree, and start painting my own artwork, am I guilty of a copyright violation? If an AI goes to a digital art school, studies the art of the masters, studies every painting, and starts painting its own artwork, is it guilty of a copyright violation? This has nothing to do with AI or art but with cold hard cash. Someone wants money for using art.
  • But do these artists also compensate the other artists whom they copy and whose styles they replicate? And do they also go after other human artists who do it? If not, then they are just big hypocrites.
  • ... DeviantArt.

    DeviantArt isn't an engine; it's a website like Facebook, YouTube, or Reddit. Most posts aren't "AI generated", so this is artists attacking other artists: many DeviantArt contributors sell their work using other websites. So this attack won't stop whatever the attacking artists think is happening.
