Those claiming AI training on copyrighted works is “theft” misunderstand key aspects of copyright law and AI technology. Copyright protects specific expressions of ideas, not the ideas themselves. When AI systems ingest copyrighted works, they’re extracting general patterns and concepts - the “Bob Dylan-ness” or “Hemingway-ness” - not copying specific text or images.

This process is akin to how humans learn by reading widely and absorbing styles and techniques, rather than memorizing and reproducing exact passages. The AI discards the original text, keeping only abstract representations in “vector space”. When generating new content, the AI isn’t recreating copyrighted works, but producing new expressions inspired by the concepts it’s learned.
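To make the “vector space” point a bit more concrete, here’s a toy Python sketch (my own illustration, not how any actual model is trained): it reduces a passage to a fixed-length list of numbers via feature hashing. Only coarse word-usage patterns survive, and the original wording can’t be read back out of the vector. Real training is vastly more sophisticated, but the lossy, non-verbatim nature of the stored representation is the point.

```python
# Toy illustration only: text is reduced to a fixed-size numeric vector,
# so the original wording is discarded and cannot be reconstructed.
import hashlib

DIM = 16  # dimensionality of the toy "vector space"

def embed(text: str) -> list[float]:
    """Map text to a fixed-length vector by hashing its tokens."""
    vec = [0.0] * DIM
    for token in text.lower().split():
        h = int(hashlib.sha256(token.encode()).hexdigest(), 16)
        vec[h % DIM] += 1.0
    # Normalize so only relative word-usage patterns remain.
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [round(v / norm, 3) for v in vec]

passage = "The times they are a-changin'"
print(embed(passage))  # 16 numbers; the passage itself is gone
```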

This is fundamentally different from copying a book or song. It’s more like the long-standing artistic tradition of being influenced by others’ work. The law has always recognized that ideas themselves can’t be owned - only particular expressions of them.

Moreover, there’s precedent for this kind of use being considered “transformative” and thus fair use. The Google Books project, which scanned millions of books to build a searchable index, was held to be transformative fair use in Authors Guild v. Google (2d Cir. 2015), despite objections from authors and publishers. AI training is arguably even more transformative.

While it’s understandable that creators feel uneasy about this new technology, labeling it “theft” is both legally and technically inaccurate. We may need new ways to support and compensate creators in the AI age, but that doesn’t make the current use of copyrighted works for AI training illegal or unethical.

For those interested, this argument is nicely laid out by Damien Riehl in FLOSS Weekly episode 744. https://twit.tv/shows/floss-weekly/episodes/744

  • FatCrab@lemmy.one · 2 months ago

    Like I’ve said, you are arguing this into nuanced aspects of copyright law that are absolutely not basic, and I do not agree at all with your assessment of the initial reproduction of the image in a computer’s memory. To be clear, what you are arguing is that images on a website are licensed to the host to be reproduced for non-commercial purposes only, and that downstream access may likewise only be non-commercial (defined very broadly; there is a strong argument that commercial activity in this situation means direct commercial use of the reproduction: you wouldn’t say, for example, that a user who gets paid to look at images is commercially using the accessed images) or it violates the license. Even setting my parenthetical aside, there are contract law and copyright law problems with this.

    I’m typing with my thumbs and, honestly, not trying to write a legal brief in a random reply on lemmy, but the crux is that it is questionable whether you can enforce licensing terms that are presented to a licensee AFTER you enable, if not force, them to perform the act of copying your work. Effectively, you allowed them to make a copy of the work, and then you are trying to say “actually, you can only do x, y, and z with that particular copy.” This is also where exhaustion rears its head, once you add your position that a trained model switching from non-commercial to commercial deployment can retroactively recharacterize the initial use as unlicensed infringement. Logistically, it just doesn’t make sense either (what happens when a further downstream user commercializes the model? Does that percolate back to recharacterize the original use? What about downstream from that? How deep into a toolchain’s history do you have to go before this time-traveling theory becomes an egregious breach of exhaustion?), so I have a hard time accepting it.

    Now, in response to your question about my edit: my point was that the infringement happens at the further downstream reproduction of the image. When you print a unicorn on a t-shirt, it is that printing that infringes. The commercial aspect has no bearing on whether an infringement occurs; it is relevant to damages and to the fair use affirmative defense. The only question for whether infringement has occurred is whether a copy has been made in violation of the copyright.

    And all of this is just about whether there is even a copying at the model-training stage. It doesn’t get into the fairly challenging fair use analysis (going by SCOTUS’ reasoning on the copyrightability of APIs in Oracle v. Google, I actually think the fair use defense is very strong, but I also don’t think an infringement happens that would even necessitate such an analysis, so ymmv. Also, that decision was terrible, and literally every time SCOTUS has touched IP issues it has made the law wildly worse and more expensive and time-consuming to deal with). It doesn’t get into whether outputs that are very similar to existing works infringe the way music does, even though there is no actual copying (I think it highly likely that is an infringement). It doesn’t get into how outputs might infringe even though there are no IP rights in the outputs of a generative architecture (probably more of a weird academic issue, but I like it nonetheless). And likeness rights haven’t made their way into the discussion either (nor has the incredible weirdness of a class action that includes right of publicity among its claims).

    We can, and probably will, disagree on how IP law works here. That’s cool. I’m not trying to litigate it on lemmy. My point at this stage is just to show that it is not “basic copyright law bruh”. The copyright law, and really all the IP law, around generative AI techniques is complicated and nuanced. It’s totally reasonable to hold the position that our current IP laws do not address this the way most people seem to want them to. In fact, most other IP attorneys I’ve talked to who understand the technical processes at hand seem to agree. And, again, I don’t think that further assetizing intangibles into a “right to extract machine learning from” is a viable path forward in the mid and long run, nor one that benefits anyone but highly monied corporate actors.