vintagedave a day ago

Thing is, he’s right. We learn from all input and a great deal of it is copyrighted — every book or movie we’ve ever seen.

If I tell you something I learned from a copyrighted book, I am not doing something illegal.

If I produce a copy of someone else’s output and state it’s my own, I am. That’s plagiarism. And I think this is the best way to view AIs that create artworks “in the style of” etc — as plagiarism for output, not copyright violation for input.

I say this as the author of several things, including proprietary software (published, sold, even acquired!) and a novel I hope to publish soon: I increasingly think copyright is an evil. I like getting income from what I make, but I don’t believe in stopping people from accessing what I create. Using it to restrict learning absolutely is an evil. And an AI uses it to learn.

  • horsawlarway a day ago

    I also produce almost exclusively IP, and I agree with your take.

    Modern copyright is abusive and evil - we are already at the spot where large corporations entirely control the content, and pretending that copyright is somehow helping indie publishers is a pure fantasy.

    Copyright can exist in a way that I would not find so egregious (and historically - it was much closer to that form when originally proposed). But the modern IP rules are explicitly designed so that large corporations reap all the profit while they pay artists as little as possible.

    If AI cracks that apart.... eh - I don't see it as much of a loss.

    At least for now - plenty of AI models are open weight and publicly available for general use. As long as that trend continues, I don't see the problem.

    • loki-ai 16 hours ago

      So far, the best models are controlled by fewer than a dozen labs. Access is limited, full of restrictions and guardrails, and most are expensive, especially the video models. Open-weight models also deliver far inferior results, and hoping they'll catch up is, for now, just hope.

      What kind of breakthrough are you expecting here?

    • lofaszvanitt 21 hours ago

      The problem is, the average human doesn't have the means and the money to use them.

  • wyldfire 19 hours ago

    > But we don't get sued for our influences. Only for what we output.

    Like humans, the model has the capacity both for high-level abstractions that could be described as mere "influence" and for very specific, minute details that, when reproduced in art, are described as "infringement."

    While I agree that some of what the model learns could/should be described as Fair Use, creating tests for NN graphs to determine just how well abstracted things are is not the courts' strong suit. So, to be safe, they're likely to lean towards "just license the content."

    With billions of dollars invested in AI, I wonder if lobbying could influence copyright legislation to tip the copyright duration a little back towards sanity.

    I have a great deal of sympathy for artists who feel like their work is stolen. But I think there's an inevitable future where these tools are powerful and commonplace. So IMO we should focus on compensating artists for their work, especially so if they're used to train ML models.

  • loki-ai 16 hours ago

    I'd expect non-tech people who are clueless about how AI works to miss the gap between humans and machines, but it's baffling how many here fall for it too. Every discussion about AI and copyright spirals into these brain-dead examples of false equivalence.

    It's like saying a lone fisherman and a 50m industrial trawler both catch fish, so they should be bound by the same rules and regulations. It's absurdly reductive and pathetic not to notice this error.

  • hooverd a day ago

    Sure, but the desired outcome from AI companies here is that laws about piracy only bind the little people.

    • ronsor a day ago

      So? They're wrong too. I don't like copyright and I don't like ladder-pulling AI companies like OpenAI either; what I do like is new technology, as long as it's made open and available.

      It's very possible for neither party to be right and I wish people would acknowledge such situations more often.

  • lofaszvanitt 21 hours ago

    No, it is absolute bullshit to compare a specialised computer with a human. Sorry to say this, but JC is missing the point or just full of shit.

    Humans have constraints that are - at the moment - very hard to alter. An average human can or will read at best a few hundred books in their lifetime. You have to have some innate or acquired skill or taste to come up with something unique, to select which works you're going to spend your time on, so as to become better at what you do and elevate your work above others, or to offer something truly unique.

    AI is not a human; AI is billions of humans. Let that sink in, alright?

    No, copyright is not evil. Patents are not evil. Ideas are yours and you are an idiot if you give them away for free. These are your best friends if you know how to wield them.

    Just look at what happened to the internet after Google put its overweight arse on it. And Cloudflare is looming in too with their totally walled garden ecosystem.

    Copyright is evil... suuuure. Then what is an AI that expropriates the skills of humans who have put their lives into becoming the best at whatever they do, and redistributes them to those who have money and power, to the elite?

    • musicale 17 hours ago

      The current life + 70 years (or 95 years after publication for works made for hire) copyright regime is excessive, and it contrasts with the copyright clause in the Constitution (limited times, promoting progress).

kelseyfrog a day ago

James is on the board[1] of stability.ai, so not exactly an unbiased opinion.

1. https://stability.ai/board-of-directors

  • cthalupa a day ago

    He also has significant financial incentive from his career to protect copyright, so his stance, and his presence on the board in spite of that, perhaps speaks even more to the authenticity of his belief.

  • knowitnone a day ago

    Why the downvotes for a simple fact with evidence?

charlie-83 a day ago

I think the issue with using these models is something he touches on. As big-hollywood-company-xyz, you can ask the writer what the influences on the work were and determine whether anything crosses a legal line. You can have a level of trust in what the writer says, since their reputation/career is on the line.

With the AI model, it can regurgitate something from the internet word for word and it's on you to check it.

There's a lot of work being done to make AI more transparent but it seems like that has a way to go.

Or you realise that the models are all trained on pirated data anyway and no one cared, so if you're a big enough company you can just do whatever you want.

regularjack a day ago

This whole AI is like humans thing is so exhausting. It's obviously not the same thing. A human can't ingest all of the library of Alexandria, and then generate dollars for their employer regurgitating those books.

  • vintagedave 21 hours ago

    Thing is, we do. As a kid I devoured our encyclopedia, dictionary, and many random books of non-fiction on my parents’ shelf. I did the best I could to devour the library.

    I see AIs today who have done that. I am sympathetic to them because I wish I could have been them and achieved what they have. Don’t demean human learning just because we have systems that can learn like us.

    • musicale 17 hours ago

      > I did as best I could to devour the library.

      You'd have to devour the library of congress, and every public scientific paper, and most of the text on the internet. Likely much more than you could read in a lifetime.

      Then you'd have to be able to regenerate a realistic imitation of any book, on demand, in a minute, for a few cents.

  • knowitnone a day ago

    But a human can ingest some of the library of Alexandria and then generate dollars for their employer by regurgitating those books. Are you just mad at the scale at which it is done?

BriggyDwiggs42 15 hours ago

I hate this comparison because it ignores the obvious important difference: humans aren’t owned by somebody. When models aren’t used commercially, they should be compared to people for copyright purposes. It’s when their outputs are sold for profit that copyright should be applied to the seller of the outputs, e.g. OpenAI.

hooverd a day ago

In other words, "big companies should have a special-boy copyright exception". I expect both the slopification of everything and the enforcement of copyright against individuals to continue.

  • linksnapzz a day ago

    An entirely unsurprising take from Cameron, given that he has never considered his theft from Harlan Ellison, Roger Dean, or Joan Vinge to be ethically fraught.

    • linksnapzz a day ago

      Keep squealing, downvoters. Cameron's a technically excellent hack who only cares about special effects, which is why his films have maximal revenue and minimal cultural heft.

      • robinson7d a day ago

        I’m not a huge fan of his, and maybe you’re only talking about more recent films, but Terminator and Titanic have more than “minimal cultural heft” as far as Hollywood movies go.

        • linksnapzz a day ago

          The problem with Terminator is that the plot was lifted (per Cameron's own admission!) from an Outer Limits script; beyond that, the most significant visual choice was casting Arnold... which Dino De Laurentiis might have done anyway.

        • danielbln a day ago

          Terminator 2 and Titanic, and I would add Aliens to that list.