Hacker News

I don't really care how they like it because it's not up to them how I use the tools I want to use. It's literally the same argument photographers faced 100 years ago and in another 100 years I guarantee no one will be talking about AI in the terms you are today.


No one started photographing paintings and declaring them free to use. If they did the lawsuits would leave a huge impact crater.

Photography started displacing painting as a form of portraiture, but displacing a technique is not the same thing as appropriating the work itself.


I don't see any issues with "appropriating" a work, especially if it's not a one-to-one copy, which AI does not produce (without some pretzel-level prompting), and especially with regard to visual media (what even is appropriation in this case? Your example of photographers taking images of paintings is not the same as how AI training occurs). In other words, training is and should be free and fair use.


> training is and should be free and fair use.

Of course the AI robber barons would have it be so, but it must not be and should not be.

Training gobbles up works in their entirety, verbatim.

Fair use of the verbatim words of a written work requires the excerpt to be small.

Fair use also usually requires attribution, which is missing.

Transformative works like parodies are also fair use, but the LLM isn't transformative in this sense; it's "transformative" only in the way a meat grinder is.

Parodies use the structure of something existing as a vehicle for original thought, which is why they are protected from copyright claims by the authors of whatever is parodied.


Again, IP is an outdated concept in this day and age. In all honesty there shouldn't even be a notion of fair use; any transformative work should be allowed. There is nothing about LLM training that isn't transformative, just as grinding the meat from a steak into stuffed sausages transforms it.

I'm not even talking about big corporations with proprietary models; in fact I oppose their not being open source or open weight. I want more open models, not fewer, as that at least democratizes the value of LLMs. The worst case is copyright hawks enabling regulatory capture by big AI corps by pushing regulations about licensing content, which, of course, no open-model company will be able to afford in the future. I find that infinitely worse than having more lax copyright laws: a world where only a few corporations can tell you what to think via their LLMs.

Lastly, no one can tell me from first principles why LLM training is bad on the copyright side, other than "it just is, because copyright law dictates so." Perhaps copyright law is what needs to be abolished, not LLMs.


"Transformative" has a specific meaning under the fair use doctrine. You can't just Rot13 or gzip someone's novel and call that transformative.

> Perhaps copyright law is what needs to be abolished, not LLMs.

Sure, now that it's inconvenient for some billionaires, who themselves need no such protection, because everything they offer is a service the user can only access over the network, behind a subscription.


I'm talking about the concept of transformation, not the specific legal language, which, again, I said is not worth discussing, because the legal concept of intellectual property is not useful.

No, not just now; since forever. I suppose this is what "Stallman was right all along" is about. And just to be clear, I'm not a supporter of the current closed-source AI companies; like I said, I want to see open models succeed.

As I asked above, it really does look like no one can explain why LLM training is bad, besides saying it's bad. Therefore I will continue to reject IP as a concept.


Obviously, since you reject IP, presumably you would be okay with copying and pasting code out of some GNU program into your own program, without attribution, and then, if you felt like it, releasing that program under the least restrictive terms possible (as close to the public domain as you could practically get away with).

So discussions revolving around doing the same thing less directly, through training a model, just add distracting details that don't matter.

If everyone did that (due to there not being any rules against that), then fewer people would write programs under free licenses. Many such developers are volunteers, whose only payment is that the work product is theirs to license how they want.

Having that taken away from us is discouraging.

We haven't done anything to deserve such a "fuck you".


Even today, in 2026, it is possible to use photography in ways that infringe copyright! You literally cannot just snap your shutter over anything whatsoever and call it yours!



