> and I explicitly do not want it used to train AI in any fashion

Then don't release it. No license can prevent your code from becoming training data, even under the naive assumption that someone collecting training data would care about the license at all.


Do you define "AI slop" as "easily identifiable as AI"?

> What's the end game?

What makes you think that there is an "end game"?

Someone figured out how to make computers generate content that is costly to distinguish from human-made content. Someone else is using it to pump out AI slop in the hopes that it will make them a quick buck. The platform becomes unusable for anyone who values their sanity. No "end game" to be found.

AI will be the worst thing to happen to society in a very long time.


IDK about the worst thing.

To flip it around: AI can help a doctor find a cure. The other day I used AI to translate for a non-English speaker and help them find the right bus.

AI is just a tool. It's up to us to determine how it's used.

I know I'm not using any service or product that's so lazy as to use AI slop to advertise. Especially if it's clearly AI, with jumbled text.


> Unfortunately as I see it, even if you want to contribute to open source out of a pure passion or enjoyment, they don't respect the licenses that are consumed.

Because it is "transformative" and therefore "fair" use.


Running things through lossy compression is transformative?

The quotation marks indicate that _I_ don't think it is, especially given that modern deep learning is over-parameterized to the point that it interpolates training examples.
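
To make the over-parameterization point concrete, here's a minimal numpy sketch (my own illustration, not from the thread): a linear model with ten times more weights than training examples drives training error to zero even on pure-noise labels, i.e. it interpolates the training set. The sizes (50 samples, 500 parameters) are arbitrary.

    # Hypothetical illustration: an over-parameterized model can fit
    # even random labels exactly (the "interpolation" regime).
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_params = 50, 500              # 10x more weights than data points (arbitrary sizes)
    X = rng.normal(size=(n_samples, n_params))
    y = rng.normal(size=n_samples)             # pure-noise labels: nothing real to learn

    w, *_ = np.linalg.lstsq(X, y, rcond=None)  # minimum-norm interpolating solution
    print(np.max(np.abs(X @ w - y)))           # ~1e-13: the training set is memorized exactly

Deep networks with enough capacity behave analogously: they can drive training loss to zero on memorized examples, which is what the interpolation claim above refers to.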

Fair use is an exception to copyright, but a license agreement can go far beyond copyright protections. There is no fair use exception to breach of contract.

I imagine a license agreement would only apply to using the software, not merely reading the code (which is what AI training claims to do under fair use).

As an analogy, you can’t enforce a “license” that anyone that opens your GitHub repo and looks at any .cpp file owes you $1,000,000.


This, to me, is the most ridiculous thing about the whole AI situation. Piracy is now apparently just okay as long as you do it on an industrial scale and with the express intention of hurting the economic prospects of the authors of the pirated work.

Seems completely ridiculous when compared to the trouble I was in that one time I pirated a single book that I was unable to purchase.


In recent years we've essentially given up on pretending that corporations are held accountable for their crimes, and I think that's more worrying than anything.

Hollywood and media publishers run entire franchises of legal bullies across the developed world to harass individuals, and lobby for laws that make it easy to prosecute whoever holds the ISP contract. Even Google Books was castrated because of IP rights. Now I have a hard time imagining how this IP+AI cartel operates. Nowadays everyone and their cat throws millions at AI, so I imagine IP owners get their share.

Recently archive.org got into trouble for lending out one copy of a book (or a fixed number of copies) at a time to the whole world, like a library. Sad men from a law office came and made an example of them, but it seems that if they had used those books to train an AI and served the content in "remembered" form, they would have gotten away with it.

> Seems completely ridiculous when compared to the trouble I was in that one time I pirated a single book that I was unable to purchase.

How would one manage to get in trouble for pirating a book? Unless you mean with your employer for doing it on their network or something?


Well, the actual ruling was that use of the books was okay, but only if they were legally obtained. So the authors could proceed with a lawsuit over the illegal downloading of the books, and presumably compensation for torrenting them was included in the out-of-court settlement. The lesson is something like: AI is fine, but torrenting books is still not acceptable, m'kay, wink wink.

But the entire promise of AI is that things that were expensive because they required human labor are now cheap.

So if good things happening more because AI made them cheap is an advantage of AI, then bad things happening more because AI made them cheap is a disadvantage of AI.


Well, yes? Have you played the game? It's got a lot of content and is very pretty.

Those high-resolution textures simply take up space. You could obviously decrease the graphical fidelity, but I'd guess that most players (me very much included) would rather play a very pretty 23GB Helldivers II than an ugly 5GB one.

150GB was very annoying, ironically forcing me to install it to an HDD. 23GB isn't even worth thinking about for me.


You assume that detecting AI content is trivial. It isn't.


> I guess just don’t go there.

How do you know? A lot of the stuff I see online could very much be produced by LLMs without me ever knowing. And given the economics I suspect that some of it already is.


Writing nice-sounding text used to require effort and attention to detail. This is no longer the case; that very useful heuristic has been completely obliterated by LLMs.

For me personally, this means that I read less on the internet and more pre-LLM books. It's a sad development nevertheless.

