The point is that from now on, there will be nothing really new, nothing really original, nothing really exciting. Just an endless stream of rehashed old stuff that is merely okayish.
Like an AI Spotify playlist, it will keep you in chains (aka engaged) without actually making you really happy or good. It would be like living in a virtual world, but without having anything nice about living in such a world.
We have given up everything nice that human beings used to make and give to each other, and to make it worse, we have also multiplied everything bad that human beings used to give each other.
Each solvable problem contains its solution intrinsically, so to speak; it's only a matter of time and consumption of resources to get to it. There's nothing creative about it, which I think is what OP was alluding to (the creative part). I'm talking mostly about mathematics.
There's also a discussion to be had about maths not being intrinsically creative if AI automatons can "solve" parts of it. It pains me to write that down, because I had really thought that wasn't the case; I genuinely believed that deep down there was still something ethereal about maths. But I'll leave that discussion for some other time.
Because of economics. Look at Marvel movies: do you think the latest one is really new, or just a rehash of what they found works commercially? Look at all the AI-generated blog posts that are flooding the internet.
LLMs might produce something new once in a long while through blind luck, but if they can generate something that pushes the right buttons (aka not really creative) for the majority of the population, then that is what we will keep getting.
I don't think I have to elaborate on the "multiplying the bad" part, as it is pretty well acknowledged.
I think there's demonstrably very little difference at all between human and AI outputs, and that's exactly what freaks people out about it. Else they wouldn't be so obsessed with trying to find and define what makes it different.
The thesis of Everything Is a Remix is that there is no difference in how any culture is produced. Different models will have a different flavor to their output, in the same way that different people contribute their own experiences to a work.
> I think there's demonstrably very little difference at all between human and AI outputs
Bold claim, as the internet is awash with counterexamples.
In any case, as I think this conversation is trending towards theories of artistic expression, “AI content” will never be truly relatable until it can feel pleasure, pain, and other human urges. The first thing I often think about when I critically assess a piece of art, like music, is what the artist must have been feeling when they created it, and what prompted them to feel that way. I often wonder if AI influencers have ever critically assessed art, or if they actually don’t understand it because of a lack of empathy or something.
And relatability, for me, is the ultimate value of artistic expression.
> In any case, as I think this conversation is trending towards theories of artistic expression, “AI content” will never be truly relatable until it can feel pleasure, pain, and other human urges. The first thing I often think about when I critically assess a piece of art, like music, is what the artist must have been feeling when they created it, and what prompted them to feel that way.
I recently watched "Come See Me in the Good Light", about the life and death of poet Andrea Gibson. I find their poetry very moving, precisely because it's dripping with human emotion.
Or at least, that's the story I tell myself. The reality is that I perceive it to be written by a human full of emotion. If I were to find out it was AI, I would immediately lose interest, but I think we're already at the point where AI output is indistinguishable from human output in many cases, and if I perceive art to be imbued with human emotion, the actuality of it only matters in terms of how it shapes my perception of it.
I'm not really sure where we'll go from here. Maybe art will remain human-created only, and we'll demand some kind of proof of its provenance, that it was born of a human mind and a human heart. Or maybe younger generations will only care about how art makes them feel, not what kind of intelligent entity made it. I really don't know.
> Bold claim, as the internet is awash with counterexamples.
What do you consider a counterexample? Because I've been involved in local politics lately, and can say from experience that any foundation model is capable of more rational and detailed thought, and more creative expression, than most of the beloved members of my community.
If you're comparing AI to the pinnacle of human achievement, as another commenter did in pointing to Shakespeare, then I think the argument is already won in favor of AI.
> I think there's demonstrably very little difference at all between human and AI outputs
Counterexamples range from em-dashes and “not this, but that” constructions to people complaining about AI music on Spotify (myself included) that sounds vaguely like a genre but is missing all of the instrumentation and motifs common to that genre.
The rest of your comment I don’t even know how to respond to, to be honest.
You're really going to claim that there are no counterexamples on the internet where human and AI output are distinguishable? At least make the counterclaim that "those are from old models, not the newest ones"; that would be more intellectually invigorating than the comment you just provided.
> claim that there are no counterexamples on the internet where human and AI output are distinguishable?
Is that a claim I've made? I don't see it anywhere. I think a lot of people think that because they can get the AI to generate something silly or obviously incorrect, that invalidates other output which is on-par with top-level humans. It does not. Every human holds silly misconceptions as well. Brain farts. Fat fingers. Great lists of cognitive biases and logical fallacies. We all make mistakes.
It seems to me that symbolic thinking necessitates the use of somewhat lossy abstractions in place of the real thing, primarily limited by the information which can be usefully stored in the brain compared to the informational complexity of the systems being symbolized. Which neatly explains one cognitive pathology that humans and LLMs share. I think there are most certainly others. And I think all the humans I know and all the LLMs I've interacted with exist on a multidimensional continuum of intelligence with significant overlap.
I hereby rebuff your crude and libelous mischaracterization of my assertion. How's that? :)
You said AI works were easily distinguishable via em-dashes and "not this, but that".
I said I have witnessed humans using that very metric to accuse other humans here on Hacker News. Q.E.D.
You've asserted that they are easily distinguished. Practitioners in the field fail to distinguish using the same criteria. Is that not dispositive? Seems like it to me.
I claimed much earlier in the thread "I think there's demonstrably very little difference at all between human and AI outputs" which is consistent with "I think all the humans I know and all the LLMs I've interacted with exist on a multidimensional continuum of intelligence with significant overlap."
Two ways of saying the same thing.
Both of them suggesting that sometimes you may be able to tell it's the output of an AI or Human, sometimes not. Sometimes the things coming out of the AI or the Human might be smart in a way we recognize, sometimes not. And recognizing that humans already exist on quite a broad scale of intelligences in many axes.
I was not saying that LLMs cannot produce something like the pinnacle of human achievement. I was saying we cannot quantify the difference between Shakespeare and something commonplace, because doing so requires the ability to feel.
> demonstrably very little difference at all between human and AI outputs
Is there "demonstrably" a lot of difference between Shakespeare and an HN comment?
The point is exactly that there is no such difference, and that this enables slop to be sold as art. That is exactly the danger. But another point is that we had this even before LLMs; LLMs just make it more explicit and make it possible at scale.
Conrad Gessner had the very same complaint in the 16th century, noting the overabundance of printed books, fretting about shoddy, trivial, or error-filled works ( https://www.jstor.org/stable/26560192 )
AI can both explore new things and exploit existing things. Nothing forces it to only rehash old stuff.
>without actually making you really happy or good.
What are you basing this on? I've shared several AI songs with people in real life because of how much I've enjoyed them. I don't see why an AI playlist couldn't be good or make people happy; it just needs to find what you like in music. Again, it comes back to explore vs. exploit.
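The explore-vs-exploit framing mentioned here is the classic multi-armed-bandit trade-off. A minimal epsilon-greedy sketch of a playlist picker, purely illustrative (the track names, scores, and `pick_track` function are hypothetical, not any real service's API):

```python
import random

def pick_track(tracks, scores, epsilon=0.1):
    """Epsilon-greedy selection: with probability epsilon, explore a
    random track; otherwise exploit the highest-scoring one."""
    if random.random() < epsilon:
        return random.choice(tracks)  # explore: try something new
    return max(tracks, key=lambda t: scores[t])  # exploit: play what works

# Hypothetical listening history: average enjoyment per track.
scores = {"song_a": 0.9, "song_b": 0.4, "song_c": 0.7}
tracks = list(scores)

choice = pick_track(tracks, scores, epsilon=0.0)  # pure exploitation -> "song_a"
```

A playlist that only exploits (epsilon near 0) keeps serving variations of what already worked, which is the "rehash" failure mode described upthread; a nonzero epsilon is what lets it surface something genuinely new.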
I've found several posts on moltbook funny. I don't really like regular jokes in general, and I don't find human ones particularly funny either. I don't think LLMs are at the point of being reliably funny, but it definitely seems possible from my perspective.
Yesterday it was "LLMs can't count the R's in 'strawberry'." Today it's "LLMs can't tell jokes." Tomorrow it might be "LLMs can't do X," all while LLMs get better and better at every objection/challenge posed.
The problem as I see it is that you have a fundamental objection to categorizing the way LLMs do their work as in any way related to "real gosh-darn human thinking". Which I think is wrong. At the root, we are just information-processing meat that happens to have had millions of years to optimize for speed, pattern recognition, feedback, etc.
Is it because the AI is trained on existing data? But we are also trained on existing data. Do you think there's something that makes the human brain special (other than the hundreds of thousands of years of evolution, which is what AI is trying to emulate)?
This may sound hostile (sorry for my lower-than-average writing skills), but trust me, I'm really trying to understand.