
Or they recognised the fundamental problem, which is that LLMs will kill the motivation for people to upload new information to feed the LLMs with.

If websites can't earn a living through ads because LLMs don't send them any traffic anymore, where do the LLMs ingest new information from?

The web continues to deteriorate because of this and nobody has solved this problem.



I’m pretty sure they were just terrified of the PR impact of another “humans classified as gorillas” type incident.


And they ended up with a "put Elmer's glue on pizza" type incident.

AI is messy


I agree that they recognized the fundamental problem, but I have a different speculation as to what that problem is: it must have looked technologically too boring and obviously unprofitable to them.

It's the culmination of decades of NLP research, but it's also just East Asian-style predictive text adapted for variable-word-length languages (sketched below). It must have been clear to the core people that it's fundamentally a sluggish demo, or at most a cheap, outright-sold commodity product.

IMO they would be right, unless someone finds a reason that LLMs can never be self-hosted in the way Google Search can't be. They must have simply seen it for what it is: an LLM is at best a regularization engine for that magic box.
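
For what it's worth, the "predictive text" framing does match how generation works mechanically: condition on the tokens so far, sample the next token, append, repeat. The snippet below is a minimal sketch of that decoding loop over a hypothetical hand-written bigram table (my illustration, nothing from any real model), just to make the mechanism concrete.

    # Toy next-token decoding loop over a hand-written bigram table --
    # a hypothetical stand-in for "predictive text" style generation.
    import random

    bigram_probs = {
        "the": {"cat": 0.5, "dog": 0.5},
        "cat": {"sat": 0.7, "ran": 0.3},
        "dog": {"ran": 0.6, "sat": 0.4},
        "sat": {"down": 1.0},
        "ran": {"away": 1.0},
    }

    def generate(start, max_tokens=5):
        tokens = [start]
        for _ in range(max_tokens):
            options = bigram_probs.get(tokens[-1])
            if not options:
                break
            # Sample the next token from the conditional distribution.
            next_tok = random.choices(list(options), weights=list(options.values()))[0]
            tokens.append(next_tok)
        return " ".join(tokens)

    print(generate("the"))   # e.g. "the cat sat down"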


It’s fundamental, IMO.

LLMs have to have something ‘true’ (or at least statistically probable) to validate against for training.

Taking the output from it and feeding it back into itself is, essentially, an Ouroboros. Or like locking a human in solitary.

The more/longer it happens, the more deranged it’s going to get.
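
To make that feedback loop concrete, here's a minimal toy sketch (mine, not from the comment): repeatedly fit a Gaussian to samples drawn from the previous generation's fit. The Gaussian is an assumed stand-in for "the model"; the point is only that re-training on your own finite output lets error compound over generations.

    # Toy "Ouroboros" loop: fit a Gaussian to samples drawn from the
    # previous generation's fit. Finite-sample error compounds, so the
    # estimate drifts and the spread tends to shrink over generations --
    # a crude analogue of model collapse when training on model output.
    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 0.0, 1.0      # generation 0: the "real" data distribution
    n = 200                   # finite sample budget per generation

    for gen in range(20):
        samples = rng.normal(mu, sigma, n)         # "train" on current source
        mu, sigma = samples.mean(), samples.std()  # refit; next round samples from this
        print(f"gen {gen:2d}: mu={mu:+.3f} sigma={sigma:.3f}")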




