
The real innovation is that neural networks are generalized learning machines; LLMs are just neural networks applied to human language. The combination of world models and LLMs will take them further still.


The neural net was invented in the 1940s, and statistical language models date back to the 1950s. It's 2025 and we're still using 80-year-old architecture. Call me cynical, but I don't understand how we're going to avoid the physical limitations of GPUs and of the data we train AIs on. We've pretty much exhausted the latter, and the former is going to hit sooner rather than later. We'll be left at that point with an approach that hasn't changed much since WW2, and any further scaling will run straight into hard physical limits.

Even in 2002, my CS profs were talking about how AGI was a long way off, because we had been trying for decades to innovate beyond neural nets and language models, and nothing better had been created despite some of the smartest people on the planet trying.


They didn't have the compute or the data to make use of NNs. But theoretically NNs made sense even back then, and many people thought they could give rise to intelligent machines. They were probably right, and it's a shame they didn't live to see what's happening right now.


> They didn't have the compute or the data to make use of NNs

The compute and data are both limitations of NNs.

We've already gotten really close to the data limit: we aren't generating enough useful content as a species, and the existing stuff has all been slurped up.

The laws of physics restrict the compute side, just as they do with CPUs. Eventually you simply cannot pack heat-generating components any closer together: they interfere with each other, and miniaturization runs into hard physical limits.
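As a rough back-of-envelope illustration of that thermal wall (all wattages and areas below are assumed round numbers for illustration, not specs of any real chip):

```python
# Back-of-envelope power density: why heat, not transistor count,
# increasingly sets the ceiling. All figures are illustrative assumptions.

def power_density_w_per_cm2(watts: float, area_cm2: float) -> float:
    """Watts dissipated per square centimeter of surface."""
    return watts / area_cm2

# A hypothetical ~350 W accelerator die of ~8 cm^2 ...
gpu_density = power_density_w_per_cm2(350.0, 8.0)
# ... versus a ~1000 W kitchen hot plate spread over ~180 cm^2.
hotplate_density = power_density_w_per_cm2(1000.0, 180.0)

print(f"GPU die: {gpu_density:.1f} W/cm^2, hot plate: {hotplate_density:.1f} W/cm^2")
```

Even with these made-up but plausible numbers, the die dissipates heat far more densely than a hot plate, and packing more compute into the same area only pushes that density higher.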

No, AGI will require new architectures of a kind no one has come up with in nearly a century.


We have evidence that general intelligence can be produced by a bunch of biological neurons in the brain, and modern computers can process comparable amounts of data, so it's a matter of figuring out how to wire it all up, as it were.


Despite being their namesake, biological neurons operate quite differently from artificial neural nets. I believe we have yet to successfully model even the nervous system of the nematode C. elegans, with its paltry 302 neurons.
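To make the contrast concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models (the constants are illustrative, not measured biophysics). Unlike a standard artificial-NN unit, which is a stateless weighted sum pushed through a smooth activation, this model carries membrane state through time and communicates in discrete spikes:

```python
def lif_spikes(currents, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron (toy constants).

    The membrane potential decays by `leak` each step, accumulates the
    input current, and emits a spike (then resets to zero) when it
    crosses `threshold`. Returns a 0/1 spike train, one entry per step.
    """
    v = 0.0
    spikes = []
    for i in currents:
        v = leak * v + i          # leaky integration of input
        if v >= threshold:
            spikes.append(1)      # spike ...
            v = 0.0               # ... and reset the membrane
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold input only fires after charge accumulates:
print(lif_spikes([0.5] * 5))
```

Real biological neurons add dendritic structure, neuromodulators, and adaptation on top of even this, which is part of why 302 nematode neurons remain hard to model faithfully.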


Dude, who cares about data and compute limits? Those can be solved with human ingenuity. The hard problem of creating a generalized learning algorithm has been solved. A digital god has been summoned.



