
> Only in a very anthropocentric sense.

yes, that is the sense in which we are discussing intelligence in order to debate whether the human brain and LLMs operate on similar phenomena



The fact that both humans and LLMs can reason abstractly is an uninteresting fact if we define “abstract reasoning” to be exactly what humans do, and then create models with the goal of recreating exactly that. That is then simply a statement that the model is accurate, and the word “intelligence” is there only to confuse.

This would be like finding a flower which produces a unique fragrance, then creating a perfume which approximates that fragrance, and then concluding that since these are the only two things in the universe which can create this fragrance, there must be something special about the perfume.


I would define abstract reasoning as composing and manipulating a model of reality, or of some other complex system, in order to make predictions

> is an uninteresting fact if we define “abstract reasoning” to be exactly what humans do, and then create models with the goal of recreating exactly that

if you find this uninteresting, then we perhaps have irreconcilably different views of things


Your definition excludes language models, since they are in and of themselves just a model which interpolates from data (i.e. makes predictions). But your definition also includes lots of other systems: most mammalian brains construct some kind of model of reality in order to make predictions, and we have no idea whether other systems (such as fungal networks or ant colonies) do the same.

I’m not saying these language models (or my hypothetical perfume) aren’t an amazing feat of technology, but neither has any deep philosophical implications about shared properties beyond the ones they were constructed to share. Meaning: even if LLMs and humans are the only two things in the universe that can reason abstractly in the way humans do, that doesn’t mean humans share any other properties with LLMs.



