The parts of our brain that work like an LLM really do work like an LLM: they hallucinate, generate falsehoods, are over-confident, and deal strictly with an abstract world of discrete symbols. That part also vastly overestimates its own intelligence, underestimates how much it needs the rest of the brain's reality checking and other kinds of intelligence to turn its output into something useful, and conflates what it does with intelligence itself.
It's basically a super-useful co-processor in the brain hardware, but it is a co-processor and not capable of independent operation.
Yes. When I read comments such as “humans just predict the next word too,” I wonder if those commenters have ever stopped and observed their own thought processes before.