To me it seems like LLMs are basically external memory for humanity as a whole. By interfacing with them, you can extract the knowledge on demand, eliminating the need to remember things yourself.
This has been the case for a while with search engines. I'm convinced our brains have adapted (atrophied?) to avoid having to remember things that we can simply look up on our phones in a matter of seconds.