> Are LLMs wrong sometimes? Yes. But so are humans, so it's not really a novel thing to point out.
The thing with humans is that you can build trust. I know exactly who to ask if I have a question about music, or medicine, or a myriad of other topics. I know those people will know the answers and be able to assess their level of confidence in them. If they don’t know, they can figure it out. If they are mistaken, they’ll come back and correct themselves without me having to do anything.
Comparing LLMs to random humans is the wrong methodology.
> This is no different than another type of combustion engine that was created that has negative consequences on the environment in different ways.
Combustion engines don’t make it easy to spy on people, lie to them, or undermine trust in democracy.