
Exactly. And the person following the Chinese program also knows they don't speak Chinese and that they aren't understanding anything.

I don't find the Chinese Room argument very compelling because it leans on too many unexamined "it's obvious that X can't really understand" claims.

Also, you can't conclude from it that consciousness couldn't be computed in some other form.



That feels off. It's like me saying I don't know English; I merely know the correct algorithm for giving the correct responses to whatever people give me as input.


There is supposedly a (semantic) process in your brain that makes you believe you understand the sentences you are reading and writing, sitting on top of the (symbolic) process that tells you what to say and how to say it. And that's the crux of the issue. Searle argues that symbolic computation cannot produce understanding at the semantic level.
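
To make the "purely symbolic" side concrete, here's a rough sketch of my own (nothing from Searle's paper, and the phrases are just made-up examples): the room can be modeled as a lookup from input symbol strings to output symbol strings, with nothing inside that represents what any symbol means.

    # Minimal, illustrative sketch: a "Chinese Room" as pure symbol-to-symbol lookup.
    # The rulebook maps input strings to output strings; no part of the program
    # encodes the meaning of any symbol.
    RULEBOOK = {
        "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
        "今天天气怎么样？": "今天天气很好。",  # "How's the weather?" -> "The weather is nice."
    }

    def room(symbols: str) -> str:
        # The "person in the room" only matches shapes against the rulebook.
        return RULEBOOK.get(symbols, "对不起，我不明白。")  # fallback: "Sorry, I don't understand."

    print(room("你好吗？"))  # fluent-looking output, zero semantics inside

Whether a vastly bigger version of this could ever amount to understanding is exactly what the argument is about.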



