
Embodiment (virtual or robotic) will be the key to high-utility AGI. The lack of embodiment hinders the formation of a model of the self, which in turn prevents entire classes of causal reasoning. Without causality, it is very difficult to link ideas to the real world beyond creative regurgitation.


Counterpoint: humans have embodiment and are famously terrible at causality (well... "famously" might be an overstatement, since this doesn't seem to be widely known).


Heh, I’d say we’re pretty good at it. I mean like predicting that a vase will fall if you release it in mid air and things like that.

It's precisely the type of understanding that is so obvious it won't be found in scrapes like the Common Crawl, but that isn't obvious at all unless you either have a very high-level understanding of physics and are modelling everything… or you are a creature that has existed in the world for a while.


> Heh, I’d say we’re pretty good at it. I mean like predicting that a vase will fall if you release it in mid air and things like that.

Now do metaphysics, just one component of which is:

https://plato.stanford.edu/entries/causation-counterfactual/

PS: this is not currently done in our culture (in the same ways, and with the same standards and desire for quality, with which physics is done), so if you're basing your implementation on prior examples, it will be incorrect. This is not to say that it will be wrong, but it will be incorrect; the distinction between these two seemingly synonymous terms lies within culture.


A simple causal graph of "release -> observe drop" is not what Pearl is referring to when he talks about causality. He's talking about more complicated causal graphs where some hidden variables affected both the release and the fact that the vase was observed to drop, which can require careful experimental setup to figure out. "Release -> observe drop" with no other variables is something an associative model can learn very easily.
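To make the distinction concrete, here is a minimal simulation sketch (all the numbers and the "shelf wobble" story are hypothetical, chosen just to illustrate confounding): a hidden variable Z makes a release more likely and can also knock the vase down on its own. A purely associative model, estimating P(drop | release=0) from passive observation, gets a different answer than the interventional quantity P(drop | do(release=0)) that Pearl's do-calculus targets.

```python
import random

random.seed(0)

# Toy structural model (hypothetical numbers): hidden confounder Z
# (say, a shelf wobble) raises the chance of a release AND can drop
# the vase by itself, with no release at all.
def sample(release=None):
    z = random.random() < 0.5                       # hidden confounder
    if release is None:                             # observational regime
        release = random.random() < (0.8 if z else 0.2)
    drop = release or (z and random.random() < 0.9)
    return release, drop

N = 100_000

# Associative: P(drop | release=0), estimated from passive observation.
# Seeing "no release" is evidence that Z is probably off too.
obs = [drop for release, drop in (sample() for _ in range(N)) if not release]
p_assoc = sum(obs) / len(obs)

# Interventional: P(drop | do(release=0)). We force the release off
# while Z keeps its natural distribution, so Z-caused drops remain.
p_do = sum(drop for _, drop in (sample(release=False) for _ in range(N))) / N

print(f"P(drop | release=0)     ~ {p_assoc:.2f}")   # roughly 0.18
print(f"P(drop | do(release=0)) ~ {p_do:.2f}")      # roughly 0.45
```

The gap between the two estimates is exactly what careful experimental setup (or an actual intervention on the world) is needed to uncover; no amount of conditioning on the observed data alone recovers the do-quantity here.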


Right, but it won't learn the thousands of such rules, which in aggregate add up to "common sense" about the world, if it has no access to them, i.e., if it's disembodied.


AI is now "in the wild" though.




