
I'm not optimistic about this in the short term. Creative and diverse viewpoints seem to come from diverse life experiences, which AI does not have; and even when such experiences are present in the training data, they are mostly washed out. Statistical models are like that. The objective function is to predict close to the average output, after all.

In the long term I am at least certain that AI can emulate anything humans do en masse, where there is training data, but without unguided self-evolution, I don't see them solving truly novel problems. They still fail to write coherent code if you go a little outside the training distribution, in my experience, and that is a pretty easy domain, all things considered.



The vast majority of advances seem to be of the form "do X for Y", where neither X nor Y is novel but the combination is. I have no idea whether AI is going to be better than humans at this, but it seems like it could be.




