Maybe this boils down to a split between people who think AI is on an exponential (self-improving) curve, materially unbounded by physical resources, and people who think it's on a series of sigmoid curves with real physical constraints.

If someone assumes AI will become significantly more capable than humans at reasoning through complexity, then I can empathize with their opinion. I was previously open to this possibility, but in recent years, the better AI gets, the clearer it is to me that progress will take a lot longer, and the super-AGI outcome is much harder to see.

I'm sure that by the time it could possibly be a feasible and positive option, people will be plenty ready for it, so there's no need to prepare prematurely.

TL;DR: I agree with you, just without the expletives.
