Hacker News

Maximize human population divided by the time integral of all human suffering (taken from now to the heat death of the universe).
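One reading of that proposal as an objective function, with N the human population, S(t) the total human suffering at time t, and t_heat the time of heat death (all symbols are my labels, not the commenter's):

```latex
\max \; \frac{N}{\int_{t_0}^{t_{\mathrm{heat}}} S(t)\, dt}
```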


This kind of stuff is pretty tricky. If you only account for average human suffering, not only do you fail to account for happiness, but you fall into the trap of concluding that it's best to kill everyone who is suffering.
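A toy illustration of that trap, with hypothetical suffering values chosen for the example: an objective that only tracks average suffering scores "removing the worst-off person" as an improvement.

```python
# Hypothetical per-person suffering levels (illustrative numbers only).
suffering = [0.1, 0.2, 9.0]

def average_suffering(population):
    """Mean suffering across the population."""
    return sum(population) / len(population)

before = average_suffering(suffering)      # average over all three people
after = average_suffering(suffering[:2])   # after "eliminating" the worst-off person

# The metric improves (goes down), even though no one's life got better:
# the optimizer is rewarded for deleting sufferers, not for reducing suffering.
assert after < before
```

This is why the parent comment's objective divides by the *integral of total* suffering rather than minimizing the *average*: the degenerate optimum changes depending on exactly which quantity you optimize.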


Congratulations, your AI wants to convert all matter in the universe into lobotomized humans.


That's suboptimal; a fully realized human would suffer less than a lobotomized human.

Also, we don't need an AI that's ethically perfect, just one equal to or better than an average human.




