
Fully autonomous weapons are a danger even if we can build them to work reliably, with or without AI.

It essentially becomes a computer against a human. And if and when such software is developed, who is going to stop it from spreading to the masses? Imagine a virus or malware that can take a life.

I'm shocked that so few people are even bothered by this. It is really concerning that technology developed for human welfare could become something turned entirely against humans.


