Sure, ultra-hazardous activities are regulated differently from other activities, including under tort law, but generic AI tools are not ultra-hazardous by nature. No piece of software is, until it is connected in some way to real-world effects. Take an object-detection algorithm. There is absolutely nothing inherently dangerous about identifying objects in a video stream. Once you use that algorithm to build an automatic targeting system for a drone with a grenade strapped to it, the system becomes hazardous. But that is no reason to regulate the algorithm as if it were hazardous in itself, any more than it is a reason to regulate the drone. As you point out, we regulate hand grenades. We do not regulate the boxes hand grenades are delivered in, or the web framework used to build a website where hand grenades can be purchased.