>> Comparing against some imaginary scenario where cars have no collisions and cause no deaths doesn't make sense.
That's not the whole story. For example, we ban certain kinds of weapons (cluster munitions, chemical weapons, biological weapons; ideally we'd ban bloody mines too) not because they kill more people than "conventional" weapons (they don't), but because they are considered especially... well, wrong, in the moral sense.
So maybe we decide that being killed by a machine that decides you're a target and pulls the trigger autonomously is especially morally wrong, and we don't accept it.
Also, in the case of a top-tier biological weapon, even a single strike, or a single accident, has a potentially unlimited area of effect, up to and including the entire planet.
Remember COVID-19? Whether you believe it was natural or a lab leak, it is a good model of what a handling mishap with a mediocre bioweapon would look like.
Better bioweapons would potentially be more targeted, and/or have reproductive clocks that disable them after a certain number of generations. But you absolutely run the risk of them evolving away from such restrictions.
Genetic kill timers are production technology that has already been deployed. There are genetically engineered mosquitoes, for example, that become unviable after a certain number of generations. The idea is that you mix them into the population, they crossbreed and spread their genes, and then, 10 or 50 generations later, they become infertile en masse and the whole species dies out.
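To make that mechanism concrete, here's a toy simulation of the dynamics. Everything in it is an invented illustration, not a model of any real release: the inheritance bias DRIVE (gene-drive style), the 10-generation TIMER, the ESCAPE_RATE at which the timer mutates away, and the population parameters are all assumptions picked to make the sweep-then-crash visible.

```python
import random

CAP         = 10_000  # environmental carrying capacity (assumed)
TIMER       = 10      # generations until the engineered allele sterilises its carrier
DRIVE       = 0.95    # gene-drive bias: chance a carrier parent passes the allele on
ESCAPE_RATE = 1e-4    # per-birth chance the kill timer mutates away (assumed)
RELEASE     = 0.05    # engineered fraction of the founding population
GROWTH      = 2       # offspring per fertile individual, before the cap kicks in

# Allele states: None = wild type, ("timer", g) = g generations left, "broken" = timer lost.

def child_allele(p1, p2):
    """Allele for one offspring: super-Mendelian drive plus a rare escape mutation."""
    carrier = p1 if p1 is not None else p2
    if carrier is None or random.random() > DRIVE:
        return None                        # wild-type allele wins this time
    if carrier == "broken" or random.random() < ESCAPE_RATE:
        return "broken"                    # the restriction is gone for good
    return ("timer", carrier[1] - 1)       # the clock ticks down on each inheritance

def fertile(allele):
    """Carriers whose timer has run out are sterile."""
    return not (isinstance(allele, tuple) and allele[1] <= 0)

n_eng = int(CAP * RELEASE)
pop = [("timer", TIMER)] * n_eng + [None] * (CAP - n_eng)

for gen in range(1, 41):
    parents = [a for a in pop if fertile(a)]
    if len(parents) < 2:
        print(f"gen {gen:2d}: no fertile individuals left")
        break
    n_kids = min(CAP, GROWTH * len(parents))
    pop = [child_allele(random.choice(parents), random.choice(parents))
           for _ in range(n_kids)]
    timers = sum(isinstance(a, tuple) for a in pop)
    broken = sum(a == "broken" for a in pop)
    print(f"gen {gen:2d}: pop {len(pop):5d}  timer carriers {timers:5d}  escape mutants {broken:4d}")
```

On a typical run the timer allele sweeps to near-fixation within a few generations, the carriers go sterile en masse at generation 10 and the population dips, and then, if an escape mutant appeared during the sweep, the broken (timerless) allele spreads through the survivors instead, which is exactly the "evolving away from such restrictions" failure mode mentioned above.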
Very big conventional bombs can have similarly wide-area effects, and yet they are not banned, so that's not the difference. The difference is in the way people are killed.