There was a study from 2014 looking at how long it takes to regain awareness after taking control back from an automated system [0] (and a similar study from 2016 [1]). It takes about 15 seconds to regain reasonable control of a car after control is handed over. If your plan is to pull over and hand control to a human in inclement weather, that is viable. If your plan is to recognize an emergency and hand control to a human while it is unfolding, the situation will have resolved itself within those 15 seconds, one way or another.
A system that has humans in the control loop must be designed around human limitations. Humans, myself included, are not capable of paying continuous attention to a task that requires only sporadic intervention. A control scheme that includes handing control back to a human in emergencies is a death sentence that serves only to obfuscate the cause of the crash and avoid liability.
Yet this is exactly the plan for self-driving cars. Automation level 0 [2] is fine, because there are enough minor corrections to keep the driver focused on the road. Automation levels 4-5 are fine, because the "driver" isn't the one in control of the car. Automation levels 1-3 are actively hazardous, as they remove the moment-to-moment adjustments performed by the driver, but they still expect the driver to maintain full situational awareness to take over at any time.
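To make those levels concrete, here is a rough mapping of who holds which task at each level (a Python sketch; the descriptions are paraphrased from the SAE J3016 definitions, which the guide in [2] also covers):

    # Who is responsible for what at each SAE automation level
    # (paraphrased from the SAE J3016 definitions).
    SAE_LEVELS = {
        0: "No automation: driver steers, brakes, accelerates, and monitors.",
        1: "Driver assistance: system handles steering OR speed; driver does everything else.",
        2: "Partial automation: system handles steering AND speed; driver must monitor constantly.",
        3: "Conditional automation: system also monitors, but the driver must take over on request.",
        4: "High automation: no driver takeover needed within the system's operating domain.",
        5: "Full automation: no driver needed anywhere.",
    }

    # The hazard zone in the argument above: the driver is eased out of the
    # moment-to-moment loop yet still expected to stay fully situationally aware.
    HAZARD_ZONE = {1, 2, 3}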
TL;DR. Automation levels 1-3 assume superhuman focus on the part of the driver, are fundamentally unsafe, and should be regulated against.
My car has level 2 (ACC with steering) and I've found the whole process extremely stressful and hard. The initial problem was that I didn't trust the car to actively steer, brake, and accelerate while keeping me centered in the lane.
After a while that wore off, but you still have to be hypervigilant, as the system can "fail" at any moment and you need to be in control immediately. I find it harder than regular driving: at level 2 you have to "drive" while also watching for whether the system has stopped working properly.
It's incredibly frustrating. I'm surprised people buy cars just for this capability. I love safety systems, but this is a disaster waiting to happen.
I wonder why we don't just use these automated systems only as safety backups, keeping the driver in full control of the vehicle. If the driver falls asleep, or an animal or a child runs out into traffic, or the vehicle starts drifting into oncoming traffic, the automated system can take over and do whatever it deems necessary to avoid a collision, with an option for the driver to forcefully take back control if necessary. The driver could still use adaptive cruise control, and even lane-keeping for short periods of time (5 minutes), but all other aspects of driving (steering, navigation, signalling, braking, etc.) remain the driver's responsibility.
Best of both worlds then, with the goal of minimizing injuries and deaths. Is there a problem with this that I'm not seeing? Why isn't this the real goal?
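For what it's worth, a minimal sketch of that "guardian" loop (all names and the time-to-collision threshold are illustrative assumptions, not any real vendor's API):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Command:
        steering: float  # radians, positive = left
        braking: float   # 0.0 .. 1.0
        throttle: float  # 0.0 .. 1.0

    @dataclass
    class Hazard:
        time_to_collision_s: float  # estimated seconds until impact

    # Assumed threshold: below this, the guardian overrides the driver.
    TTC_INTERVENE_S = 1.5

    def guardian_step(driver_cmd: Command,
                      hazard: Optional[Hazard],
                      driver_override: bool) -> Command:
        """Decide which command actually reaches the actuators."""
        if driver_override:
            # The driver forcefully takes back control, no questions asked.
            return driver_cmd
        if hazard is not None and hazard.time_to_collision_s < TTC_INTERVENE_S:
            # Emergency: maximum braking, wheel held straight.
            return Command(steering=0.0, braking=1.0, throttle=0.0)
        # Default: the human stays in full control.
        return driver_cmd

The key inversion relative to levels 1-3: the human supplies continuous input and the automation intervenes sporadically, instead of the other way around.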
I agree that it can be hazardous. But I would add that level 3 cars have driver monitoring to make sure the driver is looking at the road. Much better than nothing!
Still, the handover in some of the car brands I have tested is really dangerous: the system silently hands control back to the driver without any notification.
I agree with you that level 3 is temptingly dangerous in a car. I spent some time trying to place a light-aircraft autopilot within that framework and concluded that it sits closest to level 3, yet it is generally used effectively to reduce the workload and drudgery of cruise flight.
[0] https://www.sciencedirect.com/science/article/pii/S136984781...
[1] https://journals.sagepub.com/doi/abs/10.1177/154193121360142...
[2] https://www.jdpower.com/cars/shopping-guides/levels-of-auton...