I think vision-only can certainly work for 99.9% of driving.
But it's that 0.1% of situations where the results will be catastrophic. Sure, you can detect vehicles, traffic cones, bikes (both bicycles and motorcycles), people, mopeds, traffic lights, lane markings, everything you'd expect on a road.
But what about the mattress that fell out of someone's truck? If the car doesn't know what a mattress is and what it looks like, it can't reliably determine its size or distance from the monocular vision that Tesla has. Sure, maybe it could use motion vectors between video frames to make a guess, but I'm not convinced that will work well, especially relative to LIDAR.
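For what it's worth, the "motion vectors" idea is classic structure-from-motion, and a back-of-envelope version shows both how it works and why it's fragile. Here's a minimal sketch in Python; the pure-forward-motion model and every number in it are my own illustrative assumptions, not anything Tesla has published:

```python
# Toy depth-from-motion estimate, NOT Tesla's pipeline. Assumes pure
# forward motion, a static obstacle, and noise-free optical flow --
# exactly the assumptions that break in the real world.

def depth_from_flow(r_px: float, flow_px_per_s: float, speed_m_per_s: float) -> float:
    """Depth of a static point estimated from its optical flow.

    Under pure forward translation, a point r pixels from the focus of
    expansion flows outward at u = r * v / Z px/s, so Z = r * v / u.
    """
    return r_px * speed_m_per_s / flow_px_per_s

# Hypothetical mattress edge 100 px from the focus of expansion, flowing
# at 40 px/s, with the car doing 30 m/s (~67 mph):
z = depth_from_flow(100, 40, 30)
print(z, z / 30)  # -> 75.0 m away, ~2.5 s to impact
```

Small flow magnitudes like that are easily swamped by sensor noise, rolling shutter, vibration, and rain on the lens, whereas LIDAR measures range directly instead of inferring it.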
Steering back to the subject at hand...
> "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."
I don't think I've ever had my Tesla disable Autopilot based on road conditions, though maybe it's because when conditions are bad, I've just taken manual control preemptively. I've let it go through construction areas where cones are guiding traffic outside the painted lines, and surprisingly, it's handled it fine, though I've only done this at low speeds (~20 mph).
Camera visibility is another story. In heavy rain at night, I've had it not allow me to enable AP, though I've never had it disable AP and tell me to take control. However, it HAS limited the cruise speed based on visibility.
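Incidentally, the ODI quote above is essentially a complaint that this kind of self-check fires too late, if at all. Purely for illustration, here is about the simplest form such a check could take; the Laplacian-variance heuristic and both thresholds are my own toy choices, not anything from Tesla's actual stack:

```python
# Toy camera-visibility monitor, illustrative only.
import cv2
import numpy as np

BLUR_THRESH = 50.0   # assumed: Laplacian variance below this ~ smeared/occluded lens
DARK_THRESH = 20.0   # assumed: mean intensity below this ~ too dark to trust

def camera_degraded(frame_bgr: np.ndarray) -> bool:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # classic blur metric
    return sharpness < BLUR_THRESH or gray.mean() < DARK_THRESH
```

A supervisory loop built on something like this would alert or limit speed well before the scene becomes undrivable, rather than "immediately before the crash occurred".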
All this to say...
...anybody buying Tesla's FSD is being swindled, as far as I'm concerned. "FSD (Supervised)" is a scam. If you have to supervise it, it's not self-driving. It's just a party trick that you have to watch to make sure nothing goes wrong.
99.9%? I'm not a climate expert, but I'd guess at least 50% of the world's drivers face snow, fog, or heavy rain at times. In some places, 30%+ of all driving year-round could be in snowy conditions.
99.9% of driving in sea-level, non-rainy, near-the-coast California or Austin weather, maybe. I'd guess it's a no-go in the inland foggy conditions in CA, for example, or freezing rain in TX.
I once saw a presentation, I think from one of the Argo AI guys, with a greatest-hits reel of the long tail of driving. One of them was a stake-bed truck with a bunch of pigs in the back: the back gate opened up and the pigs were falling onto the highway and running around injured. Most people will experience something at this level of unusualness at least a few times in their lives, so you have to be prepared to handle it.
> vision-only can certainly work for 99.9% of driving
Human vision only? Sure. Cameras only? I'm sceptical of the quality of these cameras. Particularly given that, unlike the human eye, they don't perform the rapid involuntary movements (saccades) that make human vision ridiculously robust.
The difference between HW4 and older Teslas (before 2023) is night and day. HW4 absolutely can handle 99.9% or more of driving now, whereas the older models drive like an idiot. I haven't had to intervene on HW4 yet. It has done well avoiding debris and animals, even spotting them heading toward the road. It worked well in heavy rain and snowstorms, even when lanes were barely visible. It handled moving and parked emergency vehicles with lights on correctly, and navigated roads undergoing work (lanes shifted or closed off, pylons everywhere). Automatic parking actually works, though not all the time yet.
No software update is going to give working FSD to all those older models on the road, and I don't think HW4 can even be retrofitted. Ideally, FSD would be disabled on those cars, because it's now clear it will never work properly, but people paid for lifetime FSD on them (newer cars are subscription-only).
Even if 99.9% is true (and there’s no reason to think it is), that’s nowhere near reliable enough to match Waymo or do truly autonomous driving. The average human is much better than 99.9%.
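To put rough numbers on that (order-of-magnitude assumptions, not measured data):

```python
# Ballpark only. Read "99.9% of driving" as one failure per 1,000 miles;
# the human figure is the rough NHTSA order of magnitude for
# police-reported crashes, not an exact statistic.
fsd_miles_per_failure = 1_000
human_miles_per_crash = 500_000
annual_miles = 13_500  # typical US driver

print(annual_miles / fsd_miles_per_failure)  # ~13.5 failures per year
print(annual_miles / human_miles_per_crash)  # ~0.03 crashes per year
```

On those assumptions, the human baseline works out to roughly 99.9998% per mile, so 99.9% is off by more than two orders of magnitude.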
That's just me being pessimistic: I've seen how terrible FSD is on older Teslas, and I only have ~2 months of personal experience using it on HW4.
It's really unfortunate that FSD is judged as a single feature when it is vastly different depending on the car. HW3 cars are getting cut-down versions of the FSD software, which are useless garbage because they drive like an idiot. HW2 and earlier are probably not getting FSD updates at all anymore. The majority of Teslas on the road are HW3 or earlier and run significantly worse (unsafe) FSD versions, and yet Tesla still allows it to be turned on.