
> Because of the orientation of Tesla’s cameras, the person said, it was sometimes hard to discern the location of the tractor-trailers. In one view, the truck could appear to be floating 20 feet above the road, like an overpass. In another view, it could appear 25 feet below the ground.

What sort of camera or lenses are they using that causes something to appear 20 feet underground or 20 feet in the sky?



It sounds like it's less about the type of lens they use and more about the fact that they put the cameras in awkward places (presumably to preserve the aesthetic of the car), looking down and looking up in ways that make it hard to judge distance.
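The ambiguity described in the article follows from basic projective geometry: a single camera collapses every 3D point along a ray into one pixel, so a nearby low object and a distant elevated one are indistinguishable without extra cues. A minimal sketch with a hypothetical pinhole camera (not Tesla's actual pipeline, and the numbers are made up):

```python
def project(point, focal=1000.0):
    """Pinhole projection: 3D point (X, Y, Z) -> pixel (u, v).

    Z is depth along the optical axis; focal is in pixel units.
    """
    x, y, z = point
    return (focal * x / z, focal * y / z)

# A trailer edge 3 m above the road, 30 m away...
near = (2.0, 3.0, 30.0)
# ...and a point twice as far along the same viewing ray:
# 6 m up "like an overpass", 60 m away.
far = (4.0, 6.0, 60.0)

# Both land on the exact same pixel, so one frame from one camera
# cannot tell which interpretation is correct.
print(project(near) == project(far))  # True
```

Stereo pairs, radar, or lidar each resolve this ambiguity directly; a monocular system has to infer depth from context, which is where the "floating 20 feet above the road" failure mode comes from.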

It doesn't help that they also chose to avoid using lidar or radar, because Musk was adamant that if human eyes can do it with just visible light, then his computers could too.


I googled a bit (and read the Model 3 handbook) but found one source that said Tesla's forward collision warning does use radar.

Anyone more knowledgeable know whether that's true?

I'm curious how much it relies on object recognition via cameras vs. the more traditional radar-based forward collision braking that is standard on "dumb" modern cars.

...or in other words, would a "dumb" modern car still fail in the same scenario?


It did, but they removed it in 2021, and I think disabled it in software on cars that shipped with it.


They disabled it in software, and disabled the hardware on many vehicles taken in for unrelated service.


Wow, I guess you don't really own your Tesla if they can remotely disable features.



