I don't understand that comment. "Sensor fusion" is the integration of data from two or more distinct sensors. It doesn't matter which sensors they are; they don't even have to be different types of sensors.
It may well be that the Uber cars only have LIDAR and optical, but that doesn't have anything to do with sensor fusion.
> Sensor Fusion typically merges LiDAR with stereoscopic camera feeds, the latter requiring ambient light.
So might LIDAR detections get de-emphasized when there's too little light for the cameras to corroborate them?
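To make the question concrete, here's a toy sketch (nothing to do with Uber's actual pipeline; every name and threshold is made up) showing that the answer depends on the fusion scheme. With a plain confidence-weighted average, a low-confidence camera just drops out and LiDAR dominates; but if the fusion gates on corroboration from a second sensor, a dark scene can suppress a perfectly good LiDAR hit:

```python
def fuse(detections):
    """Confidence-weighted average of (score, confidence) pairs, one per sensor."""
    total_conf = sum(c for _, c in detections)
    if total_conf == 0:
        return 0.0
    return sum(s * c for s, c in detections) / total_conf


def gated_fuse(detections, min_conf=0.2):
    """Hypothetical variant that requires corroboration: at least two sensors
    must report above a minimum confidence, or the detection is discarded."""
    kept = [(s, c) for s, c in detections if c >= min_conf]
    if len(kept) < 2:
        return 0.0
    return fuse(kept)


# Daylight: LiDAR and camera both confident -> strong fused detection.
day = fuse([(0.9, 0.8), (0.85, 0.9)])        # (LiDAR, camera)

# Night: camera confidence collapses; the weighted average leans almost
# entirely on LiDAR, so the detection survives.
night = fuse([(0.9, 0.8), (0.1, 0.05)])

# But under the corroboration-gated scheme, the same night-time LiDAR hit
# is thrown away because no second sensor confirms it.
night_gated = gated_fuse([(0.9, 0.8), (0.1, 0.05)])
```

So whether LiDAR gets "de-emphasized" in the dark isn't a property of sensor fusion per se; it's a design choice about how the fusion treats uncorroborated detections.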
0) https://news.ycombinator.com/item?id=16619917