I own two Teslas. When conditions are adverse, e.g. fog or heavy rain, the system simply shuts off and reverts to manual driving. Elon has said several times that humans can drive with two eyes, so a Tesla should be able to drive with X number of cameras. However, the system suffers from the same problem humans do: if it can’t see, it can’t drive, and ironically that’s when it reverts to human control.
> When conditions are adverse, e.g. fog or heavy rain, the system simply shuts off and reverts to manual driving.
I also own a Tesla, and there is no indication shown to the user that FSD's vision is degraded. They need to add this in.
For example, numerous times I have been driving my Tesla with FSD activated and an ostensibly clean, clear windshield when suddenly the car will do the "clean the windshield in front of the camera" routine without any indication that the camera's view is degraded. If people haven't seen this routine: wiper fluid is dispensed and the wiper vigorously wipes in front of the camera only -- the rest of the windshield gets only a cursory wipe.
This indicates to me that the camera has poor visibility and that I, as the driver, am neither informed nor aware of it, which is concerning. I often wonder whether there is a thin occluding film on the windshield inside the camera box in front of the camera, or something else that has degraded FSD's vision, but they give you neither the ability to view the camera feed nor a notification that vision is degraded.

I suspect a thin occluding film may be inside the camera box because my windshield outside the camera box started to show a thin chemical film after a couple of months, which apparently (according to a Google search) happens when a new car off-gasses, depositing a thin film of chemical byproduct on the windshield. This is my first new car, so I've no idea whether this is normal.
Birds can fly with two wings and humans should be able to fly with X number of limbs.
> humans can drive with two eyes and Tesla should be able to drive with X number of cameras
Systems built from cameras that are only nearly as capable as human eyes, and software that is only nearly as capable as the human brain, will fall short overall. To match or surpass human performance, the individual components need to exceed human abilities where possible -- and that's where LiDAR provides an advantage.
> if it can’t see it can’t drive and ironically that’s when it reverts back to human control.
That seems better than having it try to keep driving the vehicle. What would you expect it to do?
If it is foggy I just don't drive; anybody who expects me to drive when conditions are bad can go and drive themselves.
My Lexus does this too. I rarely get it due to weather; more often it’s how I know I’m past due for a car wash (dust on the sensors).
In any case, it seems reasonable to me that the human should be making the decisions once conditions become adverse. It’s a simple liability issue for the car company, but I’d also rather trust my own judgment than a system that’s only 80% certain it’s not driving me off a cliff.
Well that, but Elon is also downplaying how much better the human vision system is than the cameras Teslas have.
They're just not that good - nowhere near human-vision performance. And a human in a car has a surprisingly good view of the road and a very fast pan-tilt system to look around.
Teslas do not actually have full 360-degree binocular vision coverage - nor can a camera lean left or right to disambiguate an ambiguous sensor picture.
So while I fully believe that vision-only self-driving could work in principle, within the constraints of automobile platforms - and particularly the Tesla and its current camera deployment - the system is not remotely close enough to human visual fidelity for that comparison to be a valid argument.
>> "Elon has said several times that humans can drive with two eyes and Tesla should be able to drive with X number of cameras"
This must be one of the most stupid takes that gets repeated non-stop by Tesla fans.
I just don't get it. Humans also have emotions and other biological senses that computers don't have. You just cannot compare the two. What makes humans so good at driving is that they can react relatively well to unknown new events. Teslas cannot do that, and with the current hardware they never will.
> Elon has said several times that humans can drive with two eyes and Tesla should be able...
And this is an amazingly stupid statement. Humans drive with most of their senses, not just vision. In fact, our proprioception plays an important role in driving.
Even Tesla's use of cameras is poor because they're monocular and fixed in place. Most humans have binocular vision and those visual sensors have multiple degrees of freedom and the ability to adjust focus.
Even if you wanted to use only vision for navigation, it's irresponsible not to use binocular configurations, which provide more reliable depth sensing.
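To make the depth-sensing point concrete, here is a minimal sketch of the classic pinhole stereo relation (Z = f·B/d). The focal length and baseline values are invented for illustration, not taken from any actual Tesla or camera spec; the point is that depth error from a fixed pixel-level disparity error grows sharply with range, which is why the geometry of the camera pair matters.

```python
# Hypothetical illustration: depth from stereo disparity, Z = f * B / d.
# Assumed values (made up for this example, not from any real vehicle):
FOCAL_PX = 1000.0    # focal length expressed in pixels
BASELINE_M = 0.3     # separation between the two cameras, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Pinhole stereo model: depth is inversely proportional to disparity."""
    return FOCAL_PX * BASELINE_M / disparity_px

near = depth_from_disparity(30.0)              # 10.0 m
far = depth_from_disparity(3.0)                # 100.0 m
# The same 1-pixel disparity error costs far more depth accuracy at range:
far_error = depth_from_disparity(2.0) - far    # 50.0 m of error at ~100 m
print(near, far, far_error)
```

Because depth is inverse in disparity, a wider baseline (larger B) pushes usable depth resolution further out, which monocular fixed cameras simply cannot provide.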
Which is absolutely why Tesla's stats can't be trusted.
"Number of miles driven in situations where [quality of conditions is greater than some threshold] versus all conditions."
"If you don't count the games we'd definitely have lost, our winning percentage is so much higher!"
"Elon has said several times "
At this point I truly don't understand why anyone cares what that liar says.
I definitely agree that in principle a computer can drive with cameras alone; I just don't know whether that's a useful statement. Likewise, a human can determine the genre of a movie merely by watching it, yet I wouldn't have suggested to Blockbuster in 1990 that they collect no genre metadata because the database server should sort it out on its own. (Nowadays that's somewhat feasible with ML, of course -- but 20+ years later.) What sensors/data you need is a question of where computers are now or will shortly be, and for now it seems they need the extra structure of LiDAR for best effectiveness.