If it's autonomous or self-driving, then why is the person in the car paying for the insurance? Surely if it's Tesla making the decisions, they need the insurance?
If your minor child breaks something, or your pet bites someone, you are liable.
This analogy may be more apt than Tesla would like to admit, but from a liability perspective it makes sense.
You could in turn try to sue Tesla for defective FSD, but the now-clearly-advertised "(supervised)" caveat, plus the lengthy agreement you clicked through, plus lots of lawyers, makes you unlikely to win.
Because that's the law of the land currently.
The product you buy is called "FSD Supervised". It clearly states you're liable and must supervise the system.
I don't think there's a law that would allow Tesla (or anyone else) to sell a passenger car with an unsupervised system.
If you take Waymo or Tesla Robotaxi in Austin, you are not liable for accidents, Google or Tesla is.
That's because they operate under limited state laws that allow them to provide such a service, but the law doesn't allow selling such cars to individuals.
That's changing. Quite likely this year we will have a federal law that allows selling cars with fully unsupervised self-driving, in which case the insurance/liability will obviously land on the maker of the system, not the person present in the car.
Risk gets passed along until someone accepts it, usually an insurance company or the operator. If the risk was accepted and paid for by Tesla, then the cost would simply be passed down to consumers. All consumers, including those that want to accept the risk themselves. In particular, if you have a fleet of cars it can be cheaper to accept the risk and only pay for mandatory insurance, because not all of your cars are going to crash at the same time, and even if they did, not all in the worst way possible. This is how insurance works, by amortizing lots of risk to make it highly improbable to make a loss in the long run.
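A toy simulation of that amortization argument (all numbers invented for illustration): pooling many independent cars doesn't change the expected cost per car, but it shrinks the per-car variability, which is what makes a fixed premium safe to charge, and why a large fleet can afford to self-insure.

```python
import random
import statistics

# Hypothetical figures: each car independently has a 1% chance per year
# of a crash costing $50,000, i.e. an expected annual cost of $500/car.
random.seed(42)
P_CRASH, COST = 0.01, 50_000

def per_car_loss(n_cars: int) -> float:
    """One simulated year: total payouts of an n-car pool, per car."""
    crashes = sum(random.random() < P_CRASH for _ in range(n_cars))
    return crashes * COST / n_cars

spreads = {}
for n in (1, 100, 10_000):
    sims = [per_car_loss(n) for _ in range(1_000)]
    spreads[n] = statistics.pstdev(sims)
    print(f"{n:>6} cars: mean ${statistics.mean(sims):,.0f}/car, "
          f"std ${spreads[n]:,.0f}/car")
```

The means all hover around the $500 expected cost, but the standard deviation collapses as the pool grows: a solo owner faces a rare $50,000 catastrophe, while a 10,000-car fleet's per-car cost barely moves year to year.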
Seems like the role of the human operator in the age of AI is to be the entity they can throw in jail if the machine fails (e.g. driver, pilot)
> Surely if it's Tesla making the decisions, they need the insurance?
Why surely? Turning on cruise control doesn't absolve motorists of their insurance requirement.
And the premise is false. While Tesla does "not maintain as much insurance coverage as many other companies do," there are "policies that [they] do have" [1]. (What it insures is a separate question.)
[1] https://www.sec.gov/ix?doc=/Archives/edgar/data/0001318605/0...
I think there is an even bigger insurance problem to worry about: if autonomous vehicles become common and are a lot safer than manual driven vehicles, insurance rates for human driven cars could wind up exploding as the risk pool becomes much smaller and statistically riskier. We could go from paying $200/month to $2000/month if robo taxis start dominating cities.
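A back-of-the-envelope sketch of that effect, with hypothetical numbers: if robotaxis absorb mostly the safest drivers, the remaining human pool is smaller and riskier on average, so the break-even premium rises.

```python
# Hypothetical pool: 800 safe drivers expected to cost $1,500/yr in
# claims and 200 risky drivers expected to cost $6,000/yr.
SAFE_COST, RISKY_COST = 1_500, 6_000
safe, risky = 800, 200

# Break-even premium = average expected claim cost per insured driver.
before = (safe * SAFE_COST + risky * RISKY_COST) / (safe + risky)

# Now suppose robotaxis absorb most of the safe drivers.
safe_left = 100
after = (safe_left * SAFE_COST + risky * RISKY_COST) / (safe_left + risky)

print(f"break-even premium before: ${before:,.0f}/yr")  # $2,400
print(f"break-even premium after:  ${after:,.0f}/yr")   # $4,500
```

Nothing about any individual driver changed; the premium nearly doubles purely from the composition of the pool shifting.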
That's probably the future; Mercedes currently does do this in limited form:
https://www.roadandtrack.com/news/a39481699/what-happens-if-...
Why is the ship owner paying for the insurance while it's the captain making all the decisions?
> If it's autonomous or self-driving, then why is the person in the car paying for the insurance? Surely if it's Tesla making the decisions, they need the insurance?
Suppose ACME Corporation produces millions of self-driving cars and then goes out of business because the CEO was embezzling. They no longer exist. But the cars do. They work fine. Who insures them? The person who wants to keep operating them.
Which is the same as it is now. It's your car so you pay to insure it.
I mean, think about it. If you buy an autonomous car, would the manufacturer have to keep paying to insure it forever, as long as you can keep it on the road? The only real options for making the manufacturer carry the insurance are that the answer is no, in which case they turn off your car after e.g. 10 years, which is quite objectionable, or that the answer is "yes", but then you have to pay a "subscription fee" to the manufacturer which is really the insurance premium, which is also quite objectionable because you're then locked into the OEM instead of having a competitive insurance market.
Because the operator is liable? Tesla as a company isn't driving the car; it's an ML model running on something like HW4 on bare metal in the car itself. Would that make the silicon die legally liable?
Not all insurance claims are based on the choices of the driver.
It’s because you bought it. Don’t buy it if you don’t want to insure.
Not an expert here, but I recall reading that certain European countries (Spain???) allow liability to be put on the autonomous driving system, not the person in the car. Does anyone know more about this?
The coders and sensor manufacturers need the insurance for wrongful-death lawsuits.
And Musk for removing lidar, which is why the car keeps lurching across high-speed traffic at shadows: visual cameras alone can't perceive true depth.
99% of the people on this website are coders and know how even one small typo can cause random failures, yet you trust them to make you an alpha/beta tester at high speed?
It isn't fully autonomous yet. For any future system sold as level 5 (or level 4?), I agree with your contention -- the manufacturer of the level 5 autonomous system is the one who bears primary liability and therefore should insure. "FSD" isn't even level 3.
(Though, there is still an element of owner/operator maintenance for level 4/5 vehicles -- e.g., if the owner fails to replace tires below 4/32", continues to operate the vehicle, and it causes an injury, that is partially the owner/operator's fault.)
Generally speaking, liability for a thing falls on the owner/operator. That person can sue the manufacturer to recover the damages if they want. At some point, I expect it to become somewhat routine for insurers to pay out, then sue the manufacturer to recover.