Tesla received some unwelcome news from the National Transportation Safety Board on Wednesday. The NTSB determined that both driver error and Tesla’s Autopilot design were the probable causes of a crash early last year.
The NTSB has not been a fan of Tesla’s Autopilot system, having frequently pummeled Elon Musk’s baby for its poor design.
The NTSB’s report does offer a curious paradox regarding the crash. The NTSB criticized Tesla because Autopilot didn’t require the driver to keep his hands on the wheel for most of the 14 minutes he’d been driving.
So on the one hand, the NTSB is criticizing Tesla for its Autopilot system apparently being too good in that it was able to handle much of the driving on its own.
Then again, 14 minutes of safe driving that ends in a serious crash, even one in which nobody was hurt, is not what most car manufacturers would tout as a grand achievement.
On the other hand, the NTSB faulted the driver for his “inattention and overreliance” on the Autopilot system. Isn’t that the point?
That this is a paradox is not surprising considering Tesla’s own paradoxical instructions regarding Autopilot.
On the one hand, the whole point of Autopilot is that a car is able to drive itself safely in traffic. On the other hand, Tesla advises drivers to keep their hands on the wheel and pay attention while driving as they normally would.
What’s the point of having a robot if you’re the one inside the robot suit?
The Autopilot system does deliver intermittent warnings telling drivers to put their hands back on the wheel, but it’s only a warning, and we know that people are lazy. Plus, to beat a dead horse, why would you want to put your hands on the wheel of a car that is supposed to be able to drive itself?
All of this speaks to a fundamental long-term problem with Tesla as a legitimate car manufacturing company and as a stock.
While Elon Musk runs around like a carnival barker talking about flying cars, maybe he should instead focus on just getting the cars with the most basic of features to roll off the overworked assembly lines without requiring electrical tape.
This Autopilot situation also highlights why true self-driving cars will take a very long time to reach the mass market.
Who wants to use a car’s self-driving ability when doing so may expose the driver to additional liability in the event of an accident?
Ed Butowsky, Managing Partner at Chapwood Capital Investment Management in Dallas, tells CCN:
“If the NTSB is issuing paradoxical statements about the cause of this kind of accident, what makes anybody think that insurance companies are going to want to have anything to do with this grand adventure? Nevermind actually getting self-driving cars to work well enough so that the mass-market trusts them, how long will it take insurance regulators to figure out who might be at fault in a self-driving car accident?”
What if both cars involved in an accident were using their self-driving features at the time? Robots can’t be held liable. Will the car manufacturers be held liable? Will the drivers be held liable for being “inattentive”?
There’s a very long way to go before any of this shakes out, and none of it looks good for Tesla.
Last modified: September 4, 2019 20:45 UTC