Chinese hackers set their sights on Tesla to expose just how easy it was to manipulate the Model S. Not only were they able to trick the autopilot systems of the luxury vehicles, but they were also able to access them remotely.
Tencent subsidiary Keen Labs released its findings in a whitepaper called “Experimental Security Research of Tesla Autopilot.” This was the second report in about three years from the highly-regarded research group, which continues to warn that Tesla’s self-driving features leave customers vulnerable.
Keen Labs began homing in on alleged self-driving issues at Tesla in 2016.
By that time, the researchers had conducted several months of in-depth research on Tesla vehicles. In a note, the group stated it had discovered “multiple security vulnerabilities.” Furthermore, it had:
“successfully implemented remote, aka none physical contact, control on Tesla Model S in both parking and driving modes.”
It added in that note that the vulnerabilities had been presented and confirmed by the Tesla Product Security Team.
Regardless, the findings give a peek into how Tesla’s ambitious efforts to become king of the self-driving space make for a long road ahead.
In its white paper, the research group reveals that it had made three alarming discoveries. It said it had proven that it could trigger the Model S autowipers with a spoofed image, steer the car into another lane by faking lane markings, and take control of the steering system with a gamepad.
To trick the Model S into moving from one lane to another without the driver’s permission, Keen Labs created a so-called fake lane. It found that for many major autopilot tasks, Tesla uses a single large network with many outputs, and lane detection is one of those tasks.
Keen Labs tricked the vehicle by placing stickers on the road at intersections, reasoning that they would throw off the Tesla’s lane detection and guide the car into another lane. On a test track, the theory proved correct: the researchers were able to manipulate the autopilot system and force the Model S to lurch into the left lane.
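Conceptually, the attack works because a lane follower steers toward where it believes the markings are, so a handful of spurious marking points can drag that estimate sideways. The toy sketch below is a deliberately naive illustration of this idea (it is not Tesla’s actual vision stack, which Keen describes as a single deep network with many outputs): it fits a straight line to detected lane-marking points, and a few hypothetical “sticker” points placed ahead of the car shift the predicted lane position toward them.

```python
import numpy as np

def lane_at(points, y_ahead):
    """Fit x = a*y + b through detected marking points (least squares)
    and return the predicted lane-edge position at distance y_ahead."""
    ys = [p[1] for p in points]
    xs = [p[0] for p in points]
    a, b = np.polyfit(ys, xs, 1)
    return a * y_ahead + b

# Genuine lane-edge markings: a straight line at x = 2.0, y = 0..9
genuine = [(2.0, float(y)) for y in range(10)]

# Three small "stickers" placed just ahead of the car, offset to the left
stickers = [(0.5, float(y)) for y in (10, 11, 12)]

clean = lane_at(genuine, 12.0)              # lane edge predicted at x = 2.0
spoofed = lane_at(genuine + stickers, 12.0)  # prediction dragged left

print(clean, spoofed)
print(spoofed < clean)  # True: a controller tracking this estimate steers left
```

A real lane-detection network is far more robust than a least-squares fit, which is why Keen needed carefully placed markers rather than random clutter, but the failure mode is the same in spirit: inputs the model treats as valid lane evidence pull the steering target off the true lane.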
A Tesla spokesperson told Forbes the issues regarding remote control of the steering wheel had been addressed before the Keen researchers had been in touch (more context below).
However, this isn’t the only Tesla hacking scandal. CCN reported on a “white hat hacker” who has been restoring wrecked Model 3 vehicles and exposing just how much personal data lurks under the hood.
Tesla stock dropped 1.14 percent on the day to $285.88, pushing the company’s market cap below $50 billion.
Update: Tesla has addressed each of Keen’s specific allegations to provide full context.
Tesla’s feedback on Autowipers:
“This research was demonstrated by displaying an image on a TV that was placed directly in front of the windshield of a car. This is not a real-world situation that drivers would face, nor is it a safety or security issue. Additionally, as we state in our Owner’s Manual, the ‘Auto setting [for our windshield wipers] is currently in BETA.’ A customer can also elect to use the manual windshield wiper setting at any time.”
Tesla’s feedback on Lane Recognition:
“In this demonstration the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use. This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”
Tesla’s feedback for the “Control Steering System with a Gamepad” Research:
“The primary vulnerability addressed in this report was fixed by Tesla through a robust security update in 2017, followed by another comprehensive security update in 2018, both of which we released before this group reported this research to us. In the many years that we have had cars on the road, we have never seen a single customer ever affected by any of the research in this report.”
Last modified: April 3, 2019 12:55 PM UTC