Hackers Trick Tesla Autopilot Using Little Road Stickers


Researchers in China have demonstrated that certain aspects of Tesla’s Autopilot system are susceptible to hacking. Cybersecurity experts from the well-regarded Keen Labs devised two types of attacks to probe the security of Tesla’s system, testing whether they could make the car behave in ways the driver did not intend.

“Fake-Lane” Attack

The first experiment involved placing small interference stickers on a test track to see whether the researchers could get the car to move into another lane. The test car ran firmware 2018.6.1 and was operated in Autosteer mode. It turned out that relatively few stickers were needed: the car was easily tricked into veering into the adjacent lane.

Here’s the demonstration video (the lane attack starts about one minute in):

They achieved this by exploiting how Tesla’s neural network detects lane markings via the “detect_and_track” function, shown in the schematic below.

Autopilot Lane Detection Algorithm
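
Tesla’s actual pipeline is proprietary, but a vision-only lane-keeping loop of this general shape can be sketched in a few lines. Everything below is an assumption made for illustration: the function names, the quadratic fit, and the synthetic detections are stand-ins for the real neural network.

```python
# Hypothetical sketch of a vision-only lane-keeping loop, loosely
# following the detect -> fit -> steer flow in the schematic above.
# None of these names come from Tesla's firmware; the neural network
# is stubbed with synthetic lane-marking detections.
import numpy as np

def detect_lane_markings(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the neural network: returns (x, y) image points it
    believes lie on a lane marking. A few well-placed stickers would
    simply add convincing points to this output."""
    ys = np.linspace(frame.shape[0] // 2, frame.shape[0] - 1, 10)
    xs = 0.001 * (ys - frame.shape[0]) ** 2 + frame.shape[1] / 2
    return np.column_stack([xs, ys])

def fit_lane_curve(points: np.ndarray) -> np.ndarray:
    """Fit the marking as x = f(y), a quadratic in image coordinates."""
    return np.polyfit(points[:, 1], points[:, 0], deg=2)

def steering_error(coeffs: np.ndarray, frame_width: int, look_y: float) -> float:
    """Lateral error (pixels) between the fitted lane and the camera
    center at a look-ahead row; this is what the controller acts on."""
    return np.polyval(coeffs, look_y) - frame_width / 2

frame = np.zeros((480, 640))                 # dummy camera frame
coeffs = fit_lane_curve(detect_lane_markings(frame))
print(f"lateral error: {steering_error(coeffs, 640, look_y=400):+.1f} px")
```

The point the researchers are making is that nothing outside this loop cross-checks the detection, so spoofed marking points flow straight through the curve fit and into the steering decision.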

The researchers noted:


Tesla uses a pure computer vision solution for lane recognition, and we found in this attack experiment that the vehicle driving decision is only based on computer vision lane recognition results.

To help prevent this in the future, they suggested that Tesla add reverse-lane recognition: if the detected lane would steer the car toward the oncoming-traffic lane, the system could ignore that maneuver. In all likelihood, a combination of detailed maps, recent input from other Tesla vehicles, and reverse-lane recognition will be required.
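
Here is a minimal sketch of what such a plausibility check could look like. The data structures, thresholds, and decision rule are all assumptions made for illustration; neither Keen Labs nor Tesla has published such a design.

```python
# Hypothetical plausibility check along the lines Keen Labs suggested:
# before acting on a detected lane, ask whether following it would put
# the car in the oncoming-traffic lane. The map data and thresholds
# here are invented for illustration.
from dataclasses import dataclass

@dataclass
class LaneEstimate:
    lateral_shift_m: float   # where the detected lane would move the car
    confidence: float        # vision confidence in [0, 1]

@dataclass
class MapContext:
    oncoming_lane_offset_m: float  # lateral offset of the opposing lane center
    min_confidence: float = 0.9    # demand extra confidence near oncoming lanes

def accept_lane(est: LaneEstimate, ctx: MapContext) -> bool:
    """Reject a detected lane that shifts the car toward the opposing
    lane unless vision is highly confident; otherwise the planner would
    fall back to holding the current lane."""
    heads_into_oncoming = (
        est.lateral_shift_m * ctx.oncoming_lane_offset_m > 0  # same side
        and abs(est.lateral_shift_m) > 0.5 * abs(ctx.oncoming_lane_offset_m)
    )
    if heads_into_oncoming and est.confidence < ctx.min_confidence:
        return False
    return True

# The sticker attack produced a lane drifting toward oncoming traffic;
# a check like this would discard it and hold the current lane.
print(accept_lane(LaneEstimate(-2.0, 0.7),
                  MapContext(oncoming_lane_offset_m=-3.5)))  # False
```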

Game Controller

The second experiment involved steering the car with a simple game controller, with a mobile device acting as an intermediary between the controller and the steering controls. The researchers found that, in certain situations, they could take complete control of the steering wheel and turn the car at will.
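
Keen Labs did not publish the interface they used, so the following only illustrates the relay pattern in the abstract: a controller axis value is read on the phone, packed into a datagram, and unpacked by a stub standing in for the compromised in-car endpoint. Every name here (read_gamepad_axis, STEER_PORT, the UDP framing) is invented for this sketch.

```python
# Illustrative relay pattern only: a gamepad axis value is read on a
# mobile device and forwarded to an in-car endpoint. Both ends here are
# stubs -- Keen Labs drove the steering through a compromised in-car
# computer, and the real interface is not public.
import socket
import struct

STEER_PORT = 9000  # arbitrary port chosen for this sketch

def read_gamepad_axis() -> float:
    """Stub for the controller's steering axis, in [-1.0, 1.0]."""
    return 0.25  # e.g., a gentle turn to the right

def relay_once(vehicle_ip: str) -> None:
    """Runs on the intermediary phone: poll the controller and forward
    the axis value as a 4-byte float."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(struct.pack("!f", read_gamepad_axis()),
                    (vehicle_ip, STEER_PORT))

def vehicle_endpoint() -> None:
    """Stub for the compromised in-car side: receive one command and
    hand it to the steering controller (not implemented here)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", STEER_PORT))
        (angle,) = struct.unpack("!f", sock.recvfrom(4)[0])
        print(f"steering command received: {angle:+.2f}")
```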

A Different Experiment

In another, unrelated experiment, researchers were able to turn on the windshield wipers by simulating a wet windshield even though no water was present: an image placed in front of the vehicle’s fish-eye camera tricked the wiper system into activating. Though they acknowledge that this type of hack would be difficult to pull off in the real world, the wipers could still be triggered by combinations of objects in the environment.
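
Attacks of this kind are typically built as adversarial examples against the vision model. Purely as a toy illustration, the snippet below applies an FGSM-style perturbation (Goodfellow et al.) to flip a linear stand-in for a rain detector from “dry” to “wet”; the model, weights, and attack are assumptions, not what Keen Labs or Tesla actually use.

```python
# Minimal adversarial-example sketch in the spirit of the wiper attack:
# nudge an input image so a stand-in "rain detector" flips from dry to
# wet. The linear model is a toy substitute for Tesla's network, and the
# attack is plain FGSM rather than whatever Keen Labs actually ran.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(64 * 64,))          # toy "rain detector" weights
b = -8.0                                 # strongly biased toward "dry"

def rain_score(img: np.ndarray) -> float:
    """Sigmoid score: above 0.5 means the detector thinks it is wet."""
    return 1.0 / (1.0 + np.exp(-(img.ravel() @ w + b)))

dry = rng.uniform(0.0, 0.05, size=(64, 64))   # dark, dry windshield frame
print(f"before: {rain_score(dry):.3f}")       # well under 0.5 -> wipers off

# FGSM step: move each pixel slightly in the direction that raises the
# score. For this linear toy model the gradient w.r.t. the image is w.
eps = 0.1
adversarial = np.clip(dry + eps * np.sign(w).reshape(64, 64), 0, 1)
print(f"after:  {rain_score(adversarial):.3f}")  # above 0.5 -> wipers on
```

Keen Labs’ real attack had to survive being shown on a physical display and captured through the fish-eye lens, which is considerably harder than this purely digital toy case.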

The successful hacking of Tesla’s automated systems demonstrates that more work needs to be done before the system can be completely reliable on its own. For now, a Tesla spokesperson told Forbes that the steering-wheel control issue has already been addressed. The same spokesperson also said that drivers can take control of the vehicle and override the automated system at any time.