Tesla Autopilot Crashes and Causes – April 2019

Updated April 2019

Tesla entered the electric car market in 2008 with the Roadster, but it wasn’t until 2014 that Tesla introduced its original Autopilot 1.0 (AP1) hardware on the Model S. Tesla’s Autopilot allows a driver to engage autonomous technology that takes over the steering, acceleration, and braking of the car. Since then, the Autopilot technology has advanced rapidly, moving to AP2 hardware (see AP1 vs AP2) and taking a major step forward with the Version 9 software update in late 2018, which introduced Navigate on Autopilot.

As of this writing, Tesla’s Autopilot is a “driver assist” function, and the driver is instructed to remain alert with hands on the steering wheel while Autopilot is engaged. This puts Tesla’s Autopilot somewhere between Level 2 and Level 3 on the autonomous driving scale, as defined by SAE International (see this article about self-driving levels and what they mean).

It is by no means a hands-off, eyes-off technology: it requires that the driver remain attentive, but it does allow the car to take over most of the controls, reducing the cognitive load on the driver during the trip.
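For reference, the SAE scale runs from Level 0 (no automation at all) to Level 5 (no driver needed anywhere). Here is a minimal sketch of the taxonomy in code – the one-line descriptions are our paraphrase for illustration, not SAE’s official wording:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (descriptions paraphrased)."""
    NO_AUTOMATION = 0       # Human does all of the driving
    DRIVER_ASSISTANCE = 1   # Steering OR speed assist (basic cruise control)
    PARTIAL = 2             # Steering AND speed assist; driver must supervise
    CONDITIONAL = 3         # Car drives itself; driver must take over on request
    HIGH = 4                # No takeover needed within a defined operating domain
    FULL = 5                # No driver needed anywhere

# Autopilot requires constant supervision, so in practice it operates as
# Level 2, even though some of its capabilities edge toward Level 3.
current_autopilot = SAELevel.PARTIAL
```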

Tesla Pushes the Envelope

Tesla is known for being an extremely innovative company that has disrupted the auto industry by creating electric cars that are not only appealing but also include advanced technology far beyond what most other cars offer (such as built-in Autopilot hardware, built-in cellular, continuous updates, etc.).  It’s a car that keeps improving with software updates, even after you buy it.

One of Tesla’s hallmark features is its advanced self-driving technology, known as Autopilot. For the original 2014 system, Autopilot 1.0 (AP1), Tesla partnered with a company called Mobileye (now owned by Intel) to provide the self-driving technology. Tesla felt Mobileye wasn’t moving fast enough and decided to roll out its own system, Autopilot 2.0 (AP2), which has shipped since late 2016. For more, see our AP1 vs. AP2 article.

Unlike other auto manufacturers, Tesla rolls out software updates continuously over-the-air (like on a mobile phone), allowing consumers to have immediate access to new, advanced Autopilot updates.

Additionally, Tesla has released Hardware 3 (AP3) and the Full Self-Driving option, which will use this new, powerful custom-built AI hardware to further accelerate the self-driving capabilities of Tesla vehicles equipped with it.

While this helps advance the technology, it also means that Tesla customers must understand how to use the technology safely. Tesla clearly states that the consumer is ultimately responsible for using the technology and must remain attentive at all times. This is unlike other auto manufacturers, such as Volvo or Cadillac, that offer far more restrictive self-driving systems but take responsibility for accidents while those systems are active.

Typical Tesla Autopilot Crash Causes

As with any new driver assist technology, there are benefits, but also dangers if used improperly.  Even old-fashioned cruise control can be dangerous if used improperly.  With these more advanced self-driving systems, it’s even more important that drivers pay attention, be in control of the vehicle, and not be lulled into a sense of complacency.

The most common causes of Tesla Autopilot crashes fall into these buckets:

  • Hitting Stationary Objects at High Speed – When you engage Traffic-Aware Cruise Control (TACC) with Autosteer at freeway speeds, Automatic Emergency Braking is dialed back to avoid false-positive sudden braking (which could cause more accidents). This has been one of the biggest sources of accidents, as drivers often become too comfortable and stop paying attention at freeway speeds (see the sketch after this list).
    • Tesla’s owner’s manual states: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
    • This remains the single biggest danger of using Autopilot today – the sudden appearance of stationary objects in the car’s path at freeway speeds. For this reason, it’s always important for the Tesla driver to pay attention at freeway speeds.
  • Autopilot Confusion at Forks and Gores – Before the Version 9 software update, Tesla’s Autopilot often had trouble with left-hand freeway exits or forks, tending to veer toward those exits and risking a collision with the median barrier.
    • With the Version 9 update, that has greatly improved and no longer pulls left on freeway left-hand exits as before.
    • That said, when the freeway splits off to the right, vigilance is required in order to take over at a moment’s notice.
  • Inability to Recognize and React to Objects – The first fatal Autopilot crash occurred when a Tesla Model S on AP1 hit a tractor-trailer crossing a freeway. Due to the angle of the sun, the Mobileye camera system was unable to pick up the truck, and the Model S slammed directly into it.
    • Since then, the software has been adjusted to rely more heavily on the front-facing radar for object detection.
    • Even so, a second fatal crash occurred in Florida in early 2019, when a Model 3 hit a tractor-trailer crossing a highway under circumstances very similar to the earlier accident.
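To make the radar trade-off concrete, here is a deliberately simplified sketch of how a cruise-control system might discard stationary returns at highway speed. This is illustrative pseudologic only – the threshold and field names are invented for the example, and this is not Tesla’s actual algorithm:

```python
def select_follow_target(radar_returns, ego_speed_mps):
    """Pick a braking target from radar returns (illustrative pseudologic).

    Radar measures relative closing speed, so a stopped object ahead has
    closing speed roughly equal to the car's own speed -- the same signature
    as roadside clutter like signs, bridges, and parked cars. Dropping those
    returns avoids false-positive braking, at the cost of missing a genuinely
    stopped vehicle sitting in the lane.
    """
    HIGHWAY_SPEED_MPS = 22.0  # ~50 mph; invented threshold for illustration
    targets = []
    for r in radar_returns:
        object_speed = ego_speed_mps - r["closing_speed_mps"]
        is_stationary = abs(object_speed) < 0.5
        if is_stationary and ego_speed_mps > HIGHWAY_SPEED_MPS:
            continue  # filtered out above ~50 mph -- the danger zone
        targets.append(r)
    return min(targets, key=lambda r: r["range_m"], default=None)

# A stopped fire truck 90 m ahead while the car travels ~65 mph (29 m/s):
returns = [{"range_m": 90.0, "closing_speed_mps": 29.0}]
print(select_follow_target(returns, 29.0))  # None -- the truck is ignored
```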

Typical Crash Outcomes

Luckily, Tesla vehicles are some of the safest vehicles on the road, with extremely low death and injury rates. In fact, the Model 3 earned the lowest probability of injury of any vehicle ever tested by NHTSA. Early concerns over lithium-ion battery fires have also not been borne out: electric car fires have not proven any more dangerous than gasoline fires in regular cars (they may turn out to be even safer). The two notable Autopilot deaths were, in fact, high-speed collisions with stationary objects that would not have been survivable in any car. That is why it’s critically important to always pay attention at highway speeds in any car, not only Teslas.

Notable Tesla Autopilot Crashes

January 2016: Fatal Crash in China

In January of 2016, Gao Yaning, a 23-year-old Chinese man, was killed when the Tesla Model S he was driving crashed into a street sweeper. An in-car camera captured footage that shows the car merging into the left lane before hitting the road-cleaning truck. There is no video evidence suggesting the brakes were applied prior to impact. Unfortunately, the car sustained extensive damage and was unable to transmit on-board data. For this reason, it is unclear if Autopilot was engaged.

May 2016: Fatal Tractor-Trailer Crash in Florida

The first confirmed Tesla Autopilot fatality occurred near Williston, Florida on May 7, 2016, when 40-year-old Joshua Brown, driving his AP1 Model S on Autopilot, struck a tractor-trailer crossing a divided highway. The Model S hit the tractor-trailer straight on without slowing down and went under the trailer, shearing off the roof of the Model S before coming to a stop hundreds of feet later.

After an extensive investigation by the National Highway Traffic Safety Administration (NHTSA) and Tesla, it was determined that Mr. Brown was not paying attention, as the Autopilot system had given several warnings beforehand for the driver to put his hands back on the wheel.

The NTSB also reported that Tesla’s software was missing key safeguards that could have prevented the crash by limiting the driver’s use of the Autopilot feature during conditions in which it was not safe to use.
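The safeguard the NTSB described amounts to gating Autopilot engagement by road type – what engineers call an “operational design domain” check. Here is a minimal sketch of the idea; the types and logic are invented for illustration and are not Tesla’s code:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    divided_highway: bool          # physically separated opposing traffic
    cross_traffic_possible: bool   # at-grade intersections or driveways

def autopilot_allowed(seg: RoadSegment) -> bool:
    """Only permit engagement on roads inside the system's design domain."""
    return seg.divided_highway and not seg.cross_traffic_possible

# A limited-access interstate: engagement permitted.
print(autopilot_allowed(RoadSegment(True, False)))  # True
# A divided highway with at-grade crossings, like the crash site: refused.
print(autopilot_allowed(RoadSegment(True, True)))   # False
```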

The Problem:

This scenario was difficult for Autopilot to handle for a couple of reasons. First, the cameras had a difficult time distinguishing the white tractor-trailer against the bright sky. Second, the AP1 system at the time was tuned to ignore more of the radar inputs at high speed in order to avoid false positives.

The Solution:

Since then, Tesla has reportedly adjusted the sensitivity of the system to better detect these scenarios, stating, “The changes include refinements in Autopilot’s radar that improve its ability to spot and identify obstacles down the road and additional warnings to force drivers to keep their hands on the steering wheel and eyes on the road while the system is active.”

However, in March 2019 a similar crash occurred with a Model 3 running the most current hardware at the time (AP2), so these types of scenarios can still occur.

The bottom line: if you’re driving on a highway where cross traffic can suddenly appear, you must stay vigilant at all times.

August 2016: Lost in Translation

In August 2016, a man in a Tesla Model S sideswiped a parked car on a freeway in China while Autopilot was engaged.

The Problem:

The driver stated that the car dealership claimed the car was self-driving. At the time, the Chinese version of the Tesla website used the term “zidong jiashi” (自动驾驶) to describe the Autopilot feature, but the phrase’s most literal translation is “self-driving.”

The Solution:

After the accident investigation, Tesla removed the term “zidong jiashi” from its Chinese website.

February 2017: Concrete Barrier Sideswipe

Video footage of an AP1 Tesla Model S crashing into a highway barrier in 2017 shows the car’s inability at the time to adapt to a change in road conditions. It was determined that the driver did not have his hands on the wheel at the time of the accident, and was not able to take control of the system when needed. The Tesla failed to merge in a construction zone, sideswiping the barrier.

The Problem:

The Tesla Model S is designed to stay between the road lines using cameras and sensors, but because the lines continued beyond the barrier, the car’s systems were unable to avoid a collision. The driver was at fault for not remaining alert enough to take control of the system to avoid the road hazard.

The Solution:

New AP2 hardware has improved sensors, and more importantly, the machine-learning software continually improves over time, getting better at detecting scenarios like this.

January 2018: Tesla Hits Stopped Fire Truck

A Tesla Model S crashed into the back of a fire truck on a Los Angeles highway at 65 miles per hour in 2018. Luckily, no injuries were sustained. A car traveling in front of the Tesla merged into another lane, and the Tesla either failed to detect the stationary fire truck or did not slow down fast enough – it’s still unclear which.

Since then, additional Teslas on Autopilot have hit fire trucks in this type of scenario, as emergency vehicles are often parked in an active lane in order to deal with an accident.

The Problem:

This Tesla crash demonstrates the problem of dealing with suddenly-appearing stationary objects at high speed, which remains one of the biggest safety challenges for Autopilot. On the one hand, it could be very dangerous if Autopilot slammed on the brakes at freeway speed every time it falsely detected a stationary object ahead (causing rear-end collisions, etc.). On the other hand, there should at least be some sort of warning for drivers. Either way, it’s critical to pay attention to the road when driving at freeway speeds on Autopilot.

Here’s what the Tesla Owner’s manual states:

“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Other systems have this limitation as well. For example, the manual for the highly rated Volvo XC90 Pilot Assist states: “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed […] The driver must then intervene and apply the brakes.”
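To put numbers on how little margin exists, here is a back-of-the-envelope calculation for the cut-out scenario at 65 mph. The detection gap, reaction time, and braking figures are assumed values for illustration, not measured data from any of these crashes:

```python
# Back-of-the-envelope stopping margin for the "cut-out" scenario.
ego_speed = 65 * 0.447      # 65 mph is roughly 29.1 m/s
reveal_gap = 80.0           # assumed distance to the stopped vehicle when
                            # the lead car changes lanes and reveals it (m)
reaction_time = 1.5         # typical human perception-reaction time (s)
decel = 7.0                 # hard braking on dry pavement, ~0.7 g (m/s^2)

reaction_distance = ego_speed * reaction_time     # ~44 m traveled before braking
braking_distance = ego_speed ** 2 / (2 * decel)   # ~60 m to come to a stop

needed = reaction_distance + braking_distance
print(f"needed {needed:.0f} m to stop, only {reveal_gap:.0f} m available")
# ~104 m needed vs. 80 m available: a collision is unavoidable unless the
# driver or system reacts far faster than 1.5 s, or the gap is much larger.
```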

The Solution:

As of today, there is no sure-fire solution that any manufacturer has put into place to deal with suddenly-appearing stationary objects at freeway speeds. It’s up to the driver to pay attention until better hardware and software solutions are developed over time.

March 2018: Fatal Freeway Fork Barrier Collision

In March of 2018, Apple engineer Walter Huang was killed in a high-speed collision with an “under-construction” barrier at a left-exit fork on Highway 101 in Mountain View, California, while his Tesla Model X had Autopilot engaged.

Photos: the 2018 Model X crash into the barrier at the left-exit fork on Highway 101 in Mountain View, and the same barrier after repair.

The Problem:

It’s believed Autopilot became confused by the poorly marked exit and suddenly veered left toward the freeway exit, hitting an “under-construction” safety barrier; Mr. Huang died at a hospital later that day. He apparently was not paying attention and did not have his hands on the wheel at the time.

The Solution:

Since then, newer versions of the software handle freeway forks much better than before; however, these are tricky situations, so the driver should remain extra attentive.

Even as recently as March 2019, one driver of a Tesla Model 3 running software update 2019.5.15 reported that his vehicle became confused by a left freeway fork under construction and veered toward a barrier.

Video: March 2019 Autopilot freeway-fork confusion (software 2019.5.15).

The bottom line is that self-driving systems can become confused by unusual or complex forks and gores, especially on freeways with poor markings or construction. The solution is to pay attention!
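One simplified way to see why gores are hard: a lane-centering controller that steers toward the midpoint of the detected left and right lane lines will track straight into the gore if it pairs a diverging exit line with the through-lane line. The sketch below uses invented numbers and is not Tesla’s actual planner:

```python
def lane_center(left_x, right_x):
    """Steering target = midpoint of the detected lane lines (illustrative)."""
    return (left_x + right_x) / 2.0

# Normal 3.7 m lane: the midpoint stays centered in the lane.
print(lane_center(-1.85, 1.85))  # 0.0 -> steer straight ahead

# At a gore, the lines diverge. If the system pairs the exit's left line
# with the through-lane's right line, the perceived "lane" keeps widening
# and the computed center drifts toward the gore barrier:
for meters_ahead in (0, 20, 40, 60):
    left_x = -1.85 - 0.05 * meters_ahead  # exit line drifting away to the left
    right_x = 1.85                        # through-lane line stays put
    print(meters_ahead, lane_center(left_x, right_x))
# The steering target moves steadily left, toward where the barrier sits,
# matching the veer-toward-the-gore behavior described above.
```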

Drawing Lessons From Aviation

The NTSB investigates accidents involving self-driving cars as it would a plane crash, looking into the nuances of the accident to determine the cause. Because planes are years ahead of cars in terms of autonomy, the autonomous car market can learn a lot from aviation. Pilots and crew members have caused accidents by failing to take back manual control when they were distracted or overly confident in autopilot features. Because of this, many people advocate skipping Level 3 autonomy, where the driver must take over when the car’s automation can’t handle a situation, in favor of Level 4 autonomy, which can handle such situations even when the driver ignores a request to take over.

National Transportation Safety Board investigations use on-board diagnostics, bystander accounts and videos, weather patterns, and other environmental data to determine exactly what caused a Tesla vehicle accident. Tesla takes this information and applies it to future generations of Autopilot hardware and software to minimize the chance that a similar accident happens in the future.

Tesla has said it’s working toward a fully autonomous vehicle that doesn’t require driver intervention. Tesla CEO Elon Musk has projected that a Level 5 vehicle could be available as early as this year. Considering that driver error is such a dominant factor in previous Tesla accidents, a fully automated car could be safer than the semi-autonomous versions.

Autopilot Safer Than Humans When Used Correctly

Tesla cars are already some of the safest vehicles on the road, and when used as a driver-assistance aid, Autopilot is fantastic and can help prevent accidents. However, drivers must still pay attention. These are not Level 5 hands-off, eyes-off systems; they are meant to assist drivers, not replace them.

Video: a compilation of scenarios where Autopilot has helped prevent accidents.
