Tesla Autopilot Crashes and Causes

Tesla’s Autopilot and Full Self-Driving suite are some of the most advanced driver assistance technologies available to consumers today. When used correctly, they can help reduce accidents (see the video at the end of this article). However, it’s important to understand that this technology is not fully autonomous and still requires an attentive driver. Like any technology, it can lead to accidents if abused. This article outlines the common causes of Autopilot crashes and highlights recent accidents, noting which bucket each falls into.

Notable Autopilot Accidents:

  • February 2019 – The NTSB confirms that the driver in the fatal 2018 accident in Mountain View was not paying attention and was likely playing a game on his cell phone. In addition, the driver was aware that Autopilot, at the time, needed corrections at that spot. Tesla vehicles today do not appear to have an issue with this offramp.
  • December 2019 – (unconfirmed) A Model 3 rear-ended a fire truck assisting another vehicle on I-70 in Indiana; one person later died at a local hospital. While it’s unclear whether the vehicle was on Autopilot at the time, the owner said he regularly uses Autopilot.
    • This is an example of Scenario 1 below. At high speeds, Autopilot will not suddenly brake for stationary objects since doing so may cause more risk (especially if there is a false positive).
  • April 2021 – A Tesla Model S with no one in the driver’s seat crashes into a tree in Texas, killing both occupants (one in the passenger seat and one in the back). It’s assumed Autopilot was engaged, but it’s unclear how the safety system, which normally checks for a driver in the seat, was bypassed. Tesla claims the over-the-air data didn’t show Autopilot usage, but it’s not clear whether that includes the time right before the incident.
  • August 2021 – The National Highway Traffic Safety Administration (NHTSA) opens an investigation into why Tesla vehicles crash into stationary objects. Soon afterwards, another Tesla vehicle hits a parked police car on the side of the highway in Orlando, Florida (see causes below).

Tesla Autopilot Background

Tesla entered the electric car market with the Roadster, unveiled in 2006, but it wasn’t until 2014 that Tesla introduced its original Autopilot 1.0 (AP1) hardware on the Model S. Tesla’s Autopilot allows a driver to engage autonomous technology that takes over the steering, acceleration, and braking of the car. Since then, the Autopilot technology has advanced rapidly, with hardware updated to AP2 (see AP1 vs AP2) and a major step-change in the Version 9 software update in late 2018 that introduced Navigate-on-Autopilot.

Tesla’s Autopilot is a “driver assist” function, and the driver is instructed to remain alert with hands on the steering wheel when Autopilot is engaged. This puts Tesla’s Autopilot at Level 2 on the self-driving scale, as defined by SAE International (see this article about self-driving levels and what they mean).

It is by no means a hands-off, eyes-off technology. It requires the driver to remain attentive at all times, but it does allow the car to take over most of the controls, so there is less cognitive load on the driver during the trip.

Think of Tesla Autopilot as a best-in-class Distance Cruise Control and Lane Centering System, meant primarily for freeway use. However, drivers sometimes push the limits and stop paying attention, which can, in certain scenarios like the ones below, cause problems.

Tesla Pushes the Envelope

Tesla is known for being an extremely innovative company that has disrupted the auto industry by creating electric cars that are not only appealing but also include advanced technology far beyond what most other cars offer (such as built-in Autopilot hardware, built-in cellular, continuous updates, etc.).  It’s a car that keeps improving with software updates, even after you buy it.

One of Tesla’s hallmark features is advanced self-driving technology, known as Autopilot or Full Self-Driving. For the original system introduced in 2014, known as Autopilot 1.0 (AP1), Tesla partnered with a company called Mobileye (now owned by Intel) to provide the self-driving technology. Tesla felt Mobileye wasn’t moving fast enough and decided to roll out its own system, known as Autopilot 2.0 (AP2). For more see our AP1 vs. AP2 article.

More recently Tesla has released Hardware 3 (AP3) and the Full Self-Driving option that will use this new, powerful custom-built AI hardware to even further accelerate the self-driving capabilities of Tesla vehicles equipped with this option. See the future of Autopilot during Tesla’s Autonomy Day presentation.

Unlike other auto manufacturers, Tesla rolls out software updates continuously over-the-air (like on a mobile phone), allowing consumers to have immediate access to new, advanced Autopilot updates.

While this helps advance the technology, it also means that Tesla customers must understand how to use the technology safely. Tesla clearly states that the consumer is ultimately responsible for using the technology and must remain attentive at all times – these systems are not fully autonomous. This is unlike other auto manufacturers, such as Volvo or Cadillac (see Cars with Autopilot), that have far more restrictive self-driving systems but take responsibility for accidents while those systems are active.

Typical Tesla Autopilot Crash Causes

As with any advanced driver assistance technology, there are benefits, but also dangers if used improperly.  Even old-fashioned cruise control can be dangerous if used improperly.  With these more advanced systems, it’s even more important that drivers pay attention, be in control of the vehicle, and not be lulled into a sense of complacency.

The most common causes of Tesla Autopilot crashes fall into these buckets:

  1. Hitting Stationary Objects at High Speed – When you engage Traffic Aware Cruise Control (TACC) with Autosteer at freeway speeds, Automatic Emergency Braking is reduced to avoid false-positive sudden braking (which could cause more accidents). This has been one of the biggest sources of accidents, as drivers often become too comfortable and stop paying attention at freeway speeds (see the sketch after this list).
    • Tesla’s owner’s manual states: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
    • NOTE: This currently remains the biggest danger in using Autopilot – the sudden appearance of stationary objects in the car’s path at freeway speeds. Whether it’s a broken-down vehicle suddenly appearing in the path ahead or a vehicle crossing a divided highway, these are examples of why drivers must pay attention.
  2. Lane Incursions from Stationary Objects – Related to the issue above is the incursion or protrusion of stationary objects into the lane path. Rather than fully blocking the lane path, the object may begin to extend into the lane just enough to narrow it, such as a temporary concrete barrier.
    • Since Autopilot continually predicts the lane path ahead, the sudden, slight narrowing of a lane (often seen with poorly placed concrete barriers) may cause Autopilot to hit or bounce off those objects.
  3. Autopilot Confusion at Forks and Gores – Before the Version 9 software update, Tesla’s Autopilot often had issues with left-hand freeway exits or forks, wanting to veer into those exits and potentially hitting the median barrier.
    • With the Version 9 update, that has greatly improved, and Autopilot no longer pulls left on freeway left-hand exits as before.
    • That said, when the freeway splits off to the right, vigilance is required in order to take over at a moment’s notice.
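
To make the first scenario concrete, here is a minimal, hypothetical sketch of why a driver-assist system might demand much higher confidence before hard-braking for a stationary object at freeway speeds. This is not Tesla’s actual code; the function names and thresholds are illustrative assumptions only.

```python
# Hypothetical sketch: speed-dependent braking policy for stationary objects.
# NOT Tesla's actual logic; all names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float        # distance to the object ahead
    object_speed_mps: float  # absolute speed of the object (0 = stationary)
    confidence: float        # perception confidence, 0.0 to 1.0

def should_emergency_brake(ego_speed_mps: float, det: Detection) -> bool:
    """Decide whether to apply hard braking for a detection ahead.

    Stationary returns (overhead signs, bridges, roadside clutter) are a
    common source of false positives, so the confidence bar rises with ego
    speed -- a phantom hard stop at 70 mph risks causing a rear-end crash.
    """
    is_stationary = det.object_speed_mps < 0.5
    if is_stationary and ego_speed_mps > 22.0:  # ~50 mph (80 km/h)
        required_confidence = 0.95  # very high bar to avoid phantom braking
    else:
        required_confidence = 0.6   # moving targets are easier to trust
    return det.confidence >= required_confidence

# A stationary vehicle seen with 0.8 confidence at ~65 mph (29 m/s) does
# NOT trigger braking under this policy -- the driver must intervene.
print(should_emergency_brake(29.0, Detection(60.0, 0.0, 0.8)))  # False
```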

See this video from the Wall Street Journal as well:

Typical Crash Outcomes

Luckily, Tesla vehicles are some of the safest vehicles on the road, with extremely low death and injury rates. In fact, the Model 3 earned the lowest probability of injury of any vehicle NHTSA had tested at the time. Original concerns over lithium-ion battery fires have also not been borne out – electric car fires have not proven to be any more dangerous than gasoline fires in regular cars (they may turn out to be even safer). The three notable Autopilot deaths were, in fact, high-speed collisions with stationary objects that, no matter what car, would not have been survivable. That is why it’s critically important to always pay attention at highway speeds in any car, not only Teslas.

Notable Tesla Autopilot Crashes

Hitting Stationary Objects at High Speeds

Tractor-Trailer Crashes

The most infamous fatal crashes have been those involving tractor-trailer trucks crossing divided highways. There have been two such incidents, both in Florida, both incredibly similar, three years apart. We’ll cover them both here.

Side View of a Tractor-Trailer Truck

May 2016: First Fatal Tractor-Trailer Crash with AP1

The first confirmed Tesla Autopilot fatality occurred near Williston, Florida on May 7, 2016, when 40-year-old Joshua Brown, driving his AP1 Model S on Autopilot, struck a tractor-trailer crossing a divided highway. The Model S hit the tractor-trailer head on without slowing down and went under the trailer, shearing off the roof of the Model S before coming to a stop hundreds of feet later.

After an extensive investigation by the National Highway Traffic Safety Administration (NHTSA) and Tesla, it was determined that Mr. Brown was not paying attention; the Autopilot system had given several warnings beforehand for the driver to put his hands back on the wheel.

The NTSB also reported that Tesla’s software was missing key safeguards that could have prevented the crash by limiting the driver’s use of the Autopilot feature during conditions in which it was not safe to use.

The Problem:

This scenario was difficult for Autopilot to handle for a couple of reasons. First, the cameras had a difficult time distinguishing the white tractor-trailer against the bright sky. Second, the AP1 system, at the time, was tuned to ignore more of the radar inputs at high speed in order to avoid false positives.

The Solution:

One of the reasons why Tesla (and other vehicles) do not suddenly brake at highway speeds is that sudden braking may cause additional accidents (such as rear-end collisions). In addition, radar previously wasn’t accurate enough to distinguish objects, which could result in erroneous data. In 2021, Tesla removed radar from the inputs and started relying on vision only (called Tesla Vision). In theory, dropping the lower-resolution radar input in favor of the more detailed camera data should result in better perception. That said, it remains to be seen if that’s the case.

March 2019: Second Fatal Tractor-Trailer Crash with AP2

After the first tractor-trailer crash in 2016, most people assumed that Tesla would have solved that particular edge case, particularly since several years had gone by. However, in March of 2019, an almost identical crash occurred with a Model 3 using AP2 hardware, shearing off the roof of the Model 3 as it went under the tractor-trailer. According to the NTSB, the system did not detect the driver’s hands on the wheel in the seconds prior to impact: “Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.”

That raises the question: why didn’t Autopilot detect the tractor-trailer when this was a previously known issue? Most likely it’s because AP2 was being used instead of AP1 (see AP1 vs AP2). AP2 uses Tesla’s own in-house neural network, built from scratch, to detect and react to objects. However, given that this same scenario had occurred before, one would think Tesla had already trained the neural network to handle these situations. Apparently not.

It has also raised the question of whether more accurate perception technologies such as LiDAR (see LiDAR vs Cameras) would help, or whether HD maps could let the system know there should be no large overhead objects (like a bridge) in the path at these locations.
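
As an illustration of that HD-map idea, here is a hypothetical sketch: if the map says no bridge or sign gantry exists at the car’s location, a large “overhead” detection is escalated as a possible road-level obstacle instead of being ignored. The map data, coordinates, and function here are assumptions for illustration, not any real vendor API.

```python
# Hypothetical HD-map cross-check: use prior map knowledge to decide how to
# treat a large object detected above road level. Illustrative data only.

OVERHEAD_STRUCTURES = {  # (rounded lat, lon) of known bridges / sign gantries
    (37.40, -122.08),
}

def assess_overhead_detection(lat: float, lon: float) -> str:
    """Return how to treat a large object detected above road level."""
    key = (round(lat, 2), round(lon, 2))
    if key in OVERHEAD_STRUCTURES:
        # Map confirms a bridge or gantry here: safe to treat as overhead.
        return "ignore: known bridge or sign gantry"
    # Map says nothing large should be here: a trailer spanning the road
    # would otherwise be misread as an overhead structure to drive under.
    return "escalate: unexpected large object, possible obstacle in path"

# A trailer crossing a rural divided highway with no mapped overpass:
print(assess_overhead_detection(29.57, -82.45))  # escalate: unexpected ...
```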

Hopefully Tesla will move quickly to handle this edge case and prove to the world, once and for all, that Autopilot can detect these objects and react.

Stationary or Slow-Moving Cars in Lane

Another common issue with stationary objects is stationary or slow-moving objects in the lane, such as broken-down cars, emergency vehicles, or street cleaners. These can be especially tricky since moving cars ahead may suddenly switch lanes, revealing the object with little time to react. See Why Do Teslas Hit Stopped Firetrucks and Emergency Vehicles? for more.

January 2018: Tesla Hits Stopped Fire Truck

A Tesla Model S crashed into the back of a firetruck on a Los Angeles highway at 65 miles per hour in 2018. Luckily, no injuries were sustained. A car traveling in front of the Tesla merged into another lane, and the stationary firetruck either went undetected by the Tesla’s monitoring system or the car didn’t slow down fast enough – it’s still unclear which.

Since then, additional Teslas on Autopilot have hit firetrucks in this type of scenario, since emergency vehicles are often parked in an active lane in order to deal with an accident.

The Problem:

This Tesla crash demonstrates the problem of dealing with suddenly-appearing stationary objects at high speeds and why it remains one of the biggest safety challenges for Autopilot. On the one hand, it could be very dangerous if Autopilot suddenly slammed on the brakes at freeway speeds because it falsely thought there was a stationary object ahead (causing rear-end collisions, etc.). On the other hand, there should at least be some sort of warning for drivers. Either way, it’s critical to pay attention to the road when driving at freeway speeds on Autopilot.

Here’s what the Tesla Owner’s manual states:

“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

Other cars with autopilot also have this limitation. For example, the highly-rated Volvo XC90 Pilot Assist manual states: “[Volvo] Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed […] The driver must then intervene and apply the brakes.”

This is because the radar used in most automotive applications is designed to detect moving objects (for dynamic cruise control). Manufacturers like Tesla need to fuse the data from the radar and cameras to reach a very high level of confidence that there is a stationary object protruding into the lane before taking action (or warning).
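
Here is a simplified, hypothetical sketch of that fusion idea: each sensor’s confidence is combined, and the system only brakes or warns when the joint evidence is strong. The fusion rule and thresholds are illustrative assumptions, not any manufacturer’s actual algorithm.

```python
# Simplified radar + camera fusion sketch. Each sensor alone is noisy for
# stationary objects, so action is taken only on strong combined evidence.
# Illustrative assumptions only -- not any vendor's real algorithm.

def fused_confidence(radar_conf: float, camera_conf: float) -> float:
    """Combine independent sensor confidences into one estimate.

    Treats each confidence as an independent probability of a real
    stationary object: P(real) = 1 - P(both sensors wrong).
    """
    return 1.0 - (1.0 - radar_conf) * (1.0 - camera_conf)

def plan_response(radar_conf: float, camera_conf: float) -> str:
    conf = fused_confidence(radar_conf, camera_conf)
    if conf >= 0.97:
        return "brake"   # both sensors agree strongly: act
    if conf >= 0.80:
        return "warn"    # moderate evidence: alert the driver
    return "continue"    # too weak: likely a false positive

# Radar alone at 0.6 would be ignored; corroborated by the camera at 0.9,
# the fused confidence (0.96) is enough to warn the driver.
print(plan_response(0.6, 0.9))  # warn
```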

The Solution:

As of today, there is no sure-fire solution that any manufacturer has put into place to deal with suddenly-appearing stationary objects at freeway speeds. Tesla has removed the less detailed radar system to focus on vision (called Tesla Vision), but it remains to be seen if that’s more accurate and can handle these situations. It’s up to the driver to pay attention until better hardware and software solutions are developed over time.

January 2016: Fatal Crash in China

In January of 2016, Gao Yaning, a 23-year-old Chinese man, was killed when the Tesla Model S he was driving crashed into a street sweeper. An in-car camera captured footage that shows the car merging into the left lane before hitting the road-cleaning truck.

Dashcam Image from Gao Yaning Crash Before Impact

There is no evidence suggesting the brakes were applied prior to impact. Unfortunately, the car sustained extensive damage and was unable to transmit on-board data. For this reason, it is unclear if Autopilot was engaged.

The bottom line: if you’re driving on a highway where cross traffic can suddenly appear, you must stay vigilant at all times.

Lane Incursions from Stationary Objects

February 2017: Concrete Barrier Sideswipe

Video footage of an AP1 Tesla Model S crashing into a highway barrier in 2017 shows the car’s inability at the time to adapt to a change in road conditions. It was determined that the driver did not have his hands on the wheel at the time of the accident, and was not able to take control of the system when needed. The Tesla failed to merge in a construction zone, sideswiping the barrier.

The Problem:

The Tesla Model S is designed to stay between the road lines using cameras and sensors, but because the lane lines continued beyond the barrier, the car’s systems were unable to avoid a collision. The driver was at fault for not remaining alert enough to take control and avoid the road hazard.

The Solution:

New AP2 hardware has improved sensors and, more importantly, machine learning software that continually improves over time, getting better at detecting scenarios like this.

Other Concrete Barrier Near-Misses

While Autopilot continues to improve, concrete barrier lane incursions are still a tricky situation, since lane markings and the temporary concrete barrier provide conflicting signals to the perception system. Here’s another, more recent example:

November 2019

The bottom line is that you need to pay attention when temporary lane barriers shift lanes from their original path.

Autopilot Confusion at Forks and Gores

March 2018: Fatal Freeway Fork Barrier Collision

In March of 2018, Apple engineer Walter Huang was killed in a high-speed collision with an “under-construction” barrier at a left-exit fork on Highway 101 in Mountain View, California, while his Tesla Model X had Autopilot engaged.

2018 Model X Fatal Crash into Barrier on Highway 101 in Mountain View
Barrier at Left-Exit Fork on Highway 101 in Mountain View (repaired)

The Problem:

It’s believed Autopilot became confused by the poorly marked exit and suddenly tried to veer onto the left freeway exit, hitting an “under construction” safety barrier; Mr. Huang died at a hospital later that day. He was apparently playing a game on his phone and not paying attention, nor did he have his hands on the wheel at the time.

The Solution:

Since then, newer versions of the software handle freeway forks much better than before; however, these can be tricky situations, so the driver should be extra attentive.

Even as recently as March 2019, one driver of a Tesla Model 3 using software update 2019.5.15 reported that his vehicle became confused by a left freeway fork under construction and veered towards a barrier.

March 2019 (software 2019.5.15) Autopilot Freeway Fork Confusion

The bottom line is that self-driving systems can become confused by unusual or complex forks or gores, especially on freeways with poor markings or construction. The solution is to pay attention!

Drawing Lessons From Aviation

The NTSB investigates accidents involving self-driving cars as it would a plane crash, looking into the nuances of the accident to determine the cause. Because planes are years ahead of cars in terms of autonomy, the autonomous car market can learn a lot from aviation. Pilots and crew members have caused accidents by failing to take over hands-on operation because they were distracted or overly confident in autopilot features. Because of this, many people advocate skipping “Level 3” autonomy, where a driver must take over when the car’s autonomy can’t handle a situation, in favor of “Level 4” autonomy, which is equipped to handle situations even when a driver ignores the recommendation to take over.

The National Transportation Safety Board’s investigations use on-board diagnostics, bystander accounts and videos, weather patterns, and other environmental data to determine exactly what caused a Tesla vehicle accident. Tesla, Inc. takes this information and applies it to future generations of Autopilot hardware and software to minimize the chance that a similar accident happens in the future.

Tesla has said it’s working toward a fully autonomous vehicle that doesn’t require driver intervention. Elon Musk, Tesla’s CEO, has projected that a Level 5 vehicle could be available as early as this year. Considering driver error is such a dominant factor in previous Tesla accidents, a fully automated car could be safer than the semi-autonomous versions.

Autopilot Safer Than Humans When Used Correctly

Tesla cars are already some of the safest vehicles on the road, and when used as a driver assistance aid, Autopilot is fantastic and can help prevent accidents. However, drivers must still pay attention. These are NOT Level 5 hands-off, eyes-off systems; they are meant to assist drivers, not replace them.

See how the Model X and Autopilot scored one of the highest ratings ever (about 2 minutes in):

Bottom Line – Autopilot Safety Tips

The bottom line is to always pay attention and keep your hands on the wheel. Autopilot is by no means a perfect system and should be treated as a “driver assist” technology rather than “hands-off”. Here are some important safety tips:

  • Know what’s ahead – make sure there are no stationary objects up ahead blocking the roadway.
  • Keep your hands on the wheel – this is extremely important, even if you are looking away temporarily; if you feel Autopilot starting to make a sudden move, you can quickly override it.
  • Don’t use in construction or temporary lanes – as shown above with temporary concrete barriers, this can be confusing to Autopilot.
  • Avoid using with cross traffic – Autopilot should only be used on closed-access highways with dedicated on-ramps and off-ramps. Don’t use it on roads with intersections (see the tractor-trailer accidents above), or if you do, be especially vigilant.
