Updated March 2020
Tesla’s Autopilot and Full Self-Driving suite are some of the most advanced driver assistance technologies available to consumers today. When used correctly, they can help reduce accidents (see video at the end of this article). However, like any technology, they can lead to accidents if abused. This article outlines some of the common causes, highlights recent accidents, and notes which bucket each falls into.
Recent Notable Autopilot Accidents:
- February 2019 – NTSB confirms that the driver in the fatal 2018 accident in Mountain View was not paying attention and was likely playing a game on his cell phone. In addition, the driver was aware that Autopilot, at the time, needed corrections at that spot. Tesla vehicles today do not appear to have an issue with this offramp.
- January 2019 – (unconfirmed) A black Model S in Gardena, CA left the freeway at a high rate of speed and slammed into a Honda Civic at an intersection, causing fatalities. It’s not clear whether Autopilot was involved.
- December 2019 – (unconfirmed) A Model 3 rear-ended a fire truck assisting another vehicle on I-70 in Indiana; one person later died at a local hospital. While it’s unclear if the vehicle was on Autopilot at the time, the owner said he regularly uses Autopilot.
- This is an example of Scenario 1 below. At high speeds, Autopilot will not suddenly brake for stationary objects since doing so may cause more risk (especially if there is a false positive).
Tesla Autopilot Background
Tesla entered the electric car market in 2006 with the Roadster, but it wasn’t until 2014 that Tesla introduced its original Autopilot 1.0 (AP1) hardware on the Model S. Tesla’s Autopilot allows a driver to engage autonomous technology that takes over the steering, acceleration, and braking of the car. Since then, Autopilot has advanced rapidly, moving to updated AP2 hardware (see AP1 vs AP2) and taking a major step-change with the Version 9 software update in late 2018, which introduced Navigate-on-Autopilot.
Tesla’s Autopilot is a “driver assist” function and the driver is instructed to remain alert with hands on the steering wheel when Autopilot is engaged. This puts Tesla’s Autopilot at Level 2 on the autonomous driving scale, as defined by SAE International (see this article about self-driving levels and what they mean).
It is by no means a hands-off and eyes-off technology; it requires that the driver remain attentive at all times, but it does allow the car to take over most of the controls so there is less cognitive load on the driver during the trip.
Think of Tesla Autopilot as a best-in-class Distance Cruise Control and Lane Centering System, meant primarily for freeway use. However, drivers sometimes push the limits and stop paying attention which can, in certain scenarios like the ones below, cause problems.
Tesla Pushes the Envelope
Tesla is known for being an extremely innovative company that has disrupted the auto industry by creating electric cars that are not only appealing but also include advanced technology far beyond what most other cars offer (such as built-in Autopilot hardware, built-in cellular, continuous updates, etc.). It’s a car that keeps improving with software updates, even after you buy it.
One of Tesla’s hallmark features is advanced self-driving technology, known as Autopilot or Full Self-Driving. When originally introduced in 2014, known as Autopilot 1.0 (AP1), they partnered with a company called Mobileye (now owned by Intel) to provide the self-driving technology. Tesla felt Mobileye wasn’t moving fast enough and decided to roll out its own system known as Autopilot 2.0 (AP2). For more see our AP1 vs. AP2 article.
More recently Tesla has released Hardware 3 (AP3) and the Full Self-Driving option that will use this new, powerful custom-built AI hardware to even further accelerate the self-driving capabilities of Tesla vehicles equipped with this option. See the future of Autopilot during Tesla’s Autonomy Day presentation.
Unlike other auto manufacturers, Tesla rolls out software updates continuously over-the-air (like on a mobile phone), allowing consumers to have immediate access to new, advanced Autopilot updates.
While this helps advance the technology, it also means that Tesla customers must understand how to use the technology safely. Tesla clearly states that the consumer is ultimately responsible for using the technology and must remain attentive at all times. This is unlike other auto manufacturers, such as Volvo or Cadillac (see Cars with Autopilot), that have far more restrictive self-driving systems but take responsibility for accidents while those systems are active.
Typical Tesla Autopilot Crash Causes
As with any advanced driver assistance technology, there are benefits, but also dangers if used improperly. Even old-fashioned cruise control can be dangerous if used improperly. With these more advanced systems, it’s even more important that drivers pay attention, be in control of the vehicle, and not be lulled into a sense of complacency.
Most common causes for Tesla Autopilot crashes fall into these buckets:
- Hitting Stationary Objects at High Speed – When you engage Traffic Aware Cruise Control (TACC) with Autosteer at freeway speeds, Automatic Emergency Braking is reduced to avoid false-positive sudden braking (which could cause more accidents). This has been one of the biggest sources of accidents, as drivers often become too comfortable and stop paying attention at freeway speeds.
- Tesla’s owner’s manual states: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
- NOTE: This remains the biggest current danger of using Autopilot – the sudden appearance of stationary objects in the car’s path at freeway speeds. Whether it’s a broken-down vehicle suddenly appearing in the path ahead or a vehicle crossing a divided highway, these are examples of why drivers must pay attention.
- Lane Incursions from Stationary Objects – Related to the issue above is the incursion or protrusion of stationary objects into the lane path. Rather than fully blocking the lane path, the object may begin to extend into the lane just enough to narrow it, such as a temporary concrete barrier.
- Since Autopilot continually predicts the lane path ahead, a sudden narrowing of the lane, especially by a small amount (as is often seen with poorly placed concrete barriers), may cause Autopilot to hit or bounce off those objects.
- Autopilot confusion at Forks and Gores – Before the software Version 9 update, Tesla’s Autopilot would often have issues with left-hand freeway exits or forks, often wanting to veer into those exits and potentially hitting the median barrier.
- With the Version 9 update, that has greatly improved and no longer pulls left on freeway left-hand exits as before.
- That said, when the freeway splits off to the right, vigilance is required in order to take over at a moment’s notice.
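The trade-off described in the first bullet (suppressing automatic braking at speed to avoid false positives) can be sketched as a toy model. This is purely illustrative and is not Tesla’s actual algorithm; every function name and number below is hypothetical:

```python
# Toy model of the false-positive trade-off, NOT Tesla's implementation.
# All thresholds and constants are hypothetical, for illustration only.

def brake_confidence_threshold(speed_mph: float) -> float:
    """Minimum detection confidence required before hard braking.

    At higher speeds, a false-positive hard brake is itself dangerous
    (it can cause rear-end collisions), so a system of this kind demands
    more certainty that a stationary object is really in the lane.
    """
    base = 0.50       # confidence needed at low city speeds (hypothetical)
    per_mph = 0.005   # extra certainty demanded per mph over 25 (hypothetical)
    return min(0.99, base + per_mph * max(0.0, speed_mph - 25))

def should_brake(detection_confidence: float, speed_mph: float) -> bool:
    """Brake only when confidence clears the speed-dependent threshold."""
    return detection_confidence >= brake_confidence_threshold(speed_mph)

# The same 0.6-confidence detection triggers braking at city speed...
print(should_brake(0.6, 20))   # True
# ...but is ignored at freeway speed, mirroring the manual's warning that
# the car "may not brake/decelerate for stationary vehicles" over 50 mph.
print(should_brake(0.6, 70))   # False
```

The key design point is that the threshold rises with speed, which is exactly why the sudden appearance of a stationary object at freeway speed is the scenario a driver must cover.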
Typical Crash Outcomes
Luckily, Tesla vehicles are some of the safest vehicles on the road, with extremely low death and injury rates. In fact, the Model 3 is the safest vehicle ever tested by NHTSA. Despite early concerns, lithium-ion battery fires in electric cars have not proven to be any more dangerous than gasoline fires in regular cars (they may turn out to be even safer). The three notable Autopilot deaths were, in fact, high-speed collisions with stationary objects that would not have been survivable in any car. That is why it’s critically important to always pay attention at highway speeds in any car, not only Teslas.
Notable Tesla Autopilot Crashes
Hitting Stationary Objects at High Speeds
The most infamous fatal crashes have been those involving tractor-trailer trucks crossing divided highways. There have been two such incidents, both in Florida and incredibly similar, three years apart. We’ll cover them both here.
May 2016: First Fatal Tractor-Trailer Crash with AP1
The first confirmed Tesla Autopilot fatality occurred near Gainesville, Florida on May 7, 2016, when 40-year-old Joshua Brown, driving his AP1 Model S on Autopilot, struck a tractor-trailer crossing a divided highway. The Model S hit the tractor-trailer straight on without slowing down and went under the trailer, shearing off the roof of the Model S before coming to a stop hundreds of feet later.
After an extensive investigation by the National Highway Traffic Safety Administration (NHTSA) and Tesla, it was determined that Mr. Brown was not paying attention, as the Autopilot system had given several warnings beforehand for the driver to put his hands back on the wheel.
The NTSB also reported that Tesla’s software was missing key safeguards that could have prevented the crash by limiting the driver’s use of the Autopilot feature during conditions in which it was not safe to use.
This scenario was difficult for Autopilot to handle for a couple of reasons. First, the cameras had a difficult time distinguishing the white tractor-trailer against the bright sky. Second, the AP1 system, at the time, was tuned to ignore more of the radar inputs at high speed in order to avoid false positives.
After the accident, Tesla reportedly adjusted the sensitivity of the system to better detect these scenarios, stating, “The changes include refinements in Autopilot’s radar that improve its ability to spot and identify obstacles down the road and additional warnings to force drivers to keep their hands on the steering wheel and eyes on the road while the system is active.” However, this was with AP1 at the time, and they’ve since switched to their own system (what is AP1 vs. AP2?).
March 2019: Second Fatal Tractor-Trailer Crash with AP2
After the first tractor-trailer crash in 2016, most people assumed that Tesla would have solved that particular edge case, particularly since several years had gone by. However, in March of 2019, an almost identical crash occurred with a Model 3 using AP2 hardware, shearing off the roof of the Model 3 as it went under the tractor-trailer. According to the NTSB, the system did not detect the driver’s hands on the wheel in the seconds prior to impact: “Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.”
That raises the question: why didn’t Autopilot detect the tractor-trailer when this was a known issue? Most likely it’s because AP2 was being used instead of AP1 (see AP1 vs AP2). AP2 uses Tesla’s own in-house AI / neural network, built from scratch, to detect and react to objects. However, given that this same scenario had occurred before, one would think Tesla had already trained the neural network to handle these situations. Apparently not.
It has also again raised the question of whether more accurate perception technologies such as LiDAR (see LiDAR vs Cameras) would help, or whether HD maps could let the system know that there should be no large objects in the path at these locations (aside from a known bridge or similar structure).
Hopefully Tesla will move quickly to handle this edge case and prove to the world, once and for all, that Autopilot can detect these objects and react.
Stationary or Slow-Moving Cars in Lane
Another common issue is stationary or slow-moving objects in the lane, such as broken-down cars, emergency vehicles, or street cleaners. These can be especially tricky since moving cars ahead may suddenly switch lanes, revealing the object or car with little time to react. See Why Do Teslas Hit Stopped Firetrucks and Emergency Vehicles? for more.
January 2018: Tesla Hits Stopped Fire Truck
A Tesla Model S crashed into the back of a firetruck on a Los Angeles highway at 65 miles per hour in 2018. Luckily, no injuries were sustained. A car traveling in front of the Tesla merged into another lane, and the Tesla either failed to detect the stationary firetruck or didn’t slow down fast enough (it’s still unclear which).
Since then, additional Teslas on Autopilot have hit firetrucks in this type of scenario, since emergency vehicles are often parked in an active lane in order to deal with an accident.
This Tesla crash demonstrates the problem of dealing with suddenly-appearing stationary objects at high speeds and how it remains one of the biggest safety challenges for Autopilot. On the one hand, it could be very dangerous if Autopilot suddenly slammed on the brakes at freeway speeds if it falsely thinks there is a stationary object ahead of it (causing rear-end collisions, etc.). On the other hand, there should be some sort of warning for drivers, at least. Either way, it’s critical to pay attention to the road when driving at freeway speeds on Autopilot.
Here’s what the Tesla Owner’s manual states:
“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
Other cars with autopilot also have this limitation. For example, the highly-rated Volvo XC90 Pilot Assist manual states: “[Volvo] Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed […] The driver must then intervene and apply the brakes.”
This is because the radar used in most automotive applications is designed to detect moving objects (for dynamic cruise control). Manufacturers like Tesla need to fuse the data from the radar and cameras to establish a very high level of confidence that there is a stationary object protruding into the lane before taking action (or warning the driver).
As of today, there is no sure-fire solution that any manufacturer has put into place to deal with suddenly-appearing stationary objects at freeway speeds. It’s up to the driver to pay attention until better hardware and software solutions are developed over time.
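As a rough illustration of what such radar-plus-camera fusion might look like, here is a hypothetical decision rule that demands corroboration from both sensors before hard braking. The function, thresholds, and fusion rule are invented for illustration and are not Tesla’s implementation:

```python
# Hypothetical sketch of sensor fusion for stationary-object detection.
# NOT a real Autopilot algorithm; names and thresholds are illustrative.

def decide(radar_conf: float, camera_conf: float) -> str:
    """Decide an action from two independent sensor confidences (0..1).

    Radar alone often flags stationary roadside clutter (false positives),
    and a camera alone can miss, say, a white trailer against a bright sky,
    so hard braking requires strong agreement from BOTH sensors.
    """
    fused = radar_conf * camera_conf   # crude "both must agree" fusion
    if fused >= 0.8:
        return "brake"
    if max(radar_conf, camera_conf) >= 0.9:
        return "warn driver"           # one confident sensor: alert, don't brake
    return "no action"

print(decide(0.95, 0.95))  # brake       (both sensors agree)
print(decide(0.95, 0.20))  # warn driver (radar alone is not trusted to brake)
print(decide(0.30, 0.40))  # no action
```

Requiring agreement keeps false-positive hard braking rare, but it also means a genuinely hard-to-see object (the white-trailer case above) may only produce a warning, or nothing, which is why the driver still has to watch the road.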
January 2016: Fatal Crash in China
In January of 2016, Gao Yaning, a 23-year-old Chinese man, was killed when the Tesla Model S he was driving crashed into a street sweeper. An in-car camera captured footage that shows the car merging into the left lane before hitting the road-cleaning truck.
There is no evidence suggesting the brakes were applied prior to impact. Unfortunately, the car sustained extensive damage and was unable to transmit on-board data. For this reason, it is unclear if Autopilot was engaged.
The bottom line: if you’re driving on a highway where cross traffic may suddenly appear, you must stay vigilant at all times.
Lane Incursions from Stationary Objects
February 2017: Concrete Barrier Sideswipe
Video footage of an AP1 Tesla Model S crashing into a highway barrier in 2017 shows the car’s inability at the time to adapt to a change in road conditions. It was determined that the driver did not have his hands on the wheel at the time of the accident, and was not able to take control of the system when needed. The Tesla failed to merge in a construction zone, sideswiping the barrier.
The Tesla Model S is designed to stay between the road lines using cameras and sensors, but because the lines continued beyond the barrier, the car’s systems were unable to avoid a collision. The driver was at fault for not remaining alert enough to take control of the system to avoid the road hazard.
New AP2 hardware has improved sensors, and more importantly, the machine learning software continually improves over time, better detecting scenarios like this.
Other Concrete Barrier Near-Misses
While Autopilot continues to improve, concrete barrier lane incursions are still a tricky situation, since lane markings and the temporary concrete barrier provide conflicting signals to the perception system. Here’s another, more recent example:
The bottom line is that you need to pay attention when there are temporary lane barriers that adjust lanes from the original path.
Autopilot confusion at Forks and Gores
March 2018: Fatal Freeway Fork Barrier Collision
In March of 2018, Apple engineer Walter Huang was killed in a high-speed collision with an “under-construction” barrier at a left-exit freeway fork on US-101 in Mountain View, California, while his Tesla Model X had Autopilot engaged.
It’s believed Autopilot became confused by the poorly marked exit and suddenly veered toward the left freeway exit, hitting an “under construction” safety barrier; the collision led to Mr. Huang’s death at a hospital later that day. He was apparently playing a game on his phone and not paying attention, nor did he have his hands on the wheel at the time.
Since then, newer versions of the software handle freeway forks much better than before; however, these can be tricky situations, so the driver should be extra attentive.
Even as recently as March 2019, one driver with a Tesla Model 3 using software update 2019.5.15 reported that his vehicle became confused with a left freeway-fork under construction and veered towards a barrier.
The bottom line is that self-driving systems can become confused by unusual or complex forks or gores, especially on freeways with poor markings or construction. The solution is to pay attention!
Drawing Lessons From Aviation
The NTSB investigates accidents from self-driving cars as it would a plane crash, looking into the nuances of the accident to determine the cause. Because planes are years ahead of cars in terms of autonomy, the autonomous car market can learn a lot from aviation. Pilots and crew members have caused accidents by failing to return to hands-on operation because they were distracted or overly confident in autopilot features. Because of this, many people advocate skipping “Level 3” autonomy, where a driver must take over when the car’s autonomy can’t handle a situation, in favor of “Level 4” autonomy, which is equipped to handle situations even when a driver ignores the request to take over.
The National Transportation Safety Board’s investigations use on-board diagnostics, bystander accounts and videos, weather patterns, and other environmental data to determine exactly what caused a Tesla vehicle accident. Tesla, Inc. takes this information and applies it to future generations of Autopilot hardware to minimize the chance that a similar accident happens in the future.
Tesla has said it’s working toward a fully autonomous vehicle that doesn’t require driver intervention. Elon Musk, Tesla, Inc.’s CEO, has projected that a Level 5 vehicle could be available as early as this year. Considering that driver error is such a dominant factor in previous Tesla accidents, a fully automated car could be safer than the semi-autonomous versions.
Autopilot Safer Than Humans When Used Correctly
Tesla cars are already some of the safest vehicles on the road, and when used as a driver assistance aid, Autopilot is fantastic and can help prevent accidents. However, drivers must still pay attention. These are NOT Level 5 hands-off and eyes-off systems; they are meant to assist drivers, not replace them.
See how the Model X and Autopilot scored one of the highest ratings ever (about 2 minutes in):
Bottom Line – Autopilot Safety Tips
The bottom line is to always pay attention and keep your hands on the wheel. Autopilot is by no means a perfect system and should be treated as a “driver assist” technology rather than “hands-off”. Here are some important safety tips:
- Know what’s ahead – make sure there are no stationary objects up ahead blocking the roadway.
- Keep your hands on the wheel – this is extremely important, even if you are looking away temporarily since, if you feel Autopilot wanting to make a sudden move, you can quickly override it.
- Don’t use in construction or temporary lanes – as shown above with temporary concrete barriers, this can be confusing to Autopilot.
- Avoid using with cross traffic – Autopilot should only be used on closed-access highways with dedicated on-ramps and off-ramps. Don’t use it on roads with intersections (see the tractor-trailer accidents above), or if you do, be especially vigilant.