It seems like every year we hear about a Tesla crashing into a parked emergency vehicle on the freeway, like a firetruck helping another car on the side of the road.
This typically occurs when the emergency vehicle is protruding into or blocking the lane in which the Tesla is traveling, and the driver has the vehicle on Autopilot and is not paying attention. At freeway speeds, Tesla’s Autopilot will not slam on the brakes for a suddenly appearing stationary object.
From the Tesla owner’s manual:
“Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
Not Unique to Tesla
This is not unique to Tesla. Other manufacturers that sell cars with autopilot-style systems also do not slam on the brakes for suddenly appearing objects at high speed, since sudden deceleration could cause cars behind to rear-end the vehicle.
For example, the highly-rated Volvo XC90 Pilot Assist manual states: “[Volvo] Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed […] The driver must then intervene and apply the brakes.”
Why Doesn’t Autopilot React to Stationary Objects?
Well, actually, it does – just not at high speeds. Autopilot will react and respond to stationary objects below 50 mph. The problem arises at freeway speeds, where quickly responding to a stationary object carries greater risk (for example, other cars rear-ending you when you slow down abruptly). False positives are simply too risky for most cars to act on at freeway speeds today.
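To make the trade-off concrete, here is a minimal sketch of a speed-gated response (an illustration of the idea, not Tesla’s actual logic; the 50 mph threshold, confidence value, and function names are assumptions):

```python
# Hypothetical sketch of a speed-gated response to a stationary detection.
# Threshold and structure are illustrative assumptions, not Tesla's code.

STATIONARY_RESPONSE_MAX_MPH = 50  # above this, a false-positive braking event is too risky

def respond_to_stationary_object(ego_speed_mph: float, detection_confidence: float) -> str:
    """Decide how to react to a stationary object detected in the lane."""
    if ego_speed_mph <= STATIONARY_RESPONSE_MAX_MPH:
        # At lower speeds, braking for a possible false positive is low-risk.
        return "brake"
    if detection_confidence > 0.99:
        # At freeway speeds, only an extremely confident detection justifies hard braking.
        return "brake"
    # Otherwise, maintain speed and rely on the (attentive) driver to intervene.
    return "maintain_speed"

print(respond_to_stationary_object(ego_speed_mph=45, detection_confidence=0.7))  # brake
print(respond_to_stationary_object(ego_speed_mph=70, detection_confidence=0.7))  # maintain_speed
```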
Radar and Cameras
The issue comes down to perception of the environment. Most consumer cars with autopilot today use a forward-facing radar to detect large objects such as cars, with cameras providing additional input. The challenge is that automotive radar, built by third-party suppliers such as Bosch and Continental, was designed for applications such as Adaptive Cruise Control (ACC), where the goal is keeping distance from other moving vehicles.
Traditional Automotive Radar
These typical radar units used in most cars today were not meant to detect stationary objects, especially those that are only partially in the lane. That’s why many self-driving companies use LiDAR instead, since it has higher fidelity in detecting object placement. The downside is that LiDAR is very expensive, and many, like Elon Musk, believe software, particularly neural networks, can do just as good a job as LiDAR, if not better (see LiDAR vs Cameras for Self Driving).
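One way to see the limitation: ACC-style radar trackers commonly discard returns whose measured range-rate roughly cancels the car’s own speed, because such returns look the same as harmless roadside clutter (signs, bridges, parked cars). Below is a rough, hypothetical sketch of that filtering, assuming a simple list of radar returns; it is not code from any actual radar supplier.

```python
# Hypothetical illustration of why ACC-style radar ignores stationary objects.
# A stationary object's range-rate is roughly the negative of the ego speed,
# which makes it indistinguishable from roadside clutter, so a simple tracker
# drops those returns. Field names are assumptions for this sketch.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float          # distance to the detected object
    range_rate_mps: float   # closing speed (negative = approaching)

def moving_targets(returns: list[RadarReturn], ego_speed_mps: float,
                   tolerance_mps: float = 1.0) -> list[RadarReturn]:
    """Keep only returns that appear to be moving relative to the road."""
    targets = []
    for r in returns:
        ground_speed = r.range_rate_mps + ego_speed_mps  # ~0 for stationary objects
        if abs(ground_speed) > tolerance_mps:
            targets.append(r)  # a moving vehicle worth tracking for ACC
        # else: treated as stationary clutter and ignored
    return targets

ego = 30.0  # ~67 mph, in m/s
returns = [RadarReturn(80, -5.0),    # lead car, moving slower than us -> kept
           RadarReturn(60, -30.0)]   # stationary fire truck -> dropped by this filter
print(moving_targets(returns, ego))
```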
Even so, Tesla upgraded the radar in its vehicles over time for better range and precision, moving from a Bosch radar unit in AP1 and AP2 cars to Continental in the current vehicles (see AP1 vs AP2 vs AP3).
Sensor Fusion
In order to make sense of what’s happening around the car, manufacturers like Tesla must create their own software to do what’s called “sensor fusion”, that is, combining the data from radar and cameras and then deciding what to do next.

As the neural network becomes better at detecting objects using both radar and cameras, Tesla will be better able to handle these scenarios, but as mentioned earlier, a false positive carries far greater risk at freeway speeds, so the software must be very certain there is a threat before reacting.
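As a hypothetical sketch of that trade-off (not Tesla’s actual fusion code), one could require both sensors to report the object and raise the required combined confidence with speed before declaring a stationary threat:

```python
# Hypothetical sensor-fusion sketch: both radar and camera must report the
# object, and the combined confidence required grows with speed. All names
# and thresholds are illustrative assumptions, not Tesla's implementation.

def fused_stationary_threat(radar_conf: float, camera_conf: float,
                            ego_speed_mph: float) -> bool:
    """Return True only if both sensors agree strongly enough to act."""
    if radar_conf == 0.0 or camera_conf == 0.0:
        return False  # require agreement: one sensor alone is not enough
    combined = radar_conf * camera_conf  # naive fusion of independent confidences
    # Demand more certainty as speed (and the cost of a false positive) rises.
    required = 0.5 if ego_speed_mph < 50 else 0.95
    return combined >= required

print(fused_stationary_threat(radar_conf=0.6, camera_conf=0.9, ego_speed_mph=35))  # True
print(fused_stationary_threat(radar_conf=0.6, camera_conf=0.9, ego_speed_mph=70))  # False
```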
Improved Radar Units?
For some time Tesla was looking to upgrade its forward-facing radar, either by moving to a fifth-generation radar (such as this one from Continental) or, according to rumors, by building its own radar in-house, led by Pete Bannon, who also developed the Full Self-Driving Computer (aka Hardware 3).
By improving the radar’s precision and range, the software would have higher confidence in object detection and have to do less guesswork about whether a stationary object is a threat – something critical at highway speeds.
Tesla Vision – No More Radar
By mid-2021, Tesla’s vision AI algorithms had gotten to the point where they could detect objects with higher accuracy than radar. Since radar is typically a very ‘coarse’ detection tool and can be somewhat ‘trigger happy’ in flagging various objects, vision alone (like human vision) can often do a better job detecting discrete objects.
However, accurate vision detection requires powerful AI to correctly detect and classify objects. In 2021, Tesla announced it had reached the point where it could use cameras and vision alone to run Autopilot, dubbed Tesla Vision.
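To give a flavor of a vision-only approach (a sketch under assumptions, not Tesla Vision itself), a camera pipeline might smooth the detector’s per-frame confidence for a stationary vehicle over several frames, so a single noisy frame does not trigger or suppress a response:

```python
# Hypothetical vision-only sketch: average a detector's per-frame confidence
# over a short window before confirming a stationary vehicle as a target.
# Window size and threshold are illustrative assumptions.

from collections import deque

class StationaryVehicleTracker:
    def __init__(self, window: int = 10, threshold: float = 0.9):
        self.scores = deque(maxlen=window)
        self.threshold = threshold

    def update(self, frame_confidence: float) -> bool:
        """Feed one frame's detection confidence; return True once confirmed."""
        self.scores.append(frame_confidence)
        smoothed = sum(self.scores) / len(self.scores)
        # Require a full window of frames and a high smoothed confidence.
        return len(self.scores) == self.scores.maxlen and smoothed >= self.threshold

tracker = StationaryVehicleTracker()
for conf in [0.95, 0.97, 0.4, 0.96, 0.98, 0.97, 0.99, 0.95, 0.96, 0.97]:
    confirmed = tracker.update(conf)
print("confirmed:", confirmed)  # True: one noisy frame does not sink the average
```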
In the case of parked vehicles on the side of the highway, Andrej Karpathy, head of AI at Tesla, demonstrated in a 2021 talk how Tesla can now detect stationary objects on the side of the road better with vision alone than with vision-and-radar fusion.

Bottom Line
Using an autopilot system requires its own kind of vigilance, since these systems are not yet fully autonomous, self-driving systems. They are Level 2 systems (what do SAE Self-Driving Levels mean?). Using these systems incorrectly can lead to an accident (see Tesla Autopilot Crashes and Causes).
That said, these stationary-object scenarios are something fully autonomous vehicles should readily handle, and if Tesla intends to deliver autonomous Full Self-Driving as it announced during Tesla Autonomy Day in 2019, then we should see Tesla vehicles detect and handle these situations in 2021. Time will tell!
Learn More