Tesla currently offers three Autopilot tiers: standard Autopilot, Enhanced Autopilot, and Full Self-Driving (FSD). Autopilot provides basic driver assistance functions, including traffic-aware cruise control and lane centering. Enhanced Autopilot, an optional paid upgrade, adds automated lane changes, Navigate on Autopilot, Summon, and more. Finally, Full Self-Driving, the most expensive package, adds stop sign and traffic light control plus navigation on city streets. While Tesla calls these features “Autopilot” and “Full Self-Driving,” they are not fully autonomous and require that you pay attention at all times.
You can read more about the differences in detail in our Autopilot vs Full Self-Driving review.
Tesla Autonomous Driving Progress
In this article, we chronicle the progress of Tesla’s Full Self-Driving capabilities toward full autonomy. Elon Musk has pushed his team to make rapid progress in the area and even claimed during Autonomy Investor Day in April 2019 that autonomous Full Self-Driving would be ‘feature complete’ by the end of that year. While that didn’t happen, Tesla did release a Full Self-Driving ‘sneak preview’ that showcased how the Full Self-Driving Computer (aka Hardware 3) can recognize and visualize road markings.
That said, Tesla has consistently pushed the envelope in consumer autonomous driving technology while other auto manufacturers have struggled to determine the best path forward, whom to partner with for technology, or even whether to use LiDAR or cameras for perception. Yes, there are other cars with autopilot features, but no one to date has matched Tesla’s progress.
Autonomy Day Full Self-Driving Demo
Back in 2019, Tesla treated the media and investors to the first glimpse of a fully autonomous drive in a Model 3 running pre-release development code. The vehicle performed a loop around Tesla’s Palo Alto headquarters, covering both freeway and city routes, while the driver kept his hands fully off the wheel. Tesla even released a video (see below).
Autonomy Day Demo vs Reality
While the 2019 Autonomy Day self-driving demo is impressive, we decided to drive the same route using the Full Self-Driving ‘sneak preview’ Tesla released at the end of December 2019 (software update 2019.40.50.1) to see how far the public release has come compared to the demo.
Here are the results, with a summary below:
Full Self-Driving Autonomous Driving Capabilities
Detailed below are some key elements needed for autonomous full self-driving and their current status in Tesla’s Full Self-Driving as available today.
FSD is NOT Autonomous nor Driverless
First off, Tesla’s current driver assistance features, whether Autopilot, Enhanced Autopilot, or Full Self-Driving, are not actually driverless. They require that the driver pay attention at all times and keep their hands on the wheel. This is especially important since the system may occasionally make sudden, unanticipated lane adjustments, so hands on the wheel to detect and correct these movements are critical.
Autosteer – Excellent but Not Autonomous
First off, the good news. On the freeway, and even on city streets, Tesla’s dynamic cruise control, lane centering (Autosteer), traffic control handling, and automated lane changes are far beyond what other cars with autopilot offer. With the new visualizations and Autopilot improvements released in 2019, closed-access freeway driving is very close to the Tesla Autonomy Day driving demo, especially if you have the Full Self-Driving package with Hardware 3. That said, there are still too many rough edges where a human needs to pay attention, including lane changes, merges, construction zones, and stationary objects suddenly appearing in the road (see Autopilot Crashes and Causes).
The Navigate on Autopilot feature included with the Full Self-Driving option allows drivers to navigate from one point to another automatically, under driver supervision. It will navigate freeway interchanges and automatically exit the freeway as well.
City Driving – Tesla’s Big Challenge
As Elon stated during Autonomy Day, autonomous city driving is the next frontier Tesla is focused on to achieve true Full Self-Driving. City driving is especially challenging given the variety of roads, signals, traffic, and pedestrians occupying the same space in an almost infinite number of situations. We break down various components of city driving below.
Perceiving the Environment
Critical to autonomous driving is environmental perception and understanding various objects. As shown here, there are a multitude of objects the Tesla neural network must perceive and understand, including both stationary and moving objects.
Drivable Space Perception
The first step in city driving has been to confidently understand lane and curb markings so the vehicle knows what space is drivable. Tesla does this fairly well, far better than other cars with autopilot.
Example of Drivable Space visualization (from Tesla’s demo vehicle on right):
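To make the idea concrete, here is a minimal sketch of how a drivable-space boundary might be extracted from a per-pixel segmentation mask. This is purely illustrative and not Tesla’s implementation; the `free_space_boundary` function and the mask format are our own assumptions.

```python
import numpy as np

def free_space_boundary(drivable_mask: np.ndarray) -> np.ndarray:
    """For each image column, find how far the contiguous drivable
    surface extends upward from the bottom of the frame (the area
    closest to the car).

    drivable_mask: H x W boolean array, True where a hypothetical
    segmentation network labeled the pixel as drivable road.
    Returns one row index per column: rows at or below that index
    form contiguous drivable space (0 means drivable to the top).
    """
    h, w = drivable_mask.shape
    boundary = np.zeros(w, dtype=int)
    for col in range(w):
        # Scan from the bottom row (nearest the car) upward and stop
        # at the first non-drivable pixel.
        for row in range(h - 1, -1, -1):
            if not drivable_mask[row, col]:
                boundary[col] = row + 1
                break
    return boundary
```

The boundary traced across all columns is essentially the jagged drivable-space outline shown in Tesla’s visualizations.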
Perceiving Road Markings, Street Signs and Signals
Critical to autonomous driving in an urban environment is perceiving and reacting to road markings, street signs, and signals. With the Full Self-Driving ‘Sneak Preview’ we got a first glimpse into how Hardware 3 (the Full Self-Driving Computer) uses the neural network to classify and display the road markings and signs it recognizes.
Now, with the Traffic Light and Stop Sign Control feature released in mid-2020, Tesla vehicles with the optional Full Self-Driving package can respond to traffic controls on city streets such as traffic lights and stop signs.
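The control logic behind stopping for a detected traffic light or stop sign can be sketched with basic kinematics. This is a simplified illustration rather than Tesla’s actual implementation; the function names and the comfort-deceleration threshold are assumptions.

```python
def required_deceleration(speed_mps: float, distance_m: float) -> float:
    """Constant deceleration (m/s^2) needed to stop within distance_m,
    from the kinematic relation v^2 = 2 * a * d."""
    if distance_m <= 0:
        raise ValueError("stop line already reached or passed")
    return speed_mps ** 2 / (2 * distance_m)

def should_begin_braking(speed_mps: float, distance_m: float,
                         comfort_decel: float = 2.0) -> bool:
    """Begin braking for a detected red light or stop sign once waiting
    any longer would exceed a comfortable deceleration rate."""
    return required_deceleration(speed_mps, distance_m) >= comfort_decel
```

For example, at 20 m/s (roughly 45 mph) with a stop line 120 m ahead, the required deceleration is about 1.67 m/s², below the assumed 2.0 m/s² comfort limit, so braking can wait; at 50 m it rises to 4.0 m/s² and braking must begin.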
Perceiving Other Moving Objects
Another crucial step in city driving is the perception of surrounding objects and correctly categorizing them, including other vehicles, cyclists, pedestrians, etc. In addition to identifying those objects the system must also predict their path in order to react appropriately, requiring a notion of how those objects are moving in time.
Currently the Tesla system seems to recognize objects fairly consistently, including cyclists, pedestrians, and so on. However, it does not yet react to them and expects the driver to take control most of the time, especially in city driving. Additionally, Tesla Autopilot will not take action if there is a stationary object in the lane at highway speeds (see Autopilot Crashes and Causes).
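Path prediction in its simplest form can be illustrated with constant-velocity extrapolation of a tracked object. Real systems use far richer motion and behavior models, so treat this only as a sketch of the underlying idea; the function and its parameters are our own.

```python
def predict_path(position, velocity, horizon_s=3.0, dt=0.5):
    """Extrapolate a tracked object's future 2D positions assuming it
    keeps its current velocity (a constant-velocity motion model).

    position: (x, y) in meters; velocity: (vx, vy) in m/s.
    Returns a list of (x, y) points at dt intervals out to horizon_s.
    """
    x, y = position
    vx, vy = velocity
    steps = int(horizon_s / dt)
    return [(x + vx * dt * i, y + vy * dt * i) for i in range(1, steps + 1)]
```

A planner can then check whether any predicted point of a pedestrian or cyclist intersects the vehicle’s own intended path and react accordingly.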
Here’s a great example of an autonomous driving startup, called Zoox, and how they recognize objects and pedestrians:
Navigating Intersections
Navigating an intersection is hard enough for humans, and harder still for today’s autopilot and self-driving systems. While the Tesla Autonomy Day demo seemed to handle only basic intersections, current Full Self-Driving has improved quite a bit and can even handle some roundabouts.
Below you can see how the Tesla Autonomy Day vehicle sees the intersection via drivable space.
Taking Sharp Turns and Roundabouts
Handling turns, especially sharp turns and turns that tighten as they progress, can be challenging for autopilot systems, which must anticipate the turn and slow the car down appropriately. Most cars with autopilot do not handle this situation well. Tesla’s Autopilot does an admirable job, far better than most, but it is still not perfect and requires driver attention.
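The “slow down for the turn” problem reduces to limiting lateral acceleration: on a curve of radius r, lateral acceleration is v²/r, so the maximum comfortable speed is v = sqrt(a_lat · r). Here is a minimal sketch of that math (our own illustration, with an assumed comfort limit, not Tesla’s controller):

```python
import math

def safe_speed(radius_m: float, max_lateral_accel: float = 2.5) -> float:
    """Maximum speed (m/s) that keeps lateral acceleration v^2 / r
    at or below max_lateral_accel on a curve of the given radius."""
    return math.sqrt(max_lateral_accel * radius_m)

def target_speed_for_turn(upcoming_radii_m, max_lateral_accel: float = 2.5) -> float:
    """For a turn that tightens, the binding constraint is the smallest
    upcoming radius, so slow to the lowest safe speed ahead of time."""
    return min(safe_speed(r, max_lateral_accel) for r in upcoming_radii_m)
```

This is why tightening turns are hard: for a turn that shrinks from a 150 m to a 90 m radius, the car must already be down to sqrt(2.5 × 90) = 15 m/s (about 34 mph) before reaching the tightest section.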
Here’s a particularly tricky turn that was used during Autonomy Day. This turn is taken from highway 280 North in California onto Sand Hill Road West (current Autopilot versions often disengage on the tail end of this turn).
Parking Lot Navigation
Another very challenging situation is parking lots. These areas are often congested with cars and pedestrians in addition to having very tight maneuvering spaces.
Currently Tesla offers the “Smart Summon” feature of the Full Self-Driving package that allows Tesla vehicles to drive themselves to the owner who is standing nearby.
Tesla also showcased parking lot navigation way back in 2016 with its first autonomous driving video, the famous “Paint it Black” video shown below (see more Tesla autonomous demonstration videos here):
It remains to be seen how well parking can be integrated into the autonomous experience by bridging the current Smart Summon feature with Navigate on Autopilot.
Tesla and Elon Musk have placed a big target on delivering autonomous Full Self-Driving and even a Tesla RoboTaxi service. By most accounts, that’s extremely ambitious and it doesn’t seem close-at-hand… yet.
That said, the latest Full Self-Driving Beta is showing some impressive improvements. With Hardware 4 having launched in 2023, we should continue to see a steady pace of improvements to the Autopilot and Full Self-Driving experience. However, it’s important to remember that truly autonomous driving is likely still years away.