Major Autopilot Improvements in Version 9 Update

Tesla has released a major update to its Autopilot driver-assistance system with the “Version 9” (V9) software in early October 2018, including the hotly anticipated “Navigate on Autopilot”.  The biggest improvements apply to vehicles with AP2 hardware (those with cameras all around the car).  The new update allows the system to use all of the Autopilot cameras and greatly expands the processing power of the neural network system.

It’s a huge update and essentially means you’re driving an AI / “machine learning” system on wheels.  It’s pretty incredible and unlike anything else available for consumers on the road today.

All The Cameras

Rather than relying solely on the front-facing cameras, Version 9 taps into all of the Autopilot cameras around the vehicle (the rear camera is not currently used).  The cameras let the vehicle keep tabs on objects on all sides while moving forward, including stationary objects such as light poles.

The full view also gives cars better recognition of potential hazards near the road. The cameras can differentiate between a hedge, a human, and an animal. The software recognizes stationary vehicles in parking lots on the same side of the road as the moving Tesla. Instead of lumping everything in with cars or semi-trucks, the cameras recognize and differentiate motorcycles, light-duty trucks, and heavy-duty trucks. The cameras also spot vehicles or objects in blind spots to help prevent a Tesla from turning into a lane that’s already occupied by another car.

Here’s footage showing what the cameras actually see:

The ability to use all cameras comes courtesy of an improved neural network with greater processing power that can digest video regardless of the camera view or camera angle. Technically called “camera agnostic,” the system processes images from the cameras no matter how distorted, elongated, or grainy the images appear. For example, if the front-facing camera has a fisheye lens while a side-facing camera has a normal lens, the new Autopilot still has a good idea of what an object represents, regardless of whether a pedestrian appears upright or tilted in a camera view. This also lets Tesla upgrade camera hardware more easily without necessarily making large changes to the software.
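Tesla hasn’t published the details of this architecture, but the basic idea behind a “camera agnostic” pipeline can be sketched in a few lines: resample every feed into one canonical input shape, so a single network code path handles all cameras no matter their lens or resolution. Everything below (shapes, function names, the stub network) is a hypothetical illustration, not Tesla’s actual code.

```python
import numpy as np

# Hypothetical "camera agnostic" sketch: frames from cameras with different
# lenses and resolutions are resampled into one canonical input shape, so a
# single network can process all of them.  Illustrative only.

CANONICAL_SHAPE = (240, 320)  # assumed network input size (height, width)

def to_canonical(frame: np.ndarray) -> np.ndarray:
    """Nearest-neighbor resample any camera frame to the canonical shape."""
    h, w = frame.shape[:2]
    rows = np.arange(CANONICAL_SHAPE[0]) * h // CANONICAL_SHAPE[0]
    cols = np.arange(CANONICAL_SHAPE[1]) * w // CANONICAL_SHAPE[1]
    return frame[rows][:, cols]

def detect_objects(frame: np.ndarray) -> list:
    """Stand-in for the neural network: the same code path for every camera."""
    assert frame.shape[:2] == CANONICAL_SHAPE
    return []  # a real network would return detected objects here

# Simulated feeds: a wide fisheye front camera and a narrower side camera.
fisheye_front = np.zeros((960, 1280, 3), dtype=np.uint8)
narrow_side = np.zeros((480, 640, 3), dtype=np.uint8)

for feed in (fisheye_front, narrow_side):
    detections = detect_objects(to_canonical(feed))
```

The point of the sketch is the design choice: because every feed is normalized before it reaches the network, swapping in a different camera only changes the resampling step, not the network itself.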

In addition, thanks to the use of all the cameras, the driver information display now shifts the vehicle a bit toward the center of the screen so it can show vehicles behind the car.  It also detects lanes better and distinguishes other vehicle types, like this:

New Driver Information Display on Version 9

Note that once you receive the update, the car must first calibrate all the cameras.  Some owners have reported that right after the update the car ping-pongs a bit within its lane, but the behavior settles once calibration completes, typically after about 30 to 60 miles.

Smarter Neural Network

To handle all this new data, the neural network software had to become much more powerful.  It not only has to process many more camera feeds, but it also has to interpret the objects in those images and then decide how to handle the situation.  Elon Musk has stated that V9 performs roughly 400% more operations per second than V8.1.

As such, the onboard computers in current models may be nearing processing capacity. Tesla has stated that future models will use Tesla’s own proprietary AI chips (called Hardware 3 or AP3), which are far more powerful and efficient than the current NVIDIA GPU solution, which is inherently less efficient since it has to shuttle data between the CPU and GPU.  Tesla has said that owners who have ordered “Full Self Driving” will receive a free hardware upgrade.

Finally, all that data (at least a good portion of it) is anonymously sent back to Tesla headquarters so engineers can train the neural network.  As such, Tesla is able to continually improve how well Autopilot handles different conditions as it receives data back from the entire fleet.  Training data for neural networks is one of the most important factors that determine how quickly a machine can learn.  Given that Tesla now has far more cars on the road, it can utilize all that data to train the network even faster than before.

Navigate on Autopilot

One of the most anticipated new features is Navigate on Autopilot, which helps a driver, under the driver’s supervision, with decision-making on highways. The system suggests when to take an on- or off-ramp, when to change lanes, and how to navigate highway interchanges. The overall goal of Navigate on Autopilot is to give drivers the most efficient route possible to a destination.

The dashboard display shows a blue line for the car’s current lane, with a suggested lane change in gray. To use the feature, simply enter a destination into the navigation system and press Navigate on Autopilot.

Here’s a video from Tesla showing it in action:

You can also adjust several features in the Settings, allowing you to receive an audible alert before lane changes, for example.

Navigate on Autopilot Options - April 2019

It will automatically take an exit if the exit is directly off the freeway and automatically come to a stop at the first intersection after the exit, in most cases (you should still pay attention, of course).

Other Version 9 Features

Besides the major update to Autopilot, other features rolled out with the Version 9 software update include improvements to mobile device access and camera features, along with a user interface update for Model S and X drivers that brings the UI in line with the more modern-looking Model 3. On the mobile front, drivers can update Tesla software from the mobile app instead of needing to be in the car. Drivers can also send navigation data to the car through the mobile app instead of entering it into the car’s computer directly. Passengers may access the car’s entertainment and media features from a mobile app rather than from the vehicle’s control panel.

A new app launcher integrates apps for Calendar, Energy, Web Browser, Rear View Camera, Phone, and Charging all in one simple interface. This gives drivers easier access to real-time information. It even predicts the car’s energy consumption for an upcoming trip based on the driver’s navigation inputs.

Drivers are now able to record and save video from the front-facing camera to a flash drive. All drivers need is 1.8 GB of free space on a configured flash drive to record and save a 10-minute video clip. All of this happens through the front-row USB port.
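As a quick sanity check on that storage figure (assuming “GB” here means 10⁹ bytes), 1.8 GB per 10-minute clip works out to a recording bitrate of about 24 Mbit/s:

```python
# Back-of-the-envelope check of the dashcam storage figure:
# 1.8 GB per 10-minute clip (assuming GB = 10^9 bytes).
clip_bytes = 1.8e9
clip_seconds = 10 * 60

bytes_per_second = clip_bytes / clip_seconds       # 3,000,000 bytes/s
megabits_per_second = bytes_per_second * 8 / 1e6   # 24 Mbit/s

print(f"{megabits_per_second:.0f} Mbit/s")  # -> 24 Mbit/s
```

In other words, a 32 GB flash drive would hold a bit under three hours of footage at this rate.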

Easter Eggs

One other feature involves a little bit of fun. Tesla encourages drivers to find Easter eggs in the new software update. These hidden gems let owners play classic Atari games on the car’s infotainment screen, turning a Tesla into a game console thanks to controls on the touchscreen and steering wheel.

Our Take

With these updates, Tesla’s Autopilot technology matures by leaps and bounds.  When driving an AP2 hardware Tesla now, you’re literally driving a sophisticated neural network on wheels. No other auto manufacturer is close to having something like this yet at the consumer-grade level.

What We Like

  • Greatly improved road path following (especially left-hand forks)
  • Navigate on Autopilot for freeways
  • Updated 360-degree view of cars and objects around the vehicle

What Could Be Better

  • Audible warning for stationary objects detected when traveling over 50 MPH
  • Audible warning for objects in blind spot when manually changing lanes
  • Smoother deceleration in traffic when cars slow down
  • Less abrupt lane centering when lanes merge
  • Avoid driving in blind spots of other cars

As usual, Tesla’s Autopilot system keeps getting better and we’re excited to see what Hardware 3 (AP3) will bring for 2019 and beyond!
