In mid-2021, Tesla announced that it is moving to cameras only for Autopilot, an approach it calls Tesla Vision, eliminating radar from its vehicles.
Tesla has long held the position that camera sensing for autonomous driving is not only possible but preferable, eschewing additional sensors such as LiDAR, which companies like Cruise and Waymo rely on.
In fact, Elon Musk famously said that LiDAR is a fool’s errand. Now, it seems, radar may be too.
Here’s Tesla’s head of AI, Andrej Karpathy, explaining the technical reasons why neither radar nor LiDAR is needed and the rationale for going vision-only:
Radar has long been used by automotive manufacturers and by suppliers of Advanced Driver Assistance Systems (ADAS) such as Mobileye, whose systems appear in many popular car brands including BMW, Nissan, and Volvo (see Cars with Autopilot).
Historically, radar was used because it offered a cost-effective and simple means of detecting something relatively large in front of the vehicle, at which point the system could respond by braking, as with Automatic Emergency Braking (AEB), or by maintaining separation from the vehicle ahead, as with Distance Cruise Control.
On the other hand, since radar can’t actually “see” the road, manufacturers added cameras to detect lane markings, so the vehicle can be kept centered in its lane, for example.
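To make that division of labor concrete, here is a minimal, hypothetical sketch of how a radar-plus-camera ADAS loop might work: radar supplies range and closing speed for braking and following distance, while the camera supplies an offset from lane center for steering. The function names and thresholds are invented for illustration and do not reflect any manufacturer’s actual system.

```python
# Hypothetical, simplified ADAS loop: radar handles longitudinal decisions
# (AEB, distance cruise), the camera handles lateral centering.
# All names and thresholds are illustrative only.

TTC_BRAKE_THRESHOLD_S = 1.5   # brake hard if time-to-collision drops below this
FOLLOW_TIME_GAP_S = 2.0       # desired time gap for Distance Cruise Control

def longitudinal_command(radar_range_m, closing_speed_mps, ego_speed_mps):
    """Decide braking behavior from radar range and closing speed."""
    if closing_speed_mps > 0:                      # we are gaining on the object ahead
        ttc_s = radar_range_m / closing_speed_mps  # time-to-collision
        if ttc_s < TTC_BRAKE_THRESHOLD_S:
            return "EMERGENCY_BRAKE"               # AEB intervention
    if radar_range_m < ego_speed_mps * FOLLOW_TIME_GAP_S:
        return "SLOW_DOWN"                         # distance cruise: open the gap
    return "MAINTAIN_SPEED"

def lateral_command(lane_center_offset_m, gain=0.1):
    """Steer proportionally to the camera-measured offset from lane center."""
    return -gain * lane_center_offset_m            # sign convention: steer back toward center

# Example: lead object 30 m ahead, closing at 12 m/s, ego at 25 m/s,
# and the car sitting 0.3 m off lane center.
print(longitudinal_command(30.0, 12.0, 25.0))  # -> SLOW_DOWN (TTC is 2.5 s, but the gap is short)
print(lateral_command(0.3))                    # -> -0.03 (small corrective steer)
```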
Eight Cameras and Powerful Hardware
Tesla was one of the first to integrate cameras all around the vehicle as standard equipment, allowing the car to see 360 degrees and up to 250 meters. In addition, Tesla has built powerful AI computing into its cars (see Hardware 3) that allows the vehicle to process camera images far faster than was previously possible, making other sensors such as radar and ultrasonics redundant, if not a source of conflicting signals for the system to reconcile.
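As a rough illustration of why an extra sensor can be “confusing” rather than helpful (this is not Tesla’s actual code), consider a naive fusion step in which radar and vision disagree about whether anything is ahead. A spurious radar return, for example from an overhead sign or bridge, forces the system to decide which sensor to trust, which is one commonly cited cause of phantom braking.

```python
# Illustrative only: a naive fusion rule showing how a spurious radar return
# (e.g., an overhead sign or bridge) can conflict with what the cameras see.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float   # range to the detected object
    confidence: float   # sensor's own confidence, 0..1

def naive_fusion(radar: Optional[Detection], vision: Optional[Detection]) -> Optional[Detection]:
    """Pick a single obstacle estimate when the two sensors may disagree."""
    if radar and vision:
        # Both sensors agree something is there: take the nearer (more conservative) estimate.
        return min(radar, vision, key=lambda d: d.distance_m)
    if radar and not vision:
        # Radar-only detection: a real car hidden in fog, or a phantom return?
        # Braking risks a false alarm; ignoring it risks missing a real object.
        return radar if radar.confidence > 0.8 else None
    return vision  # vision-only detection, or nothing at all

# An overhead sign produces a strong radar return that the cameras correctly ignore.
phantom = Detection(distance_m=45.0, confidence=0.9)
print(naive_fusion(phantom, None))  # -> Detection(distance_m=45.0, ...), i.e. a phantom obstacle
```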
Not Everyone Agrees
This has come as a shock to many in the industry who feel quite the opposite: that more sensors, such as LiDAR, are needed, not fewer. It also raises the question of how Autopilot will handle low-visibility situations such as twilight, nighttime, or fog, where radar still functions. In fact, Tesla’s website previously proclaimed that its radar “passes through fog, dust, rain, snow and under cars,” and that “radar plays an essential role in detecting and responding to forward objects,” something owners who drive in less-than-ideal conditions may now need to worry about.
Radar Removed from Tesla Vehicles
Starting in mid-2021, Tesla is no longer equipping vehicles with radar sensors and will rely solely on cameras for its Autopilot and Full Self-Driving capabilities (see what’s the difference). The change started with the Model 3 and Model Y in North America, while vehicles sold outside North America, as well as the Model S and Model X, will continue to include radar for the time being.
Other Manufacturers with Vision Only
Tesla isn’t the only one that relies solely on vision for Automatic Emergency Braking (AEB), Distance Cruise Control, and Lane Keeping Assist functions. Other manufacturers have been doing so for years, most notably Subaru with its EyeSight system.
Camera Systems in Drones
Other devices, such as drones, have proven that it’s possible to navigate very complex 3D environments with cameras alone. For example, Skydio makes the Skydio 2 drone, which uses six cameras to navigate a 3D environment while flying. It uses an NVIDIA Tegra TX2 for AI computation, which is similar in purpose to Tesla’s Full Self-Driving Computer, but less powerful.
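For intuition on how a handful of cameras can yield range information at all, one classic technique is stereo disparity: the same point seen by two cameras a known distance apart shifts between the two images, and depth follows from depth = focal length × baseline / disparity. The sketch below uses made-up numbers and is not Skydio’s or Tesla’s actual pipeline (real systems typically combine this kind of geometry with learned models).

```python
# Minimal sketch of depth-from-stereo, the classic way cameras can produce
# LiDAR-like range data. All numbers are illustrative, not from any product.

def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """For a rectified stereo pair: depth = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity -> the point is effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# Example: two cameras 12 cm apart with a ~1000 px focal length; a feature
# shifted 8 px between the left and right images is about 15 m away.
print(depth_from_disparity(8.0, 1000.0, 0.12))  # -> 15.0
```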
The data collected by the cameras creates a detailed map of the environment, similar to what LiDAR produces, as shown below:
While other manufacturers, like Volvo, are adding sensors such as LiDAR, Tesla is downgrading its sensor suite. It remains to be seen whether Tesla Vision will be robust enough to handle driving in all situations, as originally promised with its optional and expensive Full Self-Driving package, and to do so significantly better than a human driver.