In mid-2021, Tesla announced that Autopilot would rely on cameras alone, a system it calls Tesla Vision, and began eliminating radar from its vehicles. Then in 2022 it also started removing the ultrasonic sensors used for parking, leaving many wondering whether cameras alone are up to the task.
Adding to the confusion, in December 2022 rumors began circulating that Tesla may bring an improved radar back to its vehicles with Hardware 4. We'll have to wait and see whether this is true, but if so, it would indicate that additional sensors are indeed helpful.
Tesla has long held the position that camera-based sensing for autonomous driving is not only possible but preferable, eschewing additional sensors such as LiDAR, which companies like Cruise and Waymo rely on.
In fact, Elon Musk famously once said that LiDAR is a fool's errand. Now, it seems, radar may be too.
Here's Andrej Karpathy, then head of Tesla's AI team, explaining the technical reasons why neither radar nor LiDAR is needed and the rationale for going vision-only:
Radar has long been used by automotive manufacturers and by suppliers of Advanced Driver Assistance Systems (ADAS) such as Mobileye, whose systems appear in many popular car brands including BMW, Nissan, and Volvo (see Cars with Autopilot).
Historically, radar was used because it was a cost-effective and simple means of detecting something relatively large in front of the vehicle. The system could then respond by braking, as with Automatic Emergency Braking (AEB), or by maintaining separation from another vehicle, as with Distance Cruise Control.
On the other hand, since radar can't actually "see" the road, manufacturers added cameras to detect road markings and, for example, keep the vehicle centered in its lane.
Eight Cameras and Powerful Hardware
Tesla was one of the first to integrate cameras all around the vehicle as standard equipment, allowing the car to see 360 degrees and up to 250 meters. In addition, Tesla has integrated powerful AI computing into its cars (see Hardware 3) that allows the vehicle to process camera images far faster than was previously possible, making other sensors such as radar and ultrasonic sensors redundant, and potentially even confusing for the system to reconcile.
Not Everyone Agrees
This has come as a shock to many in the industry who feel quite the opposite: that more sensors, such as LiDAR, are needed, not fewer. It also raises the question of how Autopilot will handle low-visibility situations, such as twilight, nighttime, or fog, where radar still functions. In fact, Tesla's website previously proclaimed that its radar "passes through fog, dust, rain, snow and under cars" and that "radar plays an essential role in detecting and responding to forward objects," something owners who drive in less-than-ideal conditions may now need to worry about.
Radar Removed from Tesla Vehicles
Starting in mid-2021, Tesla stopped equipping vehicles with radar sensors; they now rely solely on cameras for Autopilot and Full Self-Driving capabilities (see what's the difference). The change began with the Model 3 and Model Y in North America, while vehicles outside North America, along with the Model S and Model X, continued to include radar for a time.
However, Hardware 4 has since started to roll out, and there are rumors that a new, higher-resolution radar is being incorporated along with it, at least in some models, such as the Model S and Model X.
Other Manufacturers with Vision Only
Tesla isn’t the only one that relies solely on vision for Automatic Emergency Braking (AEB), Distance Cruise Control, and Lane Keeping Assist functions. Other manufacturers have been doing so for years, most notably Subaru with its EyeSight system.
Camera Systems in Drones
Other devices, such as drones, have proven that it's possible to navigate very complex 3D environments with cameras alone. For example, Skydio's Skydio 2 drone uses six cameras to navigate a 3D environment while flying. It relies on an NVIDIA Tegra TX2 for AI computation, which is similar to Tesla's Full Self-Driving Computer, but less powerful.
The data collected by the cameras is used to build a detailed map of the environment, similar to a LiDAR point cloud.
Upgraded Autopilot Cameras with Hardware 4?
The current front-facing cameras have a 1.2-megapixel resolution (1280 x 960), and it's believed that Tesla may upgrade them to higher-resolution cameras such as the Sony IMX490, which has a 5.4-megapixel resolution (2896 x 1876). These are not only higher resolution but also wide-angle, which could reduce the number of front-facing cameras needed (from three to two, for example).
In addition to being higher resolution, the Sony sensors also offer LED flicker mitigation, allowing them to better read LED signs and traffic signals, which is critical in traffic control situations.
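As a quick sanity check on those resolution figures, the megapixel counts follow directly from the pixel dimensions (the `megapixels` helper below is just for illustration, not anything from Tesla or Sony):

```python
# Megapixels are simply width * height divided by one million.
def megapixels(width: int, height: int) -> float:
    return width * height / 1_000_000

current = megapixels(1280, 960)   # current front-facing cameras
imx490 = megapixels(2896, 1876)   # Sony IMX490 resolution

print(f"Current: {current:.2f} MP")          # 1.23 MP, rounded to "1.2 MP"
print(f"IMX490:  {imx490:.2f} MP")           # 5.43 MP
print(f"Increase: {imx490 / current:.1f}x")  # roughly 4.4x more pixels
```

In other words, the rumored upgrade would give each front camera roughly four and a half times as many pixels to work with.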
See all the latest information on Tesla FSD HW4 in our Hardware 4 article.
High-Resolution Imaging Radar
While Tesla's old radar technology wasn't robust enough, Elon Musk has said that if radar could be made "high resolution," it could be a useful input.
That high resolution may be coming soon. So-called "imaging radar" from manufacturers such as Continental, Magna/Uhnder, and Oculii is on the horizon and could be used by vehicle manufacturers in future cars.
Now we know that a new radar unit, called Phoenix, will be included in Hardware 4.
While other manufacturers, like Volvo, are adding sensors such as LiDAR, Tesla is paring down its sensor suite. It remains to be seen whether Tesla Vision will be robust enough to handle driving in all situations, as originally promised with its optional and expensive Full Self-Driving package, and to do so significantly better than a human driver.