NVIDIA demonstrated its new Drive AutoPilot driving system at the 2019 Consumer Electronics Show in Las Vegas. NVIDIA has long worked on autonomous driving, and it is now offering auto manufacturers and suppliers a full end-to-end solution rather than just chips.
NVIDIA’s new system has already attracted the attention of automotive suppliers such as Continental, which plans to integrate it into its own products starting in 2020.
Not Fully Autonomous – Level 2+?
NVIDIA’s system is not designed to provide fully autonomous driving. Instead, it improves on existing Level 2 (see What Do Self-Driving Levels Mean?) driver-assistance systems, which a 2018 Insurance Institute for Highway Safety study found to be deficient. The study showed that current lane-keeping and adaptive cruise control systems were not living up to consumer expectations, especially on hilly or curvy roads. NVIDIA’s new platform addresses these issues, and it can also help drivers merge, park, change lanes, and detect pedestrians more effectively than previous systems.
NVIDIA is calling its system “Level 2+,” a controversial label: the SAE self-driving standards define Level 2 but no Level 2+, so the designation is a marketing tactic rather than any additional real functionality within the SAE level architecture.
A Swipe at Tesla?
NVIDIA currently supplies the chips used in Tesla’s Level 2 system, called “Autopilot.” Tesla has relied on NVIDIA chips to power Autopilot, but in early 2019 it is expected to replace them with its own chip, called Hardware 3, rumored to be manufactured by Samsung.
Perhaps in an effort to get back at Tesla, NVIDIA named its product “AutoPilot” (with a capital ‘P’) and developed an instrument-cluster display that looks very similar to Tesla’s.
NVIDIA’s new technology consumes less battery power while providing more computational power than earlier systems. Drive AutoPilot integrates NVIDIA’s existing Xavier chip, giving it the ability to perform 30 trillion operations per second. That processing power, combined with deep neural networks and 360-degree camera sensors both inside and outside the cabin, lets the system work quickly and efficiently. The sensors monitor not only what the driver is doing but also what is going on with other vehicles and pedestrians.
NVIDIA plans to make its new system available to all automotive manufacturers rather than exclusively to any one of them, so the technology may show up in a wide range of vehicles and products in the future. Continental AG and ZF Friedrichshafen AG have already contracted to use it in products destined for production in 2020.