NVIDIA DRIVE Platform for Self-Driving Overview

Self-driving cars may seem like something out of the future, but thanks to chip supplier NVIDIA Corporation's advances, autonomous driving systems are here today. NVIDIA has competition, to be sure, but many major auto manufacturers are teaming up with the company, choosing NVIDIA's DRIVE AutoPilot as their autonomous driving system.

NVIDIA unveiled its DRIVE AutoPilot system in January 2019, calling it the world's "first commercially available Level 2+ automated driving system." Self-driving vehicles using DRIVE AutoPilot are expected to go into production in 2020. From there, Level 3 autonomous driving features are set to reach the market in 2021, with future scaling headed toward Level 5 self-driving capability, at which point vehicles would no longer have pedals or steering wheels for the humans inside.

NVIDIA’s DRIVE AutoPilot System

NVIDIA is accurate when it calls DRIVE AutoPilot the first commercially available Level 2+ autonomous driving system. Scheduled for commercial production in 2020, DRIVE AutoPilot makes use of several advanced driver-assistance system (ADAS) features already found in Tesla's Autopilot and Volvo's Pilot Assist, such as automatic emergency braking. However, the Level 2+ system goes far beyond those features, which can't be updated within the vehicle and which operate on limited data, so they can't, say, hit the brakes in every possible emergency situation.

With a Level 2 system, drivers still need to keep their hands on the wheel, while the ADAS system controls braking, accelerating, and even steering. The NVIDIA DRIVE system takes this a step further by incorporating driver monitoring features and 360-degree perception. These features use artificial intelligence and deep learning, which power an end-to-end autonomous driving solution that starts with data collection and model training, then moves through simulated testing to the final deployment of self-driving vehicles.

By running deep neural networks in parallel, autonomous vehicles can identify hazards on all sides of the car and react to them instantly. The DRIVE software stack is an open platform, which means developers and auto manufacturers can use the entire system or choose individual features for their own systems. The software augments the actions of the human driver by adding sensor fusion and perception functionality. In addition, it makes source packages, frameworks, compilers, toolkits, and libraries available for auto manufacturers to use in developing their own autonomous driving applications. As an added bonus, the DRIVE system can be updated from the cloud, so drivers always have the latest safety features available.
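
To make the idea of running perception networks in parallel more concrete, here's a minimal sketch in Python. The camera names, the Detection type, and the detect_objects() stand-in are illustrative assumptions for this article, not part of NVIDIA's DRIVE software.

```python
# Minimal sketch: run one perception network per camera concurrently and
# merge the results into a single 360-degree view of hazards. Camera names,
# the Detection type, and detect_objects() are illustrative placeholders,
# not NVIDIA DRIVE APIs.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "vehicle", "traffic_light"
    bearing_deg: float  # direction of the hazard relative to the car
    distance_m: float   # estimated range to the hazard

def detect_objects(camera_name: str, frame) -> list:
    """Stand-in for a deep-neural-network inference call on one camera frame."""
    # A real stack would dispatch a GPU inference job here; we return a fixed
    # detection so the sketch stays runnable.
    return [Detection(label="vehicle", bearing_deg=0.0, distance_m=35.0)]

def perceive_360(frames: dict) -> list:
    """Run all camera networks concurrently and merge their detections."""
    with ThreadPoolExecutor(max_workers=len(frames)) as pool:
        futures = [pool.submit(detect_objects, name, f) for name, f in frames.items()]
        detections = []
        for fut in futures:
            detections.extend(fut.result())
    return detections

if __name__ == "__main__":
    # One placeholder frame per camera around the vehicle.
    frames = {name: object() for name in ("front", "rear", "left", "right")}
    for detection in perceive_360(frames):
        print(detection)
```

In a production stack, the merged detections would feed downstream planning and control rather than a print statement.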

Among the features of DRIVE AutoPilot is “My Route,” a personal mapping feature that creates self-driving routes based on its memory of where the driver has been before. This feature facilitates point-to-point automated driving even when no map is available.
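
As a rough illustration of the route-memory idea (not the actual My Route implementation, whose internals aren't described here), a system could store GPS traces from past drives and recall them later for point-to-point driving:

```python
# Toy "route memory": record GPS traces as the human drives, then recall a
# stored trace between two previously visited points. The structure and
# lookup are illustrative assumptions, not the My Route feature itself.
from typing import Optional

class RouteMemory:
    def __init__(self) -> None:
        # (start, end) -> list of (lat, lon) waypoints recorded on past drives
        self._routes = {}

    def record(self, start: str, end: str, waypoints: list) -> None:
        """Save the waypoints driven between two named locations."""
        self._routes[(start, end)] = waypoints

    def recall(self, start: str, end: str) -> Optional[list]:
        """Return a previously driven route, if one exists, even with no map data."""
        return self._routes.get((start, end))

if __name__ == "__main__":
    memory = RouteMemory()
    memory.record("home", "office", [(37.77, -122.42), (37.78, -122.40)])
    print(memory.recall("home", "office"))
```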

DRIVE AutoPilot System Components

Several vital hardware and software components combine to vault this driving system to the forefront of the autonomous driving world. Among the crucial components:

DRIVE Xavier

This powerful system-on-a-chip (SoC), which can perform 30 trillion operations per second, powers the DRIVE system using only 30 watts of energy (about half what a typical incandescent light bulb uses). The deep neural networks running on the Xavier processor allow the self-driving system to recognize traffic signs and lights and to handle lane changes, highway merges, and more. An earlier version of this platform, known as DRIVE PX, was NVIDIA's original architecture for self-driving vehicles. DRIVE Xavier uses an artificial intelligence system and cloud-based mapping to help self-driving cars understand their location and environment while anticipating possible hazards, including objects in the road.

DRIVE Pegasus

Designed for futuristic Level 5 autonomous driving systems, this artificial intelligence (AI) computer supplies the high-level perception needed for safe autonomous driving. This upgraded computing platform can handle an astonishing 320 trillion operations per second. DRIVE Xavier can be upgraded to DRIVE Pegasus without many architectural changes, and doing so allows scaling to a higher level of autonomy.

DRIVE IX

This Intelligent Experience (IX) system works within the car to visualize the journey, keep drivers alert to what’s going on, and anticipate the safety and other needs of passengers. This software stack can provide augmented reality (AR) visualization, while intelligent driver monitoring keeps a virtual eye out for signs of drowsiness or distraction on the part of the driver, taking appropriate action when needed.
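
A toy version of that drowsiness check might look like the sketch below; the eye-closure signal and the timing thresholds are assumptions for illustration, not DRIVE IX's actual logic.

```python
# Simplified drowsiness check: if the driver's eyes stay closed longer than a
# threshold, escalate from a gentle chime to a strong alert. The thresholds
# and the simulated eye-closure signal are illustrative assumptions.
CHIME_AFTER_S = 1.0   # brief closure: audible chime
ALERT_AFTER_S = 2.5   # sustained closure: strong alert, prepare to intervene

def assess(closed_for_s: float) -> str:
    """Map how long the eyes have been closed to a monitoring action."""
    if closed_for_s >= ALERT_AFTER_S:
        return "ALERT: driver appears drowsy, prepare a safe intervention"
    if closed_for_s >= CHIME_AFTER_S:
        return "CHIME: please keep your eyes on the road"
    return "OK"

if __name__ == "__main__":
    # Simulated eye-closure durations, sampled once per second.
    for closed_for in (0.0, 0.4, 1.2, 1.9, 2.7, 3.5):
        print(f"eyes closed {closed_for:.1f}s -> {assess(closed_for)}")
```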

DRIVE AV

This part of the DRIVE system is responsible for mapping and understanding the 360-degree environment around the car in real time. The combination of Light Detection and Ranging (LIDAR, which works like high-resolution radar but with lasers), ultrasonic sensors, radar, and cameras provides the data needed to drive safely. The DRIVE AV software stack uses this data to recognize lane markings, pedestrians, traffic lights, traffic signs, and more. This element of the NVIDIA DRIVE platform is developed in conjunction with auto supplier Continental, leveraging Continental's software engineering experience with Automotive Safety Integrity Level D (ASIL D) systems.
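
To show what combining sensor data can mean in the simplest case, here's a toy confidence-weighted fusion of range estimates from different sensors. It's a simplification for illustration, not the DRIVE AV fusion algorithm.

```python
# Toy sensor fusion: combine range estimates for the same object from lidar,
# radar, and a camera using confidence weights. A simplification for
# illustration only, not the DRIVE AV algorithm.
def fuse_ranges(estimates: dict) -> float:
    """estimates maps sensor name -> (range_m, confidence in [0, 1])."""
    total_weight = sum(conf for _, conf in estimates.values())
    if total_weight == 0:
        raise ValueError("no usable sensor estimates")
    return sum(rng * conf for rng, conf in estimates.values()) / total_weight

if __name__ == "__main__":
    obstacle = {
        "lidar":  (42.1, 0.9),   # lidar: precise range, high confidence
        "radar":  (41.5, 0.7),   # radar: robust in rain and fog
        "camera": (44.0, 0.4),   # camera range estimates are less certain
    }
    print(f"fused range: {fuse_ranges(obstacle):.1f} m")
```

Real fusion stacks track objects over time and adjust the weights to match conditions (trusting radar more in fog, for example), but the weighted-combination idea is the same.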

DRIVE Constellation

This data center and cloud-based platform allows simulation testing across every possible road condition and scenario. DRIVE Constellation pairs two servers: one runs DRIVE Pegasus, while the other runs DRIVE Sim, a simulation model. The Pegasus module treats each simulated scenario as if it were happening on the road, so self-driving cars can be tested without ever leaving the data center. As a result, manufacturers can now test billions of simulated miles, many orders of magnitude beyond what they could cover in the real world. Advancing to Level 4 and Level 5 autonomous driving depends on the safety learning curve developed through these simulations. Among the factors DRIVE Constellation can control are time of day, weather conditions, visibility, traffic, and dangerous scenarios such as a car merging into the vehicle's blind spot or swerving in front of it.
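
To give a feel for the parameters the simulator can vary, here's a hypothetical scenario matrix; the field names and values are illustrative, not DRIVE Sim's actual configuration schema.

```python
# Hypothetical scenario descriptions for hardware-in-the-loop simulation runs.
# Field names and values are illustrative, not DRIVE Sim's configuration schema.
from dataclasses import dataclass
from itertools import product

@dataclass
class Scenario:
    time_of_day: str      # "dawn", "noon", "dusk", "night"
    weather: str          # "clear", "rain", "fog"
    traffic_density: str  # "light", "heavy"
    hazard: str           # e.g. "blind_spot_merge", "sudden_cut_in"

def build_test_matrix() -> list:
    """Enumerate combinations so regression tests can be repeated exactly."""
    times = ["dawn", "noon", "dusk", "night"]
    weathers = ["clear", "rain", "fog"]
    traffic = ["light", "heavy"]
    hazards = ["blind_spot_merge", "sudden_cut_in"]
    return [Scenario(t, w, d, h)
            for t, w, d, h in product(times, weathers, traffic, hazards)]

if __name__ == "__main__":
    matrix = build_test_matrix()
    print(f"{len(matrix)} scenarios in the regression suite")
```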

Driver and Passenger Safety With Autonomous Vehicles

Of course, many drivers maintain a deep-seated distrust of self-driving vehicles, and NVIDIA is well aware of that. To answer these concerns, the company has developed its Safety Force Field policy to make sure that every element of DRIVE AutoPilot focuses on protecting vehicle occupants from collisions and other hazards. The Safety Force Field's computational framework is designed both to keep self-driving vehicles out of harm's way and to keep them from causing unsafe situations.

The Safety Force Field uses vehicle sensor data to constantly analyze the real-time environment of each vehicle, predicting what will occur next and determining what actions to take to protect the vehicle and its occupants, to protect other vehicles and people on the road, to avoid creating or escalating unsafe scenarios, and to mitigate any harmful situations.
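
As a very rough illustration of the kind of check such a framework automates continuously, and emphatically not the actual Safety Force Field math, the sketch below compares the car's worst-case stopping distance against the gap to the vehicle ahead:

```python
# Highly simplified collision-avoidance check: compare the ego vehicle's
# stopping distance (reaction distance plus braking distance) against the
# current gap to the vehicle ahead. This is NOT the Safety Force Field
# computation, just a sketch of the kind of reasoning it automates.
def stopping_distance_m(speed_mps: float, reaction_s: float, decel_mps2: float) -> float:
    """Distance covered during the reaction time plus the braking distance."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

def gap_is_safe(gap_m: float, speed_mps: float,
                reaction_s: float = 0.5, decel_mps2: float = 6.0) -> bool:
    # reaction_s and decel_mps2 are assumed values for illustration.
    return gap_m > stopping_distance_m(speed_mps, reaction_s, decel_mps2)

if __name__ == "__main__":
    # 27.8 m/s is roughly 100 km/h.
    print(gap_is_safe(gap_m=60.0, speed_mps=27.8))   # False: too close, act now
    print(gap_is_safe(gap_m=120.0, speed_mps=27.8))  # True: gap is sufficient
```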

NVIDIA has developed a four-pillar model for making self-driving vehicles safe. The NVIDIA pillars of safe autonomous driving are:

  • An artificial intelligence design and implementation platform: This platform requires supercomputing hardware and software powerful enough to process all sensor data gathered in real time and to respond accordingly.
  • A development infrastructure that supports deep learning: The NVIDIA DRIVE infrastructure uses runtime engines and deep learning compilers rebuilt for automotive requirements.
  • A data center solution for simulation and testing: DRIVE’s simulation platform can simulate billions of VR miles and repeat its regression tests as needed.
  • A best-in-class safety program: NVIDIA's safety methodology focuses on redundancy and diversity in design and validation, while providing lifelong support for the autonomous system and meeting the highest automotive industry safety standards.

NVIDIA’s Competition with Intel / Mobileye

NVIDIA is, of course, not the only competitor in the autonomous driving space. Intel, through its Mobileye division, acquired in 2017, has also made strides in autonomous driving tech; notably, Mobileye was the first to publish a paper on the Responsibility-Sensitive Safety (RSS) standard.
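
For readers curious about what RSS actually formalizes: the published RSS paper defines, among other rules, a minimum safe longitudinal following distance between two cars. The sketch below implements that formula in Python; the speeds, response time, and braking limits in the example are assumed values for illustration.

```python
# Minimum safe longitudinal following distance as defined in the published
# RSS paper ("On a Formal Model of Safe and Scalable Self-driving Cars").
# The example parameter values are assumptions for illustration.
def rss_min_safe_gap(v_rear: float, v_front: float, rho: float,
                     a_accel_max: float, b_brake_min: float,
                     b_brake_max: float) -> float:
    """Speeds in m/s, accelerations in m/s^2, rho (response time) in seconds."""
    v_after_response = v_rear + rho * a_accel_max
    gap = (v_rear * rho                                    # distance covered during response time
           + 0.5 * a_accel_max * rho ** 2                  # plus worst-case acceleration in that time
           + v_after_response ** 2 / (2.0 * b_brake_min)   # rear car's braking distance
           - v_front ** 2 / (2.0 * b_brake_max))           # minus front car's braking distance
    return max(0.0, gap)

if __name__ == "__main__":
    # Both cars at roughly 100 km/h (27.8 m/s).
    gap = rss_min_safe_gap(v_rear=27.8, v_front=27.8, rho=0.5,
                           a_accel_max=2.0, b_brake_min=4.0, b_brake_max=8.0)
    print(f"minimum safe following gap: {gap:.1f} m")
```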

In early April 2019, in fact, Intel Senior VP and Mobileye CEO Amnon Shashua accused NVIDIA of copying its RSS standard. However, that standard was never patented and was published as an open concept, available to any company that wants to use it, so it's not surprising that NVIDIA did so. In fact, NVIDIA has made its own Safety Force Field (SFF) system open source as well, so that other manufacturers can build on it and advance the cause of autonomous driving.

Some have speculated that Intel's comments were a result of feeling the pressure of NVIDIA's recent advances in autonomous driving, or a reaction to NVIDIA outbidding Intel earlier in 2019 to buy Mellanox, a maker of high-performance networking hardware for supercomputers and data centers. Unfortunately for Intel, one result of its complaints was to highlight that NVIDIA has lapped it in the race to bring Level 2+ autonomous driving systems to market, since Intel's system is not yet commercially available.

NVIDIA and Automakers: Partnerships for the Future

Toyota

In March 2019, NVIDIA and Toyota announced a partnership under which Toyota plans to put the NVIDIA DRIVE AutoPilot system in its cars. Because Toyota holds 9.46 percent of the global auto market, this announcement is extremely significant to NVIDIA's future.

Under the NVIDIA-Toyota partnership, which involves both Toyota’s Japanese TRI-AD unit and the U.S.-based Toyota Research Institute, Toyota will train deep neural networks and validate autonomous vehicle tech for its cars. Toyota also becomes the first customer to use NVIDIA’s DRIVE Constellation. The collaboration follows Toyota’s previous adoption in 2017 of NVIDIA’s DRIVE Xavier processor.

Over the long run, Toyota plans to develop autonomous vehicles on two fronts, either of which might employ the NVIDIA platform. Toyota's Guardian system is designed to operate alongside human drivers, anticipating problems and helping the driver respond, or stepping in when appropriate. The Chauffeur program aims to develop fully autonomous vehicles for use by disabled or elderly people.

Toyota isn't the only automaker diving into autonomous driving, though it has the largest auto-market share among NVIDIA's partners. Volkswagen stands second on the list, with 7.38 percent of the global auto market, and, not coincidentally, Volkswagen is also an NVIDIA partner. Other NVIDIA partners include Mercedes-Benz, Volvo, Isuzu, and several more; together, NVIDIA's partners account for 22.13 percent of the global auto market. (While NVIDIA currently partners with Tesla, Tesla has announced plans to switch to its own self-driving system, so it isn't included in the above figures.)

Mercedes-Benz

In fact, NVIDIA's Toyota announcement came on the heels of its January 2019 announcement at the Consumer Electronics Show (CES) that Mercedes-Benz had chosen NVIDIA as its partner in AI. NVIDIA and Mercedes-Benz plan to work together to create an overall computer architecture designed to replace the separate processors currently handling self-driving capabilities and smart-cockpit functions. The idea is to treat all these separate functions as a single data center, resulting in greater efficiency, stronger performance, and built-in redundancies in safety systems. As with other data centers, this plan would also make it easier for Mercedes to add upgrades and extra functions later.

Volvo

A few months earlier, in October 2018, Volvo announced its plans to use DRIVE Xavier to develop vehicles with Level 2+ autonomous driving systems. The self-driving Volvos should hit the production line in the early 2020s.

Exciting as the Mercedes and Volvo announcements were, though, the partnership between NVIDIA and Toyota is far more significant. With Toyota the leading player in the auto marketplace, its teaming with NVIDIA is well positioned to drive change across the entire industry.

The (Autonomous) Road Ahead

The autonomous driving market is expected to roughly quadruple in size over the next few years, growing from $24.2 billion as of early 2019 to $98.3 billion by 2023. That growth is anticipated to lean toward the hardware side (about 60 percent), with software and related services making up the rest of the market. While NVIDIA's share of the market is currently small, it's expected to skyrocket in the next few years thanks to the company's partnerships with these significant automakers.

Autonomous vehicles are expected to change the face of driving in the not-too-distant future, ultimately creating safer highways. NVIDIA is on track to be a leader in that transformation, thanks to these forward-thinking initiatives with its DRIVE AutoPilot platform.
