At CES 2016, NVIDIA introduced NVIDIA DRIVE PX 2—a high-performance computing platform for in-vehicle artificial intelligence applied to the complexities inherent in autonomous driving. DRIVE PX 2 utilizes deep learning on NVIDIA’s most advanced GPUs for 360-degree situational awareness around the car, to determine precisely where the car is and to compute a safe, comfortable trajectory.
DRIVE PX 2—which delivers processing power equivalent to 150 MacBook Pros—uses two next-generation Tegra processors plus two next-generation discrete GPUs, based on the Pascal architecture, to deliver up to 24 trillion deep learning operations per second, using specialized instructions that accelerate the math of deep learning network inference. That’s more than 10 times the computational horsepower of the previous-generation DRIVE PX.
|Jen-Hsun Huang, co-founder and CEO, NVIDIA, introducing the DRIVE PX 2 at CES 2016. Click to enlarge.|
For general-purpose floating point operations, DRIVE PX 2’s multi-precision GPU architecture is capable of up to 8 trillion operations per second—more than four times that of the previous-generation product. This enables partners to address the full breadth of autonomous driving algorithms, including sensor fusion, localization and path planning. It also provides high-precision compute when needed for layers of deep learning networks.
DRIVE PX 2’s deep learning capabilities enable it to learn quickly how to address the challenges of everyday driving, such as unexpected road debris, erratic drivers and construction zones. Deep learning also addresses numerous problem areas where traditional computer vision techniques are insufficient, such as poor weather conditions like rain, snow and fog, and difficult lighting conditions like sunrise, sunset and extreme darkness.
|The liquid-cooled DRIVE PX 2 unit. 12 CPU cores, Pascal GPU, 8 TFLOPS, 24 DL TOPS, 16 nm FF, 250 W. Click to enlarge.|
Self-driving cars use a broad spectrum of sensors to understand their surroundings. DRIVE PX 2 can process the inputs of 12 video cameras, plus LiDAR, radar and ultrasonic sensors. It fuses them to detect objects accurately, to identify them, to determine where the car is relative to the world around it, and then to calculate its optimal path for safe travel.
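The fusion step described above can be illustrated with a deliberately simplified sketch. Production systems fuse full object tracks with uncertainty estimates (e.g. Kalman filtering); the version below only pairs camera and radar detections by nearest neighbor within a gating distance and averages the matched positions. All names and numbers are hypothetical, not part of any NVIDIA API.

```python
import math

# Hypothetical detections from two sensors, each as (x, y) positions in
# meters in the vehicle frame.
camera_objects = [(10.0, 2.1), (25.0, -3.0)]
radar_objects = [(10.3, 2.0), (60.0, 0.5)]

def fuse(camera, radar, gate=1.0):
    """Pair each camera detection with the closest radar detection
    within `gate` meters and average the two position estimates."""
    fused = []
    for cx, cy in camera:
        best, best_d = None, gate
        for rx, ry in radar:
            d = math.hypot(cx - rx, cy - ry)
            if d < best_d:
                best, best_d = (rx, ry), d
        if best is not None:
            # Both sensors agree on an object: average the estimates.
            fused.append(((cx + best[0]) / 2, (cy + best[1]) / 2))
        else:
            # No radar match within the gate: keep the camera-only track.
            fused.append((cx, cy))
    return fused

print(fuse(camera_objects, radar_objects))
```

Here the first camera object matches the nearby radar return and the fused position splits the difference, while the second camera object has no radar counterpart and passes through unchanged.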
This complex work is facilitated by NVIDIA DriveWorks, a suite of software tools, libraries and modules that accelerates development and testing of autonomous vehicles. DriveWorks enables sensor calibration, acquisition of surround data, synchronization, recording and then processing streams of sensor data through a complex pipeline of algorithms running on all of the DRIVE PX 2’s specialized and general-purpose processors. Software modules are included for every aspect of the autonomous driving pipeline, from object detection, classification and segmentation to map localization and path planning.
NVIDIA delivers an end-to-end solution—consisting of NVIDIA DIGITS and DRIVE PX 2—both for training a deep neural network and for deploying the output of that network in a car. DIGITS is a tool for developing, training and visualizing deep neural networks that can run on any NVIDIA GPU-based system—from PCs and supercomputers to Amazon Web Services and the recently announced Facebook Big Sur Open Rack-compatible hardware. The trained neural net model runs on NVIDIA DRIVE PX 2 within the car.
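The train-on-the-workstation, deploy-in-the-car split can be sketched in miniature: a tiny logistic classifier is trained, serialized, then reloaded as a frozen model used only for inference. DIGITS and DRIVE PX 2 apply this same pattern at the scale of deep networks; every name below is illustrative, not an NVIDIA API.

```python
import json
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """Stochastic gradient descent on a 2-input logistic model."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            g = p - y  # gradient of the log loss w.r.t. the logit
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return {"w": w, "b": b}

def predict(model, x):
    """Inference only: no gradients, no weight updates."""
    z = model["w"][0] * x[0] + model["w"][1] * x[1] + model["b"]
    return 1 if sigmoid(z) > 0.5 else 0

# "Workstation" side: train on labeled data, then serialize the model.
xs = [(0.1, 0.2), (0.2, 0.1), (0.9, 1.0), (1.0, 0.8)]
ys = [0, 0, 1, 1]
blob = json.dumps(train(xs, ys))

# "In-car" side: load the frozen model and run inference only.
deployed = json.loads(blob)
print([predict(deployed, x) for x in xs])  # → [0, 0, 1, 1]
```

The point of the split is that the expensive, iterative training loop never runs in the vehicle; the car receives only the frozen weights and executes the cheap forward pass.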
Since NVIDIA delivered the first-generation DRIVE PX last summer, more than 50 automakers, Tier 1 suppliers, developers and research institutions have adopted NVIDIA’s AI platform for autonomous driving development.
“Using NVIDIA’s DIGITS deep learning platform, in less than four hours we achieved over 96 percent accuracy using Ruhr University Bochum’s traffic sign database. While others invested years of development to achieve similar levels of perception with classical computer vision algorithms, we have been able to do it at the speed of light.” —Matthias Rudolph, director of Architecture Driver Assistance Systems at Audi
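The accuracy figure quoted above is a top-1 classification rate: the fraction of test images whose predicted class matches the ground-truth label. A minimal sketch of that metric, using made-up class IDs (the real benchmark referenced, GTSRB, spans dozens of sign classes and tens of thousands of images):

```python
# Hypothetical predicted vs. true class IDs for ten traffic-sign images.
predicted = [14, 2, 2, 17, 38, 14, 5, 2, 17, 38]
actual    = [14, 2, 5, 17, 38, 14, 5, 2, 17, 38]

def top1_accuracy(pred, true):
    """Fraction of samples whose predicted class matches the label."""
    correct = sum(p == t for p, t in zip(pred, true))
    return correct / len(true)

print(f"{top1_accuracy(predicted, actual):.0%}")  # → 90%
```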
The DRIVE PX 2 development engine will be available to early access development partners in the second quarter of 2016, with general availability in the fourth quarter. During the press conference, Huang announced that Volvo will be the first automaker to deploy DRIVE PX 2.
In a public trial of autonomous driving, the Swedish automaker will next year lease to customers 100 XC90 luxury SUVs outfitted with DRIVE PX 2 technology. The technology will help the vehicles drive autonomously around Volvo’s hometown of Gothenburg, and semi-autonomously elsewhere.
AdasWorks worked with Volvo to help create a system that processes data from multiple sensors in real time to provide 360-degree detection of lanes, vehicles, pedestrians, signs and more, enabling a variety of autopilot functions.