NVIDIA introduces new Drive PX AI computing platform for Level 5 autonomous vehicles: Pegasus

At the GPU Technology Conference in Munich, NVIDIA unveiled Pegasus, its new NVIDIA DRIVE PX AI computing platform designed to handle Level 5 driverless vehicles. NVIDIA DRIVE PX Pegasus delivers more than 320 trillion operations per second—more than 10x the performance of its predecessor, NVIDIA DRIVE PX 2, which was introduced in January 2016. (Earlier post.)

NVIDIA DRIVE PX Pegasus is powered by four high-performance AI processors. It couples two of NVIDIA’s newest Xavier system-on-a-chip (SoC) processors (earlier post)—featuring an embedded GPU based on the NVIDIA Volta architecture—with two next-generation discrete GPUs with hardware created for accelerating deep learning and computer vision algorithms. The system will provide the enormous computational capability required for fully autonomous vehicles in a computer the size of a license plate, drastically reducing energy consumption and cost.
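
For a rough sense of where the 320 TOPS headline figure comes from, the back-of-envelope sketch below decomposes it across the four processors. Xavier's roughly 30 TOPS is NVIDIA's published figure for that SoC; the per-GPU number for the two discrete accelerators is inferred here simply to make the total work out, and the DRIVE PX 2 baseline of roughly 24 deep-learning TOPS is likewise used only for illustration.

```python
# Back-of-envelope decomposition of the Pegasus 320 TOPS figure.
# Xavier's ~30 TOPS is NVIDIA's published number for that SoC; the
# discrete-GPU figure is inferred to make the stated total work out.
xavier_tops_each = 30          # published figure for the Xavier SoC
discrete_gpu_tops_each = 130   # assumption: (320 - 2 * 30) / 2

total_tops = 2 * xavier_tops_each + 2 * discrete_gpu_tops_each
print(f"Estimated total: {total_tops} TOPS")   # -> 320 TOPS

# DRIVE PX 2 (AutoChauffeur) was quoted at roughly 24 deep-learning TOPS,
# which is consistent with the "more than 10x" claim above:
print(f"Speedup vs. DRIVE PX 2: ~{total_tops / 24:.0f}x")
```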

Pegasus is designed for ASIL D certification—the industry’s highest safety level—with automotive inputs/outputs, including CAN (controller area network); FlexRay; 16 dedicated high-speed sensor inputs for camera, radar, LiDAR and ultrasonics; plus multiple 10 Gbit Ethernet connectors. Its combined memory bandwidth exceeds 1 terabyte per second.
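
NVIDIA's DriveWorks APIs for these vehicle buses are not shown in the article, but for readers unfamiliar with CAN, the minimal sketch below shows what consuming frames from a CAN interface looks like with the open-source python-can library on a Linux SocketCAN channel. The channel name and message ID are placeholders; this is illustrative plumbing, not NVIDIA's API.

```python
# Minimal sketch of reading CAN frames with the python-can library over
# Linux SocketCAN. Illustrative only; 'can0' and the arbitration ID below
# are placeholders, and this is not NVIDIA's DriveWorks I/O stack.
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

WHEEL_SPEED_ID = 0x1F0  # hypothetical arbitration ID for a wheel-speed frame

try:
    while True:
        msg = bus.recv(timeout=1.0)   # block up to 1 s for the next frame
        if msg is None:
            continue                  # timed out; no traffic on the bus
        if msg.arbitration_id == WHEEL_SPEED_ID:
            # Payload layout is vehicle-specific; here we just dump raw bytes.
            print(f"wheel-speed frame: {msg.data.hex()}")
finally:
    bus.shutdown()
```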

The computational requirements of fully autonomous vehicles (which NVIDIA refers to as robotaxis) are enormous: perceiving the world through high-resolution, 360-degree surround cameras and LiDARs; localizing the vehicle to within centimeter accuracy; tracking vehicles and people around the car; and planning a safe and comfortable path to the destination. All this processing must be done with multiple levels of redundancy to ensure the highest level of safety. The computing demands of driverless vehicles are easily 50 to 100 times greater than those of the most advanced cars today, NVIDIA says.
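
To make that claim a little more concrete, here is a back-of-envelope estimate of the camera-perception portion of the workload alone. Every parameter below (camera count, resolution, frame rate, operations per pixel) is an illustrative assumption, not an NVIDIA figure.

```python
# Rough estimate of the camera-perception compute load alone.
# All parameters below are illustrative assumptions, not NVIDIA figures.
num_cameras = 8                   # 360-degree surround coverage
pixels_per_frame = 1920 * 1208    # ~2.3 MP per camera
frames_per_second = 30
ops_per_pixel = 100_000           # order of magnitude for a deep CNN pass

ops_per_second = (num_cameras * pixels_per_frame
                  * frames_per_second * ops_per_pixel)

print(f"Camera perception alone: ~{ops_per_second / 1e12:.0f} TOPS")
# -> roughly 56 TOPS for a single pass of one network family; running
#    several networks plus redundant copies pushes the total far higher,
#    before LiDAR processing, localization, tracking and planning are counted.
```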

Of the 225 partners developing on the NVIDIA DRIVE PX platform, more than 25 are developing fully autonomous robotaxis using NVIDIA CUDA GPUs. Today, their trunks resemble small data centers, loaded with racks of computers with server-class NVIDIA GPUs running deep learning, computer vision and parallel computing algorithms. Their size, power demands and cost make them impractical for production vehicles.

Pegasus reduces the volumetric requirements from a trunk-filling rack of server-class computers to a single license-plate-sized board.

Pegasus will be available to NVIDIA automotive partners in the second half of 2018. NVIDIA DriveWorks software and NVIDIA DRIVE PX 2 configurations are available today for developers working on autonomous vehicles and algorithms.

NVIDIA DRIVE PX Platform. The NVIDIA DRIVE PX platform scales from a single mobile processor configuration delivering Level 2+/Level 3 capabilities to a combination of multiple mobile processors and discrete GPUs for full Level 5. These configurations run on a single, open software architecture. This enables automakers and tier 1 suppliers to move from development into production for a wide range of self-driving solutions—from AutoCruise on the highway, to AutoChauffeur for point-to-point travel, to Pegasus for a fully autonomous vehicle.
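
As a quick restatement of that scaling story, the mapping below lays out the three configurations named above as data. The AutoCruise and Pegasus rows follow the article directly; the automation level and processor mix attached to AutoChauffeur are not stated above and are marked as assumptions.

```python
# The DRIVE PX product scaling described above, restated as a simple mapping.
# AutoCruise and Pegasus entries follow the article; the AutoChauffeur row's
# level and compute mix are assumptions, since the article does not state them.
DRIVE_PX_CONFIGS = {
    "AutoCruise":    {"use_case": "highway driving",
                      "level":    "2+/3",
                      "compute":  "single mobile SoC"},
    "AutoChauffeur": {"use_case": "point-to-point travel",
                      "level":    "3/4 (assumption)",
                      "compute":  "SoCs plus discrete GPUs (assumption)"},
    "Pegasus":       {"use_case": "fully autonomous robotaxi",
                      "level":    "5",
                      "compute":  "2x Xavier SoC + 2x discrete GPU (320 TOPS)"},
}

for name, cfg in DRIVE_PX_CONFIGS.items():
    print(f"{name:14s} Level {cfg['level']:18s} {cfg['compute']}")
```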

NVIDIA DRIVE PX is part of a broad family of NVIDIA AI computing solutions. Deep neural networks trained in the data center on the NVIDIA DGX-1 AI supercomputer can then run seamlessly on NVIDIA DRIVE PX inside the vehicle. The unified architecture means the same NVIDIA DRIVE software algorithms, libraries and tools that run in the data center can also perform inferencing in the car.
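
NVIDIA's own tooling for this handoff is not detailed in the article, but the generic shape of a train-in-the-data-center, infer-in-the-vehicle workflow is easy to sketch. Below is a minimal PyTorch-to-ONNX export, one common way to hand a trained network to an embedded inference runtime; the model and file names are placeholders, and this is not presented as NVIDIA's specific pipeline.

```python
# Minimal sketch of a cloud-to-car handoff: train a network in the data
# center, then export it in a portable format for an in-vehicle inference
# runtime. Generic PyTorch/ONNX flow; not NVIDIA's specific toolchain.
import torch
import torchvision

# Stand-in for a perception network trained on a DGX-class machine.
model = torchvision.models.resnet18()
model.eval()

# Example input matching the camera tensor shape the network expects
# (batch, channels, height, width) -- a placeholder shape here.
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX; an embedded runtime (e.g. a TensorRT-style engine on the
# vehicle computer) would consume this file for optimized inferencing.
torch.onnx.export(model, dummy_input, "perception_net.onnx",
                  input_names=["camera"], output_names=["scores"])
print("Exported perception_net.onnx for in-vehicle deployment")
```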

This cloud-to-car approach enables cars to receive over-the-air updates to add new features and capabilities throughout the life of a vehicle.

Comments

HarveyD

Amazing speed with low energy consumption offers great potential for near future level 5 ADVs. Will sensors and software keep up?
