Aeva becomes first FMCW 4D LiDAR supported on NVIDIA DRIVE autonomous vehicle platform

Aeva, a developer of next-generation sensing and perception systems, announced that its Aeries 4D LiDAR sensors are now supported on the NVIDIA DRIVE autonomous vehicle platform. Aeva’s Frequency Modulated Continuous Wave (FMCW) 4D LiDAR sensors detect 3D position and instant velocity for each point at distances up to 500 meters, bringing an added dimension to sensing and perception for safe autonomous driving.
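To make the "4D" idea concrete, below is a minimal illustrative sketch, not Aeva's or NVIDIA's API, of a point that carries 3D position plus per-point radial velocity, and of using that velocity to split returns into static scene and moving objects. The field names and the 0.5 m/s threshold are assumptions made for the example.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class FMCWPoint:
        """One return from an FMCW 4D LiDAR: 3D position plus instant radial velocity."""
        x: float          # meters, sensor frame
        y: float
        z: float
        v_radial: float   # meters/second along the beam; negative = approaching

    def split_static_dynamic(points: List[FMCWPoint],
                             v_threshold: float = 0.5
                             ) -> Tuple[List[FMCWPoint], List[FMCWPoint]]:
        """Separate returns into static scene and moving objects using per-point velocity.

        A time-of-flight sensor would have to difference successive frames to recover
        comparable information; here it is available within a single frame.
        """
        static = [p for p in points if abs(p.v_radial) <= v_threshold]
        dynamic = [p for p in points if abs(p.v_radial) > v_threshold]
        return static, dynamic

Note that when the ego vehicle itself is moving, even static surroundings show non-zero radial velocity from the sensor's perspective, so a real pipeline would first compensate for ego motion; that is the role of the 4D Localization capability described below.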

NVIDIA DRIVE is an open, end-to-end platform that enables developers to build, train, test and validate safe self-driving technology at scale.

Aeva delivers a unique advantage for perception in automated vehicles because it leverages per-point instant velocity information to detect and classify objects with higher confidence across longer ranges. With Aeva as part of our DRIVE ecosystem network, we can provide customers access to this next generation of sensing capabilities for safe autonomous driving.

—Gary Hicok, Senior Vice President of Engineering at NVIDIA

In addition to instant velocity detection, Aeva’s sensors offer advanced 4D Perception capabilities that deliver features not possible with legacy LiDAR sensors, including Ultra Resolution and 4D Localization.

  • Ultra Resolution: A real-time, camera-level image of the world with up to 1,000 lines per frame and no motion blur in the static scene, providing up to 20 times the resolution of legacy time-of-flight LiDAR sensors. Image segmentation enables the detection of roadway markings, drivable regions, vegetation and road barriers, as well as road hazards such as tire fragments, at up to twice the distance of legacy time-of-flight LiDAR sensors. Instant velocity data allows high-confidence detection and tracking of dynamic objects such as oncoming vehicles and other moving objects at distances up to 500 meters.

  • 4D Localization: Per-point velocity data enables real-time ego-vehicle motion estimation with six degrees of freedom, motion compensation, and online sensor extrinsic calibration to aid sensor fusion. These motion-estimation features also enable accurate vehicle positioning and navigation without additional sensors such as an IMU or GPS, for safe autonomous navigation in GPS-denied and featureless environments such as tunnels and parking structures. (A simplified sketch of velocity-based ego-motion estimation follows this list.)
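As a rough illustration of how per-point velocity can drive ego-motion estimation, the following sketch solves for the sensor's translational velocity from the radial velocities of points assumed to belong to the static scene: for a stationary point at unit direction d from the sensor, the measured radial velocity is -d · v_ego, which gives one linear equation per point. This is a simplification of the six-degree-of-freedom case described above (rotation is ignored), and the function and variable names are hypothetical rather than part of Aeva's or NVIDIA's software.

    import numpy as np

    def estimate_ego_velocity(points_xyz: np.ndarray,
                              radial_velocities: np.ndarray) -> np.ndarray:
        """Estimate the sensor's translational velocity from per-point radial velocities.

        For a stationary world point at unit direction d from the sensor, the measured
        radial velocity is v_r = -d . v_ego. Stacking one such equation per static point
        yields an overdetermined linear system, solved here by least squares.

        points_xyz:        (N, 3) point positions in the sensor frame, meters
        radial_velocities: (N,)   signed radial velocities, m/s (negative = approaching)
        returns:           (3,)   ego velocity estimate in the sensor frame, m/s
        """
        directions = points_xyz / np.linalg.norm(points_xyz, axis=1, keepdims=True)
        v_ego, *_ = np.linalg.lstsq(directions, -radial_velocities, rcond=None)
        return v_ego

In practice such an estimate would be computed per frame, fused with rotation estimates, and fed back to motion-compensate the point cloud itself, which is how a single sensor can support positioning in GPS-denied environments.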

Aeva’s FMCW technology also offers developers of autonomous vehicles these advantages over legacy LiDAR sensors that use time-of-flight technology:

  • Freedom from interference from sunlight and other LiDAR sensors.

  • Elimination of retroreflector blooming and ghosting from highly reflective objects such as street signs and roadway markings.
