
Renesas Electronics, Dibotics deliver real-time, power-efficient LiDAR processing based on R-Car SoC for autonomous driving; SLAM on a chip

Renesas Electronics Corporation and Dibotics, a leader in real-time 3D LiDAR processing, have collaborated to develop an automotive-grade embedded solution for LiDAR processing used in advanced driver assistance systems (ADAS) and automated driving applications. The jointly developed solution will enable system manufacturers to build real-time 3D mapping systems with high-level functional safety (FuSa) and low power consumption.

LiDAR processing today requires an efficient processing platform and advanced embedded software. By combining Renesas’ high-performance, low-power automotive R-Car system-on-chip (SoC) for image processing with Dibotics’ 3D simultaneous localization and mapping (SLAM) technology, the companies deliver a SLAM on Chip. (SLAM is a computational algorithm that generates and updates a map of an unknown environment while simultaneously keeping track of the vehicle’s location within it.)
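At the core of LiDAR-based SLAM is scan matching: estimating how the vehicle moved between two successive scans by finding the rigid transform that aligns them. The toy sketch below illustrates that single step in 2D using the SVD-based Kabsch method with known point correspondences; it is an illustrative simplification, not Dibotics’ algorithm, and real systems must also establish correspondences (e.g. via ICP) and fuse many such estimates into a map.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src points onto dst points, via the SVD-based Kabsch method.
    Assumes known point correspondences -- a simplification over real
    LiDAR scan matching, which must find correspondences itself."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Simulated 2D LiDAR scan: points along two walls of a corner
scan = np.array([[x, 0.0] for x in np.linspace(0, 5, 20)] +
                [[0.0, y] for y in np.linspace(0.25, 5, 19)])

# The vehicle turned and moved; the next scan sees the same walls
# rotated by 10 degrees and translated by (1.0, 0.5)
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([1.0, 0.5])
next_scan = scan @ R_true.T + t_true

# Recover the ego-motion from the two scans alone -- no IMU, no GPS
R_est, t_est = estimate_rigid_transform(scan, next_scan)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # → True True
```

Chaining such per-scan motion estimates is what lets a LiDAR-only pipeline track the vehicle’s pose while building the map.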

The SLAM on Chip implements 3D SLAM processing on a SoC, a function that used to require a high-performance PC. It also realizes 3D mapping with LiDAR data only, eliminating the need to use inertial measurement units (IMUs) and global positioning system (GPS) data. The collaboration enables a real-time 3D mapping system with low power consumption and high-level functional safety in automotive systems.

As the automotive market prepares for the autonomous-driving era, optimizing the sensor technology required for autonomous vehicles, including real-time, high-definition perception of the environment, precise localization of the vehicle, and real-time sensor fusion, remains a significant challenge.

LiDAR has become a key sensor, providing higher-precision obstacle sensing around the vehicle and real-time electronic control unit (ECU) management for vehicle control compared with alternative methods such as cameras and radar. The rapid increase in the amount of data delivered by new LiDAR sensor technologies is driving a growing need for high-performance real-time processing of all this data.

Unlike existing approaches, Dibotics’ Augmented LiDAR software realizes 3D SLAM technology that only requires data from the LiDAR sensor to achieve 3D mapping. It does not require additional input from IMUs, GPS, or wheel encoders (which measure the rotation of a car wheel to derive the speed of the car), which eliminates extra integration efforts, lowers bill-of-material (BOM) costs and simplifies development.

In addition, the software realizes point-wise classification (automatic classification of each point delivered by the LiDAR without using machine learning, prior knowledge, or map data); detection and tracking of the shape, speed, and trajectory of moving objects; and Multi-LiDAR fusion (merging the point clouds from multiple LiDAR sensors into a single coherent view).
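To make the no-machine-learning claim concrete, point-wise classification can be done purely geometrically. The sketch below labels each point of a cloud as ground or obstacle from its height above an assumed flat ground plane; `ground_z` and `ground_tol` are illustrative parameters of this toy example, not part of any Dibotics or Renesas API.

```python
import numpy as np

def classify_points(points, ground_z=0.0, ground_tol=0.2):
    """Label each LiDAR point (x, y, z) as 'ground' or 'obstacle' using
    geometry alone: its height relative to an assumed flat ground plane
    at z = ground_z, within tolerance ground_tol (meters).
    A deliberately simple stand-in for point-wise classification
    without machine learning, prior knowledge, or map data."""
    near_ground = np.abs(points[:, 2] - ground_z) < ground_tol
    return np.where(near_ground, "ground", "obstacle")

cloud = np.array([[1.0, 2.0, 0.05],    # road surface
                  [3.0, 1.0, 0.10],    # road surface
                  [2.5, 0.5, 1.40]])   # raised object, e.g. another vehicle
print(classify_points(cloud))  # → ['ground' 'ground' 'obstacle']
```

Production systems replace the flat-plane assumption with per-cell or fitted ground models, but the principle of labeling each point from local geometry is the same.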

The high-performance capabilities of the R-Car SoC enable it to run Dibotics’ Augmented LiDAR software. The R-Car offers low power consumption and meets the ISO 26262 (ASIL) FuSa standard for high functional safety. Renesas R-Car is part of the Renesas autonomy Platform for ADAS and automated driving, which delivers end-to-end solutions scaling from cloud to sensing and vehicle control.

Dibotics will demonstrate the Augmented LiDAR solution during CES 2018.

Paris-based Dibotics was founded in 2015 by Raul Bravo and Olivier Garcia and provides a suite of intellectual property, algorithms, and engineering services for mobile autonomy, advanced 3D perception, and situational awareness. The company developed and field-validated a unique 6-degree-of-freedom (6 DoF), sensor-agnostic localization technology that relies exclusively on data from a single sensor: no odometry, IMU, or multi-sensor fusion is needed. The 3D SLAM (6 DoF) technology provides an accurate real-time position over long distances without drift.
