Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors.
Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
Mobileye’s Road Experience Management (REM) is an end-to-end mapping and localization engine. REM comprises three layers: harvesting agents (any camera-equipped vehicle), a map-aggregating server (in the cloud), and map-consuming agents (the autonomous vehicle).
The harvesting agents collect and transmit data about the driving path’s geometry and the stationary landmarks around it. Mobileye’s real-time geometrical and semantic analysis, implemented in the harvesting agent, compresses the map-relevant information, requiring very little communication bandwidth (less than 10 KB per kilometer of driving, on average).
The relevant data is packed into small capsules called Road Segment Data (RSD) and sent to the cloud. The cloud server aggregates and reconciles the continuous stream of RSDs, resulting in a highly accurate and low Time To Reflect Reality (TTRR) map, called “Roadbook”.
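The harvest-aggregate-reconcile flow described above can be sketched in a few lines of code. The `RoadSegmentData` structure, its field names, and the `RoadbookServer` class below are illustrative assumptions for the sketch, not Mobileye’s actual data format or API:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class RoadSegmentData:
    """Hypothetical RSD capsule: compressed path geometry plus
    stationary-landmark descriptors for one road segment."""
    segment_id: int
    path_geometry: list   # sparse polyline of the driven path
    landmarks: list       # (landmark_type, position) tuples

class RoadbookServer:
    """Toy aggregation server: reconciles the incoming RSD stream
    per road segment into a consensus map entry."""
    def __init__(self):
        self.observations = defaultdict(list)

    def ingest(self, rsd: RoadSegmentData):
        # Accumulate RSD reports from many harvesting agents.
        self.observations[rsd.segment_id].append(rsd)

    def reconcile(self, segment_id: int) -> dict:
        # Average landmark positions across reports; a production
        # system would use robust estimation and outlier rejection.
        merged = defaultdict(list)
        for rsd in self.observations[segment_id]:
            for kind, pos in rsd.landmarks:
                merged[kind].append(pos)
        return {kind: sum(ps) / len(ps) for kind, ps in merged.items()}
```

The reconciliation step is what keeps the map’s Time To Reflect Reality low: each new RSD report nudges the consensus toward the road’s current state.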
Mobileye software running within the map-consuming agent (the autonomous vehicle) automatically localizes the vehicle within the Roadbook by real-time detection of all landmarks stored in it.
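The localization step can be illustrated with a deliberately simplified sketch: detected landmarks are matched against the Roadbook, and each match implies where the vehicle must be in the map frame. The function below is a toy stand-in (Mobileye’s actual solver is not public) that averages per-landmark position offsets rather than solving a full 6-DoF pose:

```python
def localize(observed: dict, roadbook: dict, prior: tuple) -> tuple:
    """Toy landmark-based localization.

    observed: landmark positions in the vehicle frame {id: (x, y)}
    roadbook: landmark positions in the map frame     {id: (x, y)}
    prior:    rough GPS-grade position (x, y), used as a fallback

    Returns a refined (x, y) map-frame position. A real system would
    run robust optimization over many landmarks in 6 degrees of freedom.
    """
    dxs, dys = [], []
    for lm_id, (ox, oy) in observed.items():
        if lm_id not in roadbook:
            continue  # landmark not (yet) in the Roadbook
        mx, my = roadbook[lm_id]
        # Where this landmark implies the vehicle sits in the map frame.
        dxs.append(mx - ox)
        dys.append(my - oy)
    if not dxs:
        return prior  # no matches: fall back to the GPS-grade prior
    return (sum(dxs) / len(dxs), sum(dys) / len(dys))
```

Because the landmarks are stored compactly in the Roadbook, this kind of matching gives lane-level accuracy without requiring dense survey-grade maps on board.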
[Figure: Projection of the map data; on the right side the lanes are mapped onto Google Earth. The car is driving based on the map alone. Note the accuracy of the lane markings and landmarks.]
Mobileye is expected to provide a dual set of EyeQ4 system-on-chips (earlier post). The fourth-generation EyeQ4 consists of 14 computing cores out of which 10 are specialized vector accelerators with extremely high utilization for visual processing and understanding.
The chipset will process a full 8-camera surround view system, providing 360-degree visual perception. Consistent with other Mobileye programs, the camera set includes a forward-facing trifocal camera and five additional cameras surrounding the vehicle.
In the EyeQ systems, Mobileye employs proprietary computation cores (known as accelerators) which are optimized for a wide variety of computer-vision, signal-processing, and machine-learning tasks, including deep neural networks. These accelerator cores have been designed specifically to address the needs of the ADAS and autonomous-driving markets.
Each EyeQ chip features heterogeneous, fully programmable accelerators, with each accelerator type optimized for its own family of algorithms. The fully programmable accelerator cores are:
The Vector Microcode Processors (VMP), which debuted in the EyeQ2. The VMP is a VLIW SIMD processor with cheap, flexible memory access; it provides hardware support for operations common in computer-vision applications and is well suited to multi-core scenarios.
The Multithreaded Processing Cluster (MPC) was introduced in the EyeQ4.
The Programmable Macro Array (PMA), which was introduced in the EyeQ4 and reaches its second generation in the EyeQ5. The PMA enables computation density nearing that of fixed-function hardware accelerators without sacrificing programmability.
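To illustrate the style of data-parallel workload these vector cores are built for, here is a horizontal Sobel gradient written as whole-array operations rather than per-pixel loops. This assumes nothing about the EyeQ instruction set; it simply shows the SIMD-friendly computation pattern (one small kernel applied uniformly across an image) that accelerators such as the VMP target:

```python
import numpy as np

def sobel_x(image: np.ndarray) -> np.ndarray:
    """Horizontal Sobel gradient expressed as shifted whole-array
    multiply-accumulates -- the data-parallel pattern that vector
    accelerators execute with very high utilization."""
    padded = np.pad(image.astype(np.float32), 1, mode="edge")
    k = np.array([[-1, 0, 1],
                  [-2, 0, 2],
                  [-1, 0, 1]], dtype=np.float32)
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.float32)
    # Nine shifted views of the padded image, each scaled by one
    # kernel tap; every multiply-accumulate is fully vectorized.
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out
```

On a constant image the gradient is zero everywhere; on a left-to-right intensity ramp the interior response is a constant 8 (the kernel’s column weights sum to 4 on each side, separated by two pixels).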
Mobileye is currently developing its fifth-generation SoC, the EyeQ5, to act as the central vision computer performing sensor fusion for fully autonomous (Level 5) vehicles that will hit the road in 2020. To meet power-consumption and performance targets, EyeQ SoCs are designed in the most advanced VLSI process technology nodes—down to 7nm FinFET in the fifth generation.
In addition, Mobileye will offer Lucid sensor fusion software that combines data from the radar and lidar sensors with the camera set in order to build the environmental model necessary for autonomous driving.
To complete and strengthen the environmental model, Mobileye’s REM system is intended to provide the vehicle with highly accurate localization. Lucid vehicles will benefit from near-real-time updates to the Roadbook, the collaborative, dynamic, cloud-based high-definition map of drivable paths and visual landmarks. (Earlier post.) Data generated by Lucid vehicles can be used to enhance the autonomous driving software and will also contribute to the aggregation of Mobileye’s global Roadbook.
Mobileye’s suite of automated driving technologies represent key elements in the development of automated driving systems in the Lucid Air. Lucid is striving to take a leading position in safety and intuitive usability. We look forward to working with Mobileye on important aspects of achieving these goals.—Peter Rawlinson, CTO of Lucid Motors
High-definition (HD) mapping company HERE and Mobileye recently announced a plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.