Lucid Motors chooses Mobileye as partner for autonomous vehicle technology

Luxury EV developer Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles. Lucid will launch its first car, the Lucid Air (earlier post), with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors.

Lucid chose Mobileye to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
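
The "Driving Policy" item above refers to decision-making learned through reinforcement learning. As a hedged, toy illustration with no relation to Mobileye's actual algorithms, the sketch below uses one-step tabular Q-learning (effectively a bandit over bucketed gap states) to learn a merge decision:

```python
# Toy illustration only: one-step tabular Q-learning for a merge decision.
# The states, actions and rewards are invented for this sketch; Mobileye's
# actual Driving Policy algorithms are not public.
import random

ACTIONS = ["yield", "go"]
q = {(gap, a): 0.0 for gap in range(5) for a in ACTIONS}  # gap bucketed 0..4

def reward(gap, action):
    if action == "yield":
        return -1.0                       # small cost for waiting
    return 10.0 if gap >= 3 else -100.0   # merging into a small gap is a crash

alpha, eps = 0.1, 0.2
rng = random.Random(0)
for _ in range(5000):
    gap = rng.randrange(5)
    a = rng.choice(ACTIONS) if rng.random() < eps else max(ACTIONS, key=lambda x: q[(gap, x)])
    # one-step episode: move the estimate toward the observed reward
    q[(gap, a)] += alpha * (reward(gap, a) - q[(gap, a)])

policy = {gap: max(ACTIONS, key=lambda a: q[(gap, a)]) for gap in range(5)}
print(policy)  # learns: "go" only when the gap is large enough
```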

Mobileye’s Road Experience Management (REM) is an end-to-end mapping and localization engine. REM comprises three layers: harvesting agents (any camera-equipped vehicle), map aggregating server (cloud), and map-consuming agents (autonomous vehicle).

  • The harvesting agents collect and transmit data about the driving path’s geometry and the stationary landmarks around it. Mobileye’s real-time geometric and semantic analysis, implemented in the harvesting agent, compresses the map-relevant information, keeping the required communication bandwidth very small (less than 10 KB/km on average).

  • The relevant data is packed into small capsules called Road Segment Data (RSD) and sent to the cloud. The cloud server aggregates and reconciles the continuous stream of RSDs, resulting in a highly accurate and low Time To Reflect Reality (TTRR) map, called “Roadbook”.

  • Mobileye software running within the map-consuming agent (the autonomous vehicle) automatically localizes the vehicle within the Roadbook by real-time detection of all landmarks stored in it.
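
For concreteness, here is a minimal sketch of the harvest-aggregate-consume flow described above. The capsule format is entirely hypothetical (Mobileye has not published the RSD wire format); the point is only to show why compact, semantic map data can stay under a bandwidth budget like 10 KB/km:

```python
# Hypothetical RSD capsule format, not Mobileye's actual protocol: a
# harvesting agent packs landmark and path-geometry observations for one
# road segment into a small compressed blob for upload to the cloud.
import json
import zlib
from dataclasses import dataclass, asdict

@dataclass
class Landmark:
    kind: str     # e.g. "sign", "pole", "lane_mark"
    x_m: float    # position along the road segment, meters
    y_m: float    # lateral offset, meters

def pack_rsd(segment_id: str, path_points: list, landmarks: list) -> bytes:
    """Compress one segment's map-relevant data into an RSD capsule."""
    payload = {
        "segment": segment_id,
        "path": path_points,                       # sparse geometry samples
        "landmarks": [asdict(l) for l in landmarks],
    }
    return zlib.compress(json.dumps(payload).encode())

# Harvesting agent: observe a 1 km segment and transmit the capsule.
capsule = pack_rsd(
    "seg-0042",
    path_points=[(0.0, 0.0), (500.0, 0.4), (1000.0, 1.1)],
    landmarks=[Landmark("sign", 120.0, -3.5), Landmark("pole", 640.0, 3.2)],
)
print(f"RSD size: {len(capsule)} bytes for 1 km")  # well under a 10 KB/km budget
```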

[Figure: Projection of the map data. On the right, the lanes are mapped onto Google Earth; the car is driving based on the map alone. Note the accuracy of the lane markings and landmarks.]

Mobileye is expected to provide a dual set of EyeQ4 systems-on-chip (earlier post). The fourth-generation EyeQ4 consists of 14 computing cores, 10 of which are specialized vector accelerators with extremely high utilization for visual processing and understanding.

The chipset will process a full 8-camera surround view system, providing full 360-degree visual perception. Consistent with other Mobileye programs, the camera set includes a forward-facing trifocal-lensed camera and an additional five cameras surrounding the vehicle.

In the EyeQ systems, Mobileye employs proprietary computation cores (known as accelerators) which are optimized for a wide variety of computer-vision, signal-processing, and machine-learning tasks, including deep neural networks. These accelerator cores have been designed specifically to address the needs of the ADAS and autonomous-driving markets.

Each EyeQ chip features heterogeneous, fully programmable accelerators, with each accelerator type optimized for its own family of algorithms. The fully programmable accelerator cores are:

  • The Vector Microcode Processors (VMP), which debuted in the EyeQ2. The VMP is a VLIW SIMD processor with cheap, flexible memory access; it provides hardware support for operations common in computer-vision applications and is well suited to multi-core scenarios.

  • The Multithreaded Processing Cluster (MPC) was introduced in the EyeQ4.

  • The Programmable Macro Array (PMA) was introduced in the EyeQ4 and reaches its second-generation implementation in the EyeQ5. The PMA enables computation density nearing that of fixed-function hardware accelerators without sacrificing programmability.
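
The VMP and its sibling cores are programmed with Mobileye's own toolchain, which is not public. As a rough illustration of the kind of data-parallel workload such a VLIW SIMD core targets, the NumPy sketch below computes an image gradient by applying the same arithmetic across every pixel, the access pattern that maps naturally onto SIMD lanes:

```python
# Illustrative only: a central-difference image gradient, the style of
# uniform per-pixel arithmetic that vision accelerators like the VMP
# are built to run in parallel.
import numpy as np

def gradient_magnitude(img: np.ndarray) -> np.ndarray:
    """Approximate edge strength via central differences, vectorized
    across the whole image (one logical SIMD operation per line)."""
    gx = np.zeros_like(img, dtype=np.float32)
    gy = np.zeros_like(img, dtype=np.float32)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal difference, all rows at once
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical difference, all columns at once
    return np.hypot(gx, gy)

frame = np.random.default_rng(0).random((720, 1280)).astype(np.float32)
edges = gradient_magnitude(frame)
print(edges.shape, edges.max())
```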

Mobileye is currently developing its fifth-generation SoC, the EyeQ5, to act as the central vision computer performing sensor fusion for fully autonomous (Level 5) vehicles that are expected to hit the road in 2020. To meet power-consumption and performance targets, EyeQ SoCs are designed in the most advanced VLSI process technology nodes, down to 7 nm FinFET in the fifth generation.

In addition, Mobileye will provide Lucid with sensor fusion software that incorporates data from the radar and lidar sensors, along with the camera set, to build the environmental model necessary for autonomous driving.
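
As a toy illustration of what a fusion step can look like (assuming nothing about Mobileye's actual software), the sketch below merges independent range estimates for one tracked object from camera, radar and lidar by inverse-variance weighting, a standard way to combine sensors with different noise characteristics:

```python
# Toy sensor fusion: combine independent range estimates for the same
# object by inverse-variance weighting. Numbers are invented for the example.
def fuse(estimates):
    """estimates: list of (range_m, variance) pairs from different sensors."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * r for w, (r, _) in zip(weights, estimates)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused estimate is more certain than any input
    return fused, fused_var

# camera is noisy at range, lidar is precise, radar sits in between
readings = [(48.9, 4.0),   # camera
            (50.2, 1.0),   # radar
            (50.0, 0.25)]  # lidar
print(fuse(readings))  # pulled toward the most certain sensor
```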

To complete and strengthen the environmental model, Mobileye’s REM system is intended to give the vehicle highly accurate localization capability. Lucid vehicles will benefit from near-real-time updates to the collaborative, dynamic global Roadbook, a detailed, cloud-based high-definition map of localized drivable paths and visual landmarks that is constantly updated. (Earlier post.) Data generated by Lucid vehicles can be used to enhance the autonomous driving software and will also contribute to the aggregation of Mobileye’s global Roadbook.
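
A hedged sketch of what localization against a Roadbook-style map involves: given landmark positions stored in the map and the distances at which the vehicle currently observes them, estimate the vehicle's position along the segment as the value that best explains all observations. All names here are illustrative, not Mobileye's API:

```python
# Illustrative landmark-based localization: if landmark i sits at map
# position m_i and is observed d_i meters ahead, the vehicle is near
# m_i - d_i; the equal-weight least-squares estimate is the mean.
def localize(map_landmarks_m, observed_offsets_m):
    residuals = [m - d for m, d in zip(map_landmarks_m, observed_offsets_m)]
    return sum(residuals) / len(residuals)

map_landmarks = [120.0, 640.0, 910.0]   # landmark stations in the Roadbook
observed = [19.7, 539.5, 810.2]         # distances the cameras report now
print(f"vehicle at ~{localize(map_landmarks, observed):.1f} m along segment")
```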

Mobileye’s suite of automated driving technologies represents key elements in the development of automated driving systems in the Lucid Air. Lucid is striving to take a leading position in safety and intuitive usability. We look forward to working with Mobileye on important aspects of achieving these goals.

—Peter Rawlinson, CTO of Lucid Motors

High-definition (HD) mapping company HERE and Mobileye recently announced a plan to enter a strategic partnership that links their respective autonomous driving technologies into an enhanced industry-leading offering for automakers.

Comments

HarveyD

Seems to be a good, progressive and comprehensive approach (over 12+ years) to develop, test and use a complete system for future ADVs.

Hope that their EyeQ5 will perform all ADV functions by 2020 or so.

Competitors will certainly follow at the same or a similar pace.

Herman

Horrible, terrible, awful, really really bad idea to use Mobileye. Just dumb and not disruptive and won't work and planet-destroying.

Because TeslaTeslaTeslaTeslaTeslaTeslaTeslaTesla.

Yours,
HenrikChange

Account Deleted

On the contrary, Herman. Lucid is currently a young, sustainable auto startup and they cannot do everything themselves. They use the best third-party suppliers for things they currently have no ability to do better internally. Tesla did the same when they made their first-generation Autopilot hardware; they also used a Mobileye vision sensor. So once again you read me all wrong.

My admiration, and you could say love, for Tesla as an auto company derives from the fact that they are the only sustainable auto company with real sales that brings hope of a better life for future generations. They provide transportation without destroying the planet in the process, and they even make clean energy production possible with their solar products. All the other auto companies are making unsustainable gassers that are among the most important contributors to the rapid destruction of our planet and the reason we are experiencing an accelerating mass extinction event, with over 1,000 times more species going extinct each year than normal. Without a rapid change to 100% sustainable products in everything we consume, there will be very few plants and animals left in 50 years, and the oceans could rise 210 feet if the poles melt completely because of runaway global warming.

What mankind currently does to life on this planet is wrong and evil. We need to stop it and Tesla is showing the way of how to do it for the all important auto industry and the power industry.

Unless a company has a strategy for going 100% sustainable with their production and products they do not have a strategy at all. The same goes for countries. The future has to be built on sustainability like Tesla’s products or we will not have a future that can sustain life on this planet.

HarveyD

Have to agree with CHANGE and HENRIK. The TESLA/Panasonic approach is giving positive results after only 10 years or so.

More REs (production and storage) and more electrified vehicles (BEVs and FCEVs) will eventually help to close current CPPs (followed by NGPPs) and replace current ICEVs.

Many Northern Europe countries (Norway, Denmark etc) and California are leading the way.

Shared electrified ADVs will also contribute by reducing the number of vehicles on roads, streets and highways (after 2020).

SJC

These guys can keep up with NVidia; they are focused on this application and architecture.
