
Solid-state LiDAR company Quanergy raises $90M in Series B; valuation passes $1B

Quanergy Systems, Inc., a leading provider of solid-state LiDAR sensors and smart sensing solutions (earlier post), raised $90 million in Series B funding at a valuation well over $1 billion. Sensata Technologies, Delphi Automotive, Samsung Ventures, Motus Ventures and GP Capital participated in the round. This investment brings the company’s total funds raised to approximately $150 million.

Quanergy intends to use the investment and leverage its intellectual property to work with its partners to ramp up production of its solid-state LiDAR sensors. These sensors are built with standard semiconductor manufacturing processes and have no moving parts on either a macro or a micro scale, offering significantly lower cost, higher reliability, superior performance, increased capability, smaller size and lower weight compared with traditional mechanical sensors (sometimes referred to as hybrid solid-state sensors).

Quanergy will continue the global expansion of the company and scale its operations and infrastructure to meet the growing demand for autonomy in vehicles and other systems that can benefit from increased levels of automation to save lives, save space, save time, save energy and save costs. According to industry forecasts, the LiDAR market is expected to exceed $1 billion by 2020 and $3 billion by 2022.

Innovation in LiDAR technology represents one of the largest opportunities unfolding around the globe, and this infusion of funding will enable us to accelerate development, scale faster and expand our world-class engineering team.

—Dr. Louay Eldada, Quanergy CEO

Since launching in 2012, Quanergy has been the only company to develop a compact, low-cost, automotive-grade solid-state LiDAR sensor. The Quanergy S3 solid state LiDAR was introduced and successfully demonstrated at CES 2016, mounted in a Mercedes-Benz GLE450 AMG coupé.

LiDAR (light detection and ranging) emits light pulses and determines distance from each pulse’s round-trip time and the speed of light. LiDAR is an optical method for measuring distance and speed that is very similar to radar, except that laser pulses are used instead of radio waves.
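As a quick back-of-the-envelope illustration of that time-of-flight principle (generic physics, not Quanergy code; the 667 ns example is an assumed value), the distance is simply the speed of light times the round-trip time, halved:

    # Time-of-flight ranging sketch (illustrative only)
    C = 299_792_458.0  # speed of light, m/s

    def range_from_round_trip(t_seconds: float) -> float:
        """Distance to the target from a pulse's round-trip time."""
        return C * t_seconds / 2.0  # the pulse travels out and back

    # Example: a return after roughly 667 nanoseconds corresponds to ~100 m
    print(f"{range_from_round_trip(667e-9):.1f} m")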

Mechanical LiDAR units use a laser and sensor that are physically moved (e.g., spun in a circle) to build up their view of 3D space.
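Each sample from such a spinning unit is essentially an (azimuth, elevation, range) triple; a minimal sketch of how one sample becomes a 3D point (generic geometry, not any vendor’s code) looks like this:

    import math

    def to_cartesian(azimuth_deg: float, elevation_deg: float, range_m: float):
        """Convert one spinning-LiDAR sample to a Cartesian point."""
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        x = range_m * math.cos(el) * math.cos(az)
        y = range_m * math.cos(el) * math.sin(az)
        z = range_m * math.sin(el)
        return x, y, z

    # A full 360-degree sweep of such samples yields the familiar ring-shaped point cloud.
    print(to_cartesian(45.0, -2.0, 30.0))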

Quanergy’s solid state LiDAR uses an optical phased array as a transmitter, which can steer pulses of light by shifting the phase of a laser pulse as it’s projected through the array.
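The textbook relation for a uniform optical phased array is that a constant phase step between adjacent emitters steers the beam to an angle whose sine is proportional to that step. A hedged sketch of that relation (generic phased-array math, not Quanergy’s design; the 905 nm wavelength and 2 µm emitter pitch are assumed values):

    import math

    def steering_angle_deg(delta_phi_rad: float, wavelength_m: float, pitch_m: float) -> float:
        """Beam steering angle for a phase step delta_phi between adjacent emitters."""
        s = delta_phi_rad * wavelength_m / (2.0 * math.pi * pitch_m)
        return math.degrees(math.asin(s))

    # Example: a quarter-cycle phase step with a 905 nm laser and 2 um pitch
    print(f"{steering_angle_deg(math.pi / 2, 905e-9, 2e-6):.1f} deg")  # ~6.5 deg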


The S3 offers a number of capabilities that are software-controlled in real time:

  • Adjustable window within total available field of view;
  • Arbitrary distribution of points in point cloud; point density within a frame not necessarily uniform (e.g., denser distribution around horizon in vehicle);
  • Random access for maximum SNR at receiver;
  • Largest VFOV (matching the 120° HFOV);
  • Zoom in & out for coarse & fine view;
  • Adjustable frame rate based on situation analysis; and
  • Directional range enhancement based on location in pre-existing map (e.g., maximum forward range on highway, maximum sideways range at intersection).
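As a loose illustration of how run-time controls like those above might be expressed in software (a hypothetical configuration sketch; the field names and values are not Quanergy’s actual API):

    from dataclasses import dataclass

    @dataclass
    class ScanConfig:
        hfov_deg: tuple = (-60.0, 60.0)      # adjustable window within the 120° HFOV
        vfov_deg: tuple = (-10.0, 10.0)      # adjustable vertical window
        frame_rate_hz: float = 10.0          # adjustable based on situation analysis
        horizon_density_boost: float = 2.0   # denser point distribution around the horizon
        forward_range_boost: float = 1.0     # directional range enhancement

    # e.g., a highway profile: narrow the window, relax the frame rate, push range forward
    highway = ScanConfig(hfov_deg=(-30.0, 30.0), frame_rate_hz=5.0, forward_range_boost=2.0)
    print(highway)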

In May 2016, Quanergy unveiled the all-new S3-Qi, a miniature solid-state LiDAR sensor that is 15% the size of the previous S3 model. The S3-Qi advanced the state of the art in solid-state LiDAR technology, setting new benchmarks with a 1" x 1.5" footprint, a weight of about 100 grams, and low power consumption.

“When LiDARs are mission critical, as in autonomous cars, they cannot have moving parts and replace sensors in today’s sensing suite; they must be solid state.”
—Louay Eldada

The small form factor, combined with a cost-effective design, makes the S3-Qi well suited to multiple applications, including drones, intelligent robotics, security, smart homes and industrial automation. Mass production of the S3-Qi is targeted for Q1 2017.

The company is working to commercialize these cost-effective, capable and robust sensors critical for advanced driver assistance systems (ADAS) and autonomous driving applications, and currently has pre-production contracts with multiple global customers for these solid state sensors.

The company’s LiDAR sensors, as well as sensing systems that benefit from its advanced artificial intelligence perception software, are key to improved safety and efficiency in industries ranging from transportation and security to industrial automation and 3D terrestrial and aerial mapping.

Comments

mahonj

Looks like we have a unicorn here.
Solid state Lidar would be a huge benefit, especially where you can vary the scan positions frame by frame (in theory).
I suppose the thing now is to deliver on the promise.
(Hence the cash).

Account Deleted

Lidar is not mission critical for self-driving cars. Stereo cameras can make 3D maps just as Lidar can and in the same detail. Even radar can make 3D maps but not yet in the same detail. Please explain why anyone thinks Lidar is mission critical for autonomous cars. Only advantage I see with Lidar is that it takes less GPU power to create a 3D map using Lidar than using cameras. However, if computing power is not a limiting factor to get the job done then Lidar is not mission critical.

Account Deleted

I may add that for high-definition 3D mapping you would use light-sensitive black-and-white stereo cameras so they can function at night like lidar. Cameras will require more computing power and more advanced software than a lidar sensor making the same quality 3D maps, because the lidar signal is simpler than the camera signal. However, if you have the computing power and have solved the software issues, then lidar has no advantages over cameras, and it costs much more.

HarveyD

Future LIDAR on a chip (MIT) will be so cheap that ADVs will have at least 3 or 4 for different operating conditions, to see better through heavy rain, snow and fog. Ultra-high-speed computing of the sensors' output will not be a major problem with near-future ultra-high-speed CPUs and GPUs from various sources.

It is just a matter of time before a practical (low cost) ADV is in full operation.
