
Ford and Baidu invest $150M in Velodyne LiDAR

Velodyne LiDAR, Inc., a global leader in LiDAR (Light Detection and Ranging) technology, announced the completion of a combined $150 million investment from co-investors Ford Motor Company and China’s leading search engine company Baidu, Inc. The investment will allow Velodyne to rapidly expand the design and production of high-performance, cost-effective automotive LiDAR sensors, accelerating mass adoption in autonomous vehicle and ADAS applications—and with it, the critical, transformative benefits they provide.

Over the last decade, Velodyne developed four generations of hybrid solid-state LiDAR systems incorporating the company’s proprietary software and algorithms that interpret rich data gathered from the environment via highly accurate laser-based sensors to create high-resolution 3D digital images used for mapping, localization, object identification and collision avoidance.

Velodyne’s 3D, real-time LiDAR sensors measure distance via Time of Flight (TOF): the time a short laser pulse takes to travel from the sensor to an object and back, with the distance calculated from the known speed of light.
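The TOF principle above reduces to a simple calculation. The sketch below is a toy illustration of that arithmetic only, not Velodyne’s implementation; the pulse covers the sensor-to-object path twice, so the round-trip time is halved.

```python
# Toy Time-of-Flight range calculation (illustrative only).
C = 299_792_458.0  # speed of light in a vacuum, m/s


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to target from the round-trip time of a laser pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_s / 2.0


# A round trip of ~1.33 microseconds corresponds to roughly 200 m,
# the upper end of the ranges quoted for these sensors.
distance = tof_distance_m(1.334e-6)
```

At centimeter-level accuracy over 200 m, the sensor must resolve timing differences on the order of tens of picoseconds, which is why the timing electronics, not the laser itself, dominate the engineering challenge.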

The vertical field of view is combined with a full 360° horizontal field of view by rotating the laser/detector pairs up to 20 times per second.

In addition to each distance measurement, Velodyne’s LiDAR sensors also measure calibrated reflectivities, which allow for easy detection of retro-reflectors such as street signs, license plates and lane markings.

Velodyne’s LiDAR solutions are capable of producing 300,000 (Puck LITE, VLP-16) to 2.2 million (HDL-64E) data points per second, with a range of up to 200 meters at centimeter-level accuracy. The company’s high-performance LiDAR technology has been recognized by global automotive OEMs and rideshare customers as a critical element enabling the development of fully autonomous vehicles.

LiDAR continues to prove itself as the critical sensor for safe autonomous vehicle operation. This investment will accelerate the cost reduction and scaling of Velodyne’s industry-leading LiDAR sensors, making them widely accessible and enabling mass deployment of fully autonomous vehicles. We are determined to help achieve the goal of automotive safety as soon as possible, as well as to deliver the efficiency autonomous systems offer.

—David Hall, founder and CEO, Velodyne LiDAR

From the very beginning of our autonomous vehicle program, we saw LiDAR as a key enabler due to its sensing capabilities and how it complements radar and cameras. Ford has a long-standing relationship with Velodyne, and our investment is a clear sign of our commitment to making autonomous vehicles available for consumers around the world.

—Raj Nair, Ford Executive Vice President, Product Development and Chief Technical Officer

Baidu also shares Velodyne’s vision to promote safety for autonomous vehicles on a global scale, and in particular in Baidu’s home market in China, where Baidu is already testing its fleet of autonomous vehicles.

Baidu is developing autonomous vehicles with the intention to increase passenger safety and reduce traffic congestion and pollution in China. Our investment will accelerate our efforts in autonomous driving with what, in our view, are the best LiDAR sensors available today and advance Velodyne’s development of increasingly sophisticated LiDAR sensors.

—Jing Wang, Senior Vice President and General Manager of Autonomous Driving Unit of Baidu

Velodyne expects an exponential increase in LiDAR sensor deployments in autonomous vehicles and ADAS applications over the next several years, driving high revenue growth.

To fulfill the high demand for Velodyne’s products, the company will continue to expand its resources across engineering, operations and manufacturing. In connection with this minority investment round, the company plans to expand its board of directors to include two independent industry executives.

Morgan Stanley acted as sole placement agent for this transaction.

Founded in 1983 by David S. Hall, Velodyne Acoustics Inc. first disrupted the premium audio market through Hall’s patented invention of virtually distortionless, servo-driven subwoofers. Hall subsequently leveraged his knowledge of robotics and 3D visualization systems to invent groundbreaking sensor technology for self-driving cars and 3D mapping, introducing the HDL-64 Solid-State Hybrid LiDAR sensor in 2005. By 2007, Hall had transitioned the focus of Velodyne to LiDAR sensors.

Velodyne LiDAR has emerged as the leading supplier of solid-state hybrid LiDAR sensor technology used in a variety of commercial applications including advanced automotive safety systems, autonomous driving, 3D mobile mapping, 3D aerial mapping and security.

The compact, lightweight HDL-32E sensor is available for applications including UAVs, while the VLP-16 LiDAR Puck is a 16-channel LiDAR sensor that is both substantially smaller and significantly less expensive than previous generation sensors.

The company will soon release the “Ultra Puck,” with specifications similar to those of the HDL-64 but in a smaller package, designed specifically for the automotive industry. (Earlier post.)



With enough near-future, lower-cost, more accurate sensors, will ADVs soon become a reality?

How will those sensors work in heavy snow and heavy rain and fog?


Lots of people are working on solid state Lidar so we can expect it to go mainstream "soon", at say < $500 / car.
It will have to be combined with radar and cameras to give a fully detailed view of the world, which will take some time.
You won't be able to get away with a single camera + radar solution as Tesla was originally trying to do - it is a good party trick, but there are too many holes in it; hence lidar, multiple cameras + radar.
There are no prizes for the lowest number of cameras in an AV, only for the safest and one able to drive in the widest range of environments. (And maybe the fastest across towns).

"How will those sensors work in heavy snow and heavy rain and fog?" - a very good question - I imagine the cars will have to slow down and use cm GPS and radar mostly.

People are very "good" at driving when they can barely see (snow / fog) but it would be hard to program a machine to do it without making the risks explicit (and thus refusing to do it).
So I imagine, the machine would slow down or ask the driver to take over (or just pull in). If you could watch a film or work in a stopped car, you might be more patient and wait for the weather to pass.


I suspect actual icing/mud would be a bigger issue than snow or rainfall.

The sensors can be programmed to look for solid objects and look for patterns... Rain and other weather phenomena will obscure some, but not all, of the data points; the car can use programming to know to look beyond the near-field rain. (Police LIDAR guns do the same thing.)

Odds are the vehicle could still have more vision than a human in inclement weather, and possibly even drive at normal/optimum speed for the conditions.

Remember that the car can see all the inputs, not just vision and react accordingly within tenths of a second.

Sadly, it may have more sense than most drivers and try and pull off when certain conditions are met.

Flooded/icy/snow-packed roads are where I see the most trouble, not precipitation - the car being able to know what the road up ahead brings. Is it 2" of snow? Or 10"? Or water, or what have you?
