DARPA program integrates non-mechanical optical scanning tech on microchip; new class of low-cost, miniature LIDAR could support autonomous vehicle apps

DARPA’s Short-range Wide-field-of-view Extremely agile Electronically steered Photonic EmitteR (SWEEPER) program has successfully integrated breakthrough non-mechanical optical scanning technology onto a microchip. Freed from the traditional scanning architecture of gimbaled mounts, lenses and servos, SWEEPER technology has demonstrated that it can sweep a laser back and forth more than 100,000 times per second, 10,000 times faster than current state-of-the-art mechanical systems.

SWEEPER technology can also steer a laser precisely across a 51-degree arc, the widest field of view yet achieved by a chip-scale optical scanning system. These accomplishments could open the door to a new class of miniaturized, extremely low-cost, robust laser-scanning technologies for LIDAR and other uses. Applications can include autonomous vehicle technology.

SWEEPER technology is to be developed further through DARPA’s Electronic-Photonic Heterogeneous Integration (E-PHI) program, which has already successfully integrated billions of light-emitting dots on silicon to create an efficient silicon-based laser.

Current laser-scanning technologies such as LIDAR require mechanical assemblies to sweep the laser back and forth. These large, slow opto-mechanical systems are both temperature- and impact-sensitive and often cost tens of thousands of dollars each—all factors that limit widespread adoption of current technologies for military and commercial use.

By finding a way to steer lasers without mechanical means, we’ve been able to transform what currently is the largest and most expensive part of laser-scanning systems into something that could be inexpensive, ubiquitous, robust and fabricated using the same manufacturing technology as silicon microchips. This wide-angle demonstration of optical phased array technology could lead to greatly enhanced capabilities for numerous military and commercial technologies, including autonomous vehicles, robotics, sensors and high-data-rate communications.

—Josh Conway, DARPA program manager

Phased arrays—engineered surfaces that control the direction of selected electromagnetic signals by varying the phase across many small antennas—have revolutionized radio-frequency (RF) technology by allowing for multiple beams, rapid scanning speeds and the ability to shape the arrays to curved surfaces. DARPA pioneered radar phased array technologies in the 1960s and has repeatedly played a key role in advancing them in the decades since.

Transitioning phased-array techniques from radio frequencies to optical frequencies has proven exceptionally difficult, however, because optical wavelengths are thousands of times smaller than those used in radar. This means that the array elements must be placed within only a few microns of each other and that manufacturing or environmental perturbations as small as 100 nanometers can hurt performance or even sideline the whole array. The SWEEPER technology sidesteps these problems by using a solid-state approach built on modern semiconductor manufacturing processes.
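As a rough illustration (not from the article), the steering behavior described above follows the standard grating relation for a uniform linear array, sin θ = λΔφ / (2πd), where Δφ is the element-to-element phase step and d the element spacing. The Python sketch below uses assumed example values (a 10 GHz radar and a 1550 nm laser with 2 µm pitch) to show both the steering calculation and why a 100 nm perturbation, negligible at radio frequencies, is a large phase error at optical wavelengths:

```python
import math

def steering_angle_deg(wavelength, spacing, phase_step):
    """Far-field steering angle of a uniform linear phased array.

    sin(theta) = wavelength * phase_step / (2 * pi * spacing),
    with phase_step as the element-to-element phase shift in radians.
    """
    s = wavelength * phase_step / (2 * math.pi * spacing)
    return math.degrees(math.asin(s))

# RF radar example (assumed values): 10 GHz, half-wavelength element spacing.
rf_wl = 0.03            # 3 cm wavelength
rf_d = rf_wl / 2        # 1.5 cm element pitch
print(steering_angle_deg(rf_wl, rf_d, math.pi / 4))   # ~14.5 degrees

# Optical example (assumed values): 1550 nm laser, 2 micron element pitch.
opt_wl = 1.55e-6
opt_d = 2e-6
print(steering_angle_deg(opt_wl, opt_d, math.pi / 4))  # ~5.6 degrees

# Phase error caused by a 100 nm path-length perturbation at 1550 nm:
# a defect smaller than a bacterium shifts the phase by ~23 degrees.
phase_err_deg = math.degrees(2 * math.pi * 100e-9 / opt_wl)
print(phase_err_deg)
```

The last figure makes the article's point concrete: at optical wavelengths a 100 nm manufacturing or environmental deviation produces a phase error of roughly 23 degrees per element, enough to degrade or destroy the beam, which is why a lithographically precise solid-state process matters.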

Under SWEEPER funding, four teams of DARPA-funded researchers have used advanced manufacturing techniques to demonstrate optical phased array technology. These performers include the Massachusetts Institute of Technology; the University of California, Santa Barbara; the University of California, Berkeley; and HRL Laboratories.

SWEEPER research is drawing to a close and DARPA is seeking potential transition partners.



Autonomous driving is coming so fast that folks are likely to get run over by it if they are not watching carefully!


A mini-lidar system would be really useful for driving in difficult lighting conditions, like oblique shadows and nighttime. It would be more robust than using stereo vision with repeating patterns and easier to analyse. (It won't work in fog, though.)
You would probably still want radar though (which people are working on as well).



AFAIK all systems will use multiple diverse sensors, and not just rely on one.

Fortunately, as computing power increases rapidly, there should be buckets of it to chuck at analysing the masses of data arising from that approach.


Autonomous driving will throw a bucket load of sensors at the problem.
What is incredible is that humans can do the equivalent just with a colour, high dynamic range stereo visual system.
(I think it is our ability to deal reasonably with uncertainty and just guess things, or just keep going straight on when we can't see where to go.)

The net result of the ability to deal with uncertainty is the ability to drive when we really shouldn't, and the accident statistics.


The sooner human drivers are assisted and/or replaced, the safer our roads, streets, pedestrians, cyclists, dogs, etc. will be.


Multiple much-lower-cost sensors and powerful, high-speed, low-energy-consumption on-board computers are coming out soon for future autonomous vehicles.

Taxi moms will be liberated from driving the kids around. Driverless e-cars or e-UBER units will do it safely and more rapidly.
