2013 Cadillac XTS will feature first GM application of sensor fusion; milestone toward semi- and fully autonomous vehicles
|Autonomous sensor systems for vehicle safety. Source: GM.|
The 2013 Cadillac XTS will feature GM’s first application of a Driver Assistance Package using sensor fusion, which combines information from multiple, generally heterogeneous, sensors and positioning technologies to alert drivers more accurately to road hazards and help them avoid crashes. Sensor fusion and the challenges of its implementation have been topics of interest in the active safety community for a number of years.
The introduction of the advanced active safety and driver assistance system also marks a significant milestone toward the development of self-driving vehicles, according to GM. The system’s use of radar, cameras and ultrasonic sensors enables advanced safety features, including:
|Sensor fusion. Source: GM.|
- Rear Automatic Braking
- Full-Speed Range Adaptive Cruise Control
- Intelligent Brake Assist
- Forward Collision Alert
- Safety Alert Seat
- Automatic Collision Preparation
- Lane Departure Warning
- Side Blind Zone Alert
- Rear Cross Traffic Alert
- Adaptive Forward Lighting
- Rear Vision Camera With Dynamic Guidelines
- Head Up Display
We believe sensor fusion will enable future active safety systems to handle a greater number of inputs to provide 360 degrees of crash risk detection and enhanced driver assist features. A system that combines the strengths of multiple sensing technologies and expertly manages those inputs can provide advisory, warning, and control interventions to help drivers avoid collisions and save lives.—Bakhtiar Litkouhi, GM Research and Development lab group manager for perception and vehicle control systems
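The fusion idea Litkouhi describes can be illustrated with a minimal sketch. This is not GM’s algorithm; it simply shows, with hypothetical numbers, how combining two heterogeneous range sensors (say, radar and a camera) by inverse-variance weighting yields an estimate more accurate than either sensor alone:

```python
# Minimal sensor-fusion sketch (illustrative, not GM's implementation):
# two sensors report the range to a lead vehicle with different noise
# levels; inverse-variance weighting blends them into one estimate
# whose variance is smaller than either input's.

def fuse(measurements):
    """Fuse (value, variance) pairs by inverse-variance weighting."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total  # fused estimate and its reduced variance

# Hypothetical readings: radar is precise in range, the camera less so.
radar = (25.3, 0.04)    # meters, variance in m^2
camera = (24.6, 0.49)
est, var = fuse([radar, camera])
print(f"fused range: {est:.2f} m, variance: {var:.3f} m^2")
```

The fused variance is lower than the best single sensor’s, which is the basic statistical payoff of combining independent measurements.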
Sensor fusion also is a building block in the development of semi-autonomous and fully autonomous vehicles, which are designed to maintain lane position and adapt to traffic environments. More sophisticated self-driving technology that could enable semi- and fully autonomous driving is envisioned to be available by the end of the decade.
|GM and V2X|
|General Motors is also developing vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications (collectively, V2X) systems. (Earlier post.)|
|These systems communicate with devices used by other drivers, pedestrians, bicyclists and roadway infrastructure to provide advance warning about hazards ahead, such as slowed or stalled vehicles, slippery roads, sharp curves or intersections and stop signs.|
|GM has been testing the technology in two mobile platforms: a transponder about the size of a GPS unit and a smartphone application that can be tied to the vehicle’s display unit.|
|The system and underlying architecture enables a smartphone to host a variety of applications and seamlessly integrate with vehicle services. This approach facilitates the deployment of new services without changes to the vehicle architecture, notes Donald Grimm, General Motors Research & Development Center.|
GM’s work on sensor fusion draws on its experience with The Boss, a fully autonomous Chevrolet Tahoe developed by GM, Carnegie Mellon University and other partner companies, and named for GM R&D founder Charles F. “Boss” Kettering. In 2007, The Boss navigated 60 miles of urban traffic, busy intersections and stop signs in less than six hours to win the Defense Advanced Research Projects Agency (DARPA) Urban Challenge competition.
Sensor fusion development also is bolstered by GM’s work on the EN-V, a family of three semi-autonomous electric concept vehicles unveiled at the 2010 Shanghai World Expo. (Earlier post.) By combining GPS with vehicle-to-vehicle communications, distance sensing and object detection, the EN-V can be driven both manually and autonomously; in autonomous mode it can automatically select the fastest route based on real-time traffic information.
Among the technologies GM is looking to develop for future active safety systems is LIDAR, a light detection and ranging technology that measures the distance to a vehicle or object by illuminating it, often with laser pulses, and timing the reflected return. Although LIDAR is no replacement for driver vision, it can serve as another set of eyes when visibility has deteriorated due to inclement weather or darkness. Combined with radar, cameras and ultrasonic sensors, LIDAR could add further crash avoidance capability.
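The time-of-flight principle behind LIDAR ranging reduces to one formula: a pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A tiny sketch with an illustrative round-trip time:

```python
# LIDAR ranges by time of flight: distance = (speed of light * round-trip
# time) / 2, since the pulse covers the distance twice. Numbers below are
# illustrative only.

C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(round_trip_s):
    """Distance in meters for a given pulse round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after ~200 nanoseconds implies a target about 30 m ahead.
print(lidar_range(200e-9))  # ~29.98 m
```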
A more advanced positioning system, using more accurate GPS and digital mapping, also is expected to play an important role in future active safety systems because it helps locate vehicles in relation to one another. While GPS effectiveness can be limited in urban canyons, where high-rise buildings can interfere with satellite signals, the technology is still considered an asset when “fused” with other sensing and positioning technologies.
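Why fusing helps in urban canyons can be sketched with a one-dimensional Kalman-style update (again, not GM’s implementation, and the numbers are hypothetical): when the GPS fix degrades, its reported variance rises, so the blended position estimate automatically leans on the other source, such as dead reckoning.

```python
# Sketch of blending a predicted position (e.g. from dead reckoning)
# with a GPS fix. The Kalman gain k weighs the GPS fix by how much we
# trust it relative to the prediction: noisy GPS -> small k -> the
# estimate stays near the prediction. Values are hypothetical.

def blend(predicted, predicted_var, gps, gps_var):
    """One scalar Kalman update of a position estimate with a GPS fix."""
    k = predicted_var / (predicted_var + gps_var)  # Kalman gain
    est = predicted + k * (gps - predicted)
    return est, (1.0 - k) * predicted_var

# Open sky: a precise GPS fix (variance 1 m^2) pulls the estimate toward it.
open_est, _ = blend(100.0, 4.0, 103.0, 1.0)
# Urban canyon: a noisy fix (variance 400 m^2) barely moves the estimate.
canyon_est, _ = blend(100.0, 4.0, 120.0, 400.0)
print(open_est, canyon_est)
```

The same update, run continuously, is the usual mechanism by which a degraded sensor is discounted rather than discarded.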
No sensor working alone provides all the needed information. That’s why multiple sensors and positioning technologies need to work together synergistically and seamlessly. Sensor fusion will help facilitate that.—Bakhtiar Litkouhi
Altendorfer, R., Wirkert, S., and Heinrichs-Bartscher, S. (2010) “Sensor Fusion as an Enabling Technology for Safety-critical Driver Assistance Systems,” SAE Int. J. Passeng. Cars - Electron. Electr. Syst. 3(2):183-192. doi: 10.4271/2010-01-2339
Zeitler, W. and Wybo, D. (2006) “Enhanced Sensor Fusion for Car Safety Applications,” SAE Technical Paper 2006-01-0598. doi: 10.4271/2006-01-0598