## CMU demo’ing Autonomous SRX in Washington this week

##### 25 June 2014
Sample images of urban driving and screen captures of the Autonomous SRX’s tracking results. The first row shows detection and tracking results from the arrivals area of Pittsburgh International Airport; the two images in the second row show those of an urban street. Source: Cho et al. (2014).

Researchers from Carnegie Mellon University (CMU) will demonstrate the university’s advanced autonomous Cadillac SRX in Washington, DC, this week. The car was brought to Washington at the request of Congressman Bill Shuster of Pennsylvania, who participated in a 33-mile drive in the autonomous vehicle between a Pittsburgh suburb and the city’s airport last September. Scheduled over two days, the demonstration will show how autonomous technology will eventually be fully integrated into vehicles that are currently on the market.

Developed with support from the National Science Foundation (NSF), the US Department of Transportation, DARPA and General Motors, the car is the result of more than a decade of research and development by scientists and engineers at CMU and elsewhere. Their work has advanced the underlying technologies—sensors, software, wireless communications and network integration—required to make sure a vehicle on the road is as safe—and ultimately safer—without a driver than with one. (In the case of the Washington, DC, demonstration, an engineer will be on hand to take the wheel if required.)

> This technology has been enabled by remarkable advances in the seamless blend of computation, networking and control into physical objects—a field known as cyber-physical systems. The National Science Foundation has long supported fundamental research that has built a strong foundation to enable cyber-physical systems to become a reality—like Dr. Raj Rajkumar’s autonomous car.
>
> —Cora Marrett, NSF deputy director

Raj Rajkumar, a professor of electrical and computer engineering and robotics at CMU, is a leader not just in autonomous vehicles, but in the broader field of cyber-physical systems, or CPS. Such systems are already in use in sectors such as agriculture, energy, healthcare and advanced manufacturing, and they are poised to make an impact in transportation as well.

In 2007, Carnegie Mellon’s then state-of-the-art driverless car, BOSS, took home the $2-million grand prize in the DARPA Urban Challenge, which pitted the world’s leading autonomous vehicles against one another in a challenging urban environment. The vehicle Rajkumar is demonstrating in Washington is BOSS’s successor.

BOSS at the Urban Challenge; the Autonomous SRX.

Unlike BOSS, which was rigged with visible antennas and large sensors, CMU’s autonomous Cadillac SRX looks much like any other car on the road. Top-of-the-line radar, cameras, sensors and other technologies are built into the body of the vehicle, and the car’s computers are tucked away under the floor. More specifically, the SRX carries six radars, six LIDARs and three cameras; each radar is paired with a LIDAR at a different height to maximize the reliability and range of measurements. With the current sensor layout, any object within 200 meters falls inside the vehicle’s sensing coverage, and any object within roughly 60 meters is seen by at least two different types of sensors (radar and LIDAR, or radar and camera).

(a) The new autonomous vehicle is designed to minimize alterations to the stock-car appearance while installing multiple sensors to maximize sensing coverage. (b) Visualization of LIDAR measurements; scans acquired from individual sensors are depicted in different colors. (c) Horizontal field of view (HFOV) of the sensing coverage around the vehicle. Source: Cho et al. (2014).

For vision sensors, a forward-looking camera is installed inside the front window next to the rear-view mirror, and another is installed at the rear bumper, providing front and rear perspective images. The third is a thermal camera that captures scenes in the infrared spectrum to perceive objects in challenging driving conditions, such as at night or in fog.
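The coverage rule described above—every object within 200 meters inside at least one sensor's field of view, and every object within about 60 meters seen by two different sensor types—can be sketched as a simple geometric check. This is an illustrative model only; the sensor headings, fields of view and ranges below are assumptions, not CMU's actual layout.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    kind: str           # "radar", "lidar", or "camera"
    heading_deg: float  # center of the field of view; 0 = straight ahead
    hfov_deg: float     # horizontal field of view
    max_range_m: float

def covers(sensor, bearing_deg, range_m):
    """True if an object at (bearing, range) lies inside this sensor's HFOV."""
    if range_m > sensor.max_range_m:
        return False
    # smallest signed angular difference between object bearing and sensor heading
    diff = (bearing_deg - sensor.heading_deg + 180) % 360 - 180
    return abs(diff) <= sensor.hfov_deg / 2

def sensor_types_seeing(sensors, bearing_deg, range_m):
    """Set of distinct sensor types that cover the given object."""
    return {s.kind for s in sensors if covers(s, bearing_deg, range_m)}

# Illustrative suite: a long-range forward radar/LIDAR pair plus
# shorter-range units providing close-in coverage on all sides.
suite = [
    Sensor("radar",   0,  20, 200), Sensor("lidar",   0,  90, 200),
    Sensor("radar",  90, 120,  60), Sensor("lidar",  90, 120,  60),
    Sensor("radar", -90, 120,  60), Sensor("lidar", -90, 120,  60),
    Sensor("radar", 180, 120,  60), Sensor("camera", 180, 60,  60),
    Sensor("camera",  0,  60,  80),
]

print(sensor_types_seeing(suite, 0, 150))   # far ahead: radar + LIDAR
print(sensor_types_seeing(suite, 90, 40))   # close, to the right: redundant coverage
```

Checking redundancy this way—counting distinct sensor *types* rather than individual sensors—matches the article's point that nearby objects are confirmed by two independent sensing modalities.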
All sensors are stock-car grade and readily available on the market. Because multiple wide-FOV sensors overlap, the blind spots are small enough that no vehicle goes undetected.

The tracking system comprises two basic layers: the sensor layer and the fusion layer. Source: Cho et al. (2014).

The goal of CMU’s researchers is to develop a driverless car that can decrease injuries and fatalities on roads. Automotive accidents result in 1.2 million fatalities annually around the world and cost citizens and governments $518 billion. An estimated 90% of those accidents are caused by human error.
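The two-layer structure named in the figure caption can be sketched as follows: a sensor layer turns raw detections into per-sensor tracks, and a fusion layer associates tracks from different sensors that refer to the same physical object. The function names and the 2-meter association gate are illustrative assumptions, not the system's actual design.

```python
def sensor_layer(detections):
    """Per-sensor processing: tag each raw detection with its source sensor."""
    tracks = []
    for sensor, points in detections.items():
        for (x, y) in points:
            tracks.append({"sensor": sensor, "pos": (x, y)})
    return tracks

def fusion_layer(tracks, gate=2.0):
    """Greedy nearest-neighbour association: merge tracks within `gate` meters."""
    fused = []
    for t in tracks:
        for f in fused:
            fx, fy = f["pos"]
            x, y = t["pos"]
            if (fx - x) ** 2 + (fy - y) ** 2 <= gate ** 2:
                f["sensors"].add(t["sensor"])  # same object, another sensor
                break
        else:
            fused.append({"pos": t["pos"], "sensors": {t["sensor"]}})
    return fused

detections = {
    "radar":  [(10.0, 0.2)],
    "lidar":  [(10.3, 0.0)],   # same object, slightly different measurement
    "camera": [(40.0, 5.0)],   # a second, distant object
}
objects = fusion_layer(sensor_layer(detections))
print(len(objects))  # 2 fused objects
```

Keeping sensor-specific processing separate from fusion lets each layer be tested and swapped independently, which is one common motivation for this kind of layered tracking architecture.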

In addition to controlling the steering, speed and braking, the autonomous systems in the vehicle also detect and avoid obstacles in the road, including pedestrians and bicyclists.

In the demonstration in DC, cameras in the vehicle will visually detect the status of traffic lights and respond appropriately. In collaboration with the DC Department of Transportation, the researchers have added a technology that allows some of the traffic lights in the Capitol Hill neighborhood of Washington to wirelessly communicate with the car, telling it the status of the lights ahead.
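The traffic-light communication described above boils down to a simple decision: given the light's current phase and the time until it changes, can the car clear the intersection in time? The sketch below illustrates that logic under assumed inputs; the message fields and thresholds are hypothetical, not the protocol actually used in the demonstration.

```python
def can_proceed(phase, seconds_to_change, distance_m, speed_mps):
    """Decide whether to continue through an upcoming light.

    phase: current signal phase broadcast by the intersection
    seconds_to_change: time until the phase changes
    distance_m, speed_mps: car's distance to the light and current speed
    """
    if speed_mps <= 0:
        return False
    time_to_reach = distance_m / speed_mps
    if phase == "green":
        # proceed only if the car arrives before the light changes
        return time_to_reach < seconds_to_change
    return False  # yellow or red: plan to stop

print(can_proceed("green", 8.0, 50.0, 10.0))  # arrives in 5 s, light holds for 8 s
print(can_proceed("green", 3.0, 50.0, 10.0))  # light changes before arrival
print(can_proceed("red", 30.0, 50.0, 10.0))
```

Receiving the phase wirelessly rather than relying on cameras alone gives the planner advance notice of changes, letting it brake smoothly instead of reacting at the last moment.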

NSF has supported Rajkumar’s work on autonomous vehicles since 2005, but it is not the only project of this kind that NSF supports. In addition to CMU’s driverless car, NSF supports Sentry, an autonomous underwater vehicle deployed at the Woods Hole Oceanographic Institution, and several projects investigating unmanned aerial vehicles (UAVs), including those used in search-and-rescue and disaster-recovery operations. Moreover, NSF supports numerous projects that advance the fundamental theories and applications underlying all autonomous vehicles and other cyber-physical systems.

In the last five years, NSF has invested over $200 million in CPS research and education, building a foundation for the smart systems of the future.

##### Resources