Jaguar Land Rover “Bike Sense” uses color, sound and vibration to help prevent accidents involving bicycles and motorbikes
January 21, 2015
At its Advanced Research Center in the UK, Jaguar Land Rover is developing a range of new technologies it calls “Bike Sense” that use color, sound and touch (vibration) inside the car to alert drivers to potential hazards and prevent accidents involving bicycles and motorbikes. Nearly 19,000 cyclists are killed or injured on UK roads every year.
Sensors on the car will detect when another road user is approaching and identify it as a bicycle or motorbike. Bike Sense will then make the driver aware of the potential hazard before the driver sees it. Rather than using a generic warning icon or sound, which takes time for the driver’s brain to process, Bike Sense uses lights and sounds that the driver will instinctively associate with the potential danger.
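The hazard-specific alerting described above amounts to a lookup from the classified road user to an alert the driver intuitively recognizes. The sketch below illustrates the idea; the specific sound and light choices are assumptions for illustration, not JLR’s actual design.

```python
def bike_sense_alert(hazard_type):
    """Map a detected hazard to an alert the driver will instinctively
    associate with it, rather than a generic warning (illustrative only)."""
    alerts = {
        "bicycle": {"sound": "bicycle_bell", "light": "amber_glow"},
        "motorbike": {"sound": "motorbike_horn", "light": "red_glow"},
    }
    # Fall back to a generic alert for road users the sensors cannot classify.
    return alerts.get(hazard_type, {"sound": "generic_chime", "light": "white_glow"})
```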
New system uses monocular camera instead of expensive laser scanners for automated vehicle navigation with comparable performance
January 20, 2015
A doctoral candidate in computer science and engineering at the University of Michigan has developed a new software system that could cut the high cost of the laser scanners used in self-driving and automated cars. The system enables a vehicle to navigate using a single monocular camera, delivering the same level of accuracy as the laser scanners at a fraction of the cost. The paper detailing the system recently was named best student paper at the Conference on Intelligent Robots and Systems in Chicago.
Ryan Wolcott’s system builds on the navigation systems used in other self-driving cars that are currently in development, including Google’s vehicle. These use three-dimensional laser scanning technology to create a real-time map of their environment, then compare that real-time map to a pre-drawn map stored in the system. By making thousands of comparisons per second, they’re able to determine the vehicle's location within a few centimeters.
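The core map-matching idea can be sketched simply: score a set of candidate poses by how well a synthetic view rendered from the stored prior map matches the live measurement, and keep the best-scoring pose. The sketch below is a deliberately simplified stand-in (one-dimensional “views”, normalized cross-correlation as the score), not Wolcott’s actual algorithm.

```python
import numpy as np

def localize(prior_map, live_view, candidate_poses, render):
    """Return the candidate pose whose synthetic view of the prior map
    best matches the live view, scored by normalized cross-correlation."""
    best_pose, best_score = None, -np.inf
    for pose in candidate_poses:
        synthetic = render(prior_map, pose)  # what the sensor would see from 'pose'
        a = synthetic - synthetic.mean()
        b = live_view - live_view.mean()
        score = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose
```

Real systems repeat this kind of comparison thousands of times per second over a dense set of candidate poses to pin the vehicle down to within a few centimeters.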
Honda introducing first predictive cruise control system in CR-V in Europe
January 09, 2015
Honda will introduce in Europe the first predictive cruise control system, which Honda calls Intelligent Adaptive Cruise Control (i-ACC), capable of foreseeing and automatically reacting to other vehicles cutting in to the equipped vehicle’s lane.
Based on extensive real-world research of typical European driving styles, Honda’s Intelligent Adaptive Cruise Control (i-ACC) uses a camera and radar to sense the position of other vehicles on the road. It then applies an algorithm that predicts the likelihood of vehicles in neighboring lanes cutting in by evaluating the relationships between multiple vehicles, enabling the i-ACC-equipped vehicle to react quickly, safely and comfortably.
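One way to picture such a prediction: project a neighboring vehicle’s lateral drift forward and flag a likely cut-in when it would cross into the ego lane within a short horizon. This is a toy constant-velocity heuristic with invented thresholds, not Honda’s algorithm (which evaluates relationships among multiple vehicles).

```python
def cut_in_likely(lateral_offset_m, lateral_speed_mps,
                  time_horizon_s=2.0, lane_half_width_m=1.75):
    """Flag a probable cut-in if a neighboring vehicle's lateral drift
    would carry it inside the ego lane within the time horizon.
    Geometry: offset is measured from the ego lane center, and positive
    lateral speed means drifting toward the ego lane. Illustrative only."""
    if lateral_speed_mps <= 0:
        return False  # holding its lane or drifting away
    projected_offset = lateral_offset_m - lateral_speed_mps * time_horizon_s
    return projected_offset < lane_half_width_m
```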
Ford announces Smart Mobility plan; 25 initial projects
January 06, 2015
At CES, Ford CEO Mark Fields announced “Ford Smart Mobility”—a plan to use innovation to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience and big data. The initial step is the creation of 25 mobility experiments across the globe designed to help change the way the world moves.
Smart Mobility builds upon Ford’s Blueprint for Mobility (earlier post). As outlined by Ford Motor Company Executive Chairman Bill Ford in his keynote at the 2012 Mobile World Congress in Barcelona, the Blueprint for Mobility defines the start of Ford’s thinking on what transportation will look like in 2025 and beyond, and the technologies, business models and partnerships needed to get there.
NVIDIA introduces DRIVE automotive computers at CES; teraflops of processing for autonomous driving and cockpit visualization
January 05, 2015
At CES in Las Vegas, NVIDIA introduced its DRIVE line of automotive computers, equipped with powerful capabilities for computer vision, deep learning and advanced cockpit visualization. NVIDIA will offer two car computers: NVIDIA DRIVE PX, for developing auto-pilot capabilities, and NVIDIA DRIVE CX, for creating the most advanced digital cockpit systems.
The NVIDIA DRIVE PX auto-pilot development platform provides the technical foundation for cars with completely new features that draw heavily on recent developments in computer vision and deep learning. DRIVE PX leverages the new NVIDIA Tegra X1 mobile super chip, which is built on NVIDIA’s latest Maxwell GPU architecture and delivers more than a teraflop of processing power, giving it more horsepower than the world’s fastest supercomputer of 15 years ago.
BMW to show 360-degree collision avoidance and fully-automated remote parking in multi-story garages at CES
December 15, 2014
[Image: The driver has the i3 park itself in a multi-story garage using a smartwatch.]
BMW will demonstrate new driver assistance and automated control functions including 360-degree collision avoidance and fully-automated parking in multi-story parking garages at the upcoming Consumer Electronics Show (CES) 2015 in January.
The foundation for 360-degree collision avoidance is reliable position and environment recognition; the research vehicle is a BMW i3. Four advanced laser scanners record the environment and reliably identify obstacles such as the columns in a multi-story parking garage. If the vehicle approaches a wall or a column too quickly, the system brakes automatically to avert the impending collision, bringing the vehicle to a standstill very precisely, with centimeters to spare.
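The automatic braking decision can be thought of as a time-to-collision check against each detected obstacle; the sketch below uses an invented threshold and is not BMW’s actual control logic.

```python
def should_brake(distance_m, closing_speed_mps, ttc_threshold_s=1.5):
    """Trigger automatic braking when the time-to-collision with a detected
    obstacle (wall, column) drops below a threshold. Threshold illustrative."""
    if closing_speed_mps <= 0:
        return False  # stationary or moving away from the obstacle
    time_to_collision_s = distance_m / closing_speed_mps
    return time_to_collision_s < ttc_threshold_s
```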
IBM Research and ASELSAN to collaborate on metal-air battery technology, focusing on EVs; mm-wave ICs
November 25, 2014
IBM Research and Turkish defense industry technology company ASELSAN (Askerî Elektronik Sanayii, Military Electronic Industries) have signed collaborative development agreements concerning research and development of metal-air battery technologies and millimeter wave integrated circuits. The companies will work together on these projects, and through these efforts ASELSAN will enhance its in-house research and development activities.
In 2009, IBM and its partners launched a multi-year research initiative specifically exploring rechargeable Li-air systems (one type of metal-air battery): “The Battery 500 Project”. (Earlier post.) The “500” stands for a target range of 500 miles/800 km per charge, which translates into a battery capacity of about 125 kWh at an average use of 250 Wh/mile for a standard family car.
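The capacity figure follows directly from the stated range target and the assumed consumption:

```python
range_miles = 500                # the "500" in Battery 500: miles per charge
consumption_wh_per_mile = 250    # assumed average for a standard family car
capacity_kwh = range_miles * consumption_wh_per_mile / 1000
print(capacity_kwh)              # 125.0 kWh, matching the article's figure
```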
BMW i Ventures makes strategic investment in smartphone driving analytics company Zendrive
The BMW Group’s venture capital company, BMW i Ventures, has made an investment in the startup Zendrive, a company that uses data and analytics gathered from smartphones rather than OBD to improve driving through driving analytics. Related BMW mobility services investments currently include JustPark, Chargepoint, Life360, Chargemaster, and MyCityWay. The Zendrive investment is the first in a series of investment announcements that will be made in the months ahead, said Ulrich Quay, Managing Director of BMW i Ventures, LLC.
Zendrive uses the sensors on a smartphone to measure a driver’s behaviors; Zendrive’s Driver-Centric Analytics process the data from the phone’s sensors into a Zendrive score that factors in cell phone use, speed, swerves, hard stops, fast accelerations, fatigue, as well as weather, trip duration, time of day, and more.
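A toy version of such a scoring function is sketched below: start from a perfect score and deduct per risky event, normalized by trip length. The event types follow the article, but the weights and normalization are invented for illustration; Zendrive’s actual model is proprietary.

```python
def driving_score(events, trip_miles):
    """Toy driver score: 100 minus weighted deductions per risky event,
    scaled to a 10-mile trip. Weights are illustrative assumptions."""
    penalties = {
        "phone_use": 8.0,
        "speeding": 5.0,
        "swerve": 4.0,
        "hard_stop": 3.0,
        "fast_acceleration": 2.0,
    }
    deduction = sum(penalties.get(e, 0.0) for e in events)
    # Normalize so the same events on a longer trip cost less.
    scaled = deduction * (10.0 / max(trip_miles, 1.0))
    return max(0.0, 100.0 - scaled)
```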
New Toshiba image-recognition processors for ADAS; night-time pedestrian detection and 3D reconstruction
November 14, 2014
Toshiba Corporation will expand its line-up of image-recognition processors for automotive applications with the launch of the TMPV760 series. Sample shipments of the first device, TMPV7608XBG, will start in January 2015, with mass production scheduled for December 2016 onwards.
With TMPV7608XBG, Toshiba is supporting the realization of next-generation Advanced Driver Assistance Systems (ADAS). Support for standard ADAS features includes AEB (Autonomous Emergency Braking); TSR (Traffic Sign Recognition); LDW (Lane Departure Warning) and LKA (Lane Keeping Assist); HBA (High Beam Assistance); FCW (Forward Collision Warning); plus new applications that include TLR (Traffic Light Recognition) and AEB pedestrian (during both day and night), which will become part of the Euro NCAP testing program in 2018.
ONR developing offensive autonomous swarming capability for unmanned surface vehicles; adapting JPL’s CARACaS
October 05, 2014
The Office of Naval Research (ONR) is developing an autonomous offensive swarming capability for unmanned surface vehicles (USVs) not only to protect Navy ships, but also, for the first time, to attack hostile vessels.
The technology under development—based on the Control Architecture for Robotic Agent Command and Sensing (CARACaS) developed by NASA’s Jet Propulsion Laboratory (JPL)—can be put into a transportable kit and installed on almost any boat. It allows boats to operate autonomously, without a Sailor physically needing to be at the controls. Capabilities include operating in sync with other unmanned vessels; choosing their own routes; swarming to interdict enemy vessels; and escorting/protecting naval assets.
Audi to demonstrate automated driving technology in Florida
July 26, 2014
Audi will be the first to test its automated driving technology on the Lee Roy Selmon Expressway in Tampa, Florida—which recently was designated as an automated driving and connected car test bed—using an Audi A7 equipped to handle piloted driving functions in freeway conditions at speeds up to 40 mph (64 km/h).
Audi believes this initial version of piloted driving—Traffic Jam Pilot—could be available to consumers within five years. As Audi outlined at CES in 2013 (earlier post), the system is based on the functionality of Audi adaptive cruise control with Stop & Go, extended with the addition of lateral guidance.
IHS: continued legislative focus on pollutants to drive sensor market for internal combustion engines
July 24, 2014
The global market for sensors used in internal combustion engines (ICE) is set for steady growth over the next few years, propelled by increasing utilization in engine management and exhaust aftertreatment, according to a new report from IHS Technology. IHS projects that sensor shipments for ICEs will top 1.34 billion units in 2019, up from about 1.08 billion in 2013. Overall, IHS expects a compound annual growth rate (CAGR) of 3.6% from 2013 to 2019.
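The growth rate can be checked directly from the shipment figures. Using the rounded numbers quoted here gives roughly 3.7%, consistent with IHS’s stated 3.6% from its unrounded data:

```python
units_2013 = 1.08e9          # ~1.08 billion units shipped in 2013
units_2019 = 1.34e9          # projected 1.34 billion units in 2019
years = 2019 - 2013          # six-year span
cagr = (units_2019 / units_2013) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")   # ~3.7% with these rounded shipment figures
```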
The report—“Powertrain Sensor Market Tracker – H1 2014”—is part of the Semiconductors & Components service of IHS Technology. The report examines more than 20 sensors attached to the engine, fuel and exhaust systems of passenger vehicles, including pressure sensors; devices that monitor flow and temperature; ceramic sensors for nitrogen oxides (NOx) and oxygen; and knock, position and speed sensors.
Non-intrusive bio-monitoring system anticipates driver fatigue in the vehicle to prevent accidents
July 23, 2014
The Instituto de Biomecánica de Valencia (Biomechanics Institute - IBV) and its consortium partners in the European project HARKEN have developed a non-intrusive system integrated into smart materials which is capable of monitoring cardiac and respiratory rhythms in order to prevent drivers from falling asleep. The two-year project had its final meeting in June.
The system is based on three main components: the seat sensor, the seat belt sensor and the signal-processing unit (SPU), which processes the sensor data in real time. All are invisible to the user.
Pisa, Deutsche Telekom and Kiunsys launch smart city pilot project to optimize inner city parking as part of ITS; POSSE
June 26, 2014
The Italian city of Pisa and Deutsche Telekom have launched a smart city pilot project to test an intelligent parking system and to analyze historical traffic data via a “big data” service. The system, which will integrate into Pisa’s intelligent transport system (ITS), will help motorists in Pisa find a free parking space more easily and quickly, as well as pay for it via their smart phone.
The city of Pisa worked with Deutsche Telekom and its partner firm Kiunsys to install the new smart city service on Piazza Carrara, located directly on the banks of the river Arno. Wireless Parking Spot Sensors (PSS) embedded in the surface of each parking spot detect whether the space is free or occupied. Several data units collect the information and send it over the mobile network to the city’s server infrastructure. The information is then displayed on indication panels which guide drivers to a free space. The solution is also integrated into Pisa’s existing Tap&Park app, which drivers can download to be guided directly to a free parking space and to pay for it via the app.
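At the server end, the data flow from spot sensors to indication panels amounts to aggregating per-spot occupancy flags into a free-space count. A minimal sketch (the report schema here is an assumption, not the actual Kiunsys interface):

```python
def free_spaces(sensor_reports):
    """Count free spaces from per-spot occupancy reports
    (mapping of spot_id -> occupied flag). Schema is illustrative."""
    return sum(1 for occupied in sensor_reports.values() if not occupied)
```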
CMU demo’ing Autonomous SRX in Washington this week
June 25, 2014
Researchers from Carnegie Mellon University (CMU) this week will demonstrate CMU’s advanced autonomous Cadillac SRX in Washington, DC. The car was brought to Washington at the request of Congressman Bill Shuster of Pennsylvania, who participated in a 33-mile drive in the autonomous vehicle between a Pittsburgh suburb and the city’s airport last September. Scheduled over two days, the demonstration will show how autonomous technology will eventually be fully integrated into vehicles that are currently on the market.
Developed with support from the National Science Foundation (NSF), the US Department of Transportation, DARPA and General Motors, the car is the result of more than a decade of research and development by scientists and engineers at CMU and elsewhere. Their work has advanced the underlying technologies—sensors, software, wireless communications and network integration—required to make sure a vehicle on the road is as safe—and ultimately safer—without a driver than with one. (In the case of the Washington, DC, demonstration, an engineer will be on hand to take the wheel if required.)
Google focusing autonomous driving development on mastering city street driving; patents piling up
April 29, 2014
Over the past year, Google has shifted the focus of its autonomous vehicle project onto mastering city street driving, according to Chris Urmson, Director, Google Self-Driving Car Project. (Urmson was the technical team leader of the CMU team that won the DARPA 2007 Urban Challenge, an autonomous vehicle race.)
Google’s autonomous cars use video cameras; four radar sensors (front, back, left, right); a laser range finder (Velodyne HDL-64E LiDAR) to “see” other traffic; and GPS as well as a wheel encoder and very detailed road maps to determine the precise location of the vehicle. Google says that its autonomous vehicles have logged nearly 700,000 autonomous miles (1.13 million km) over the four years they have been on the road. Since its last public update in 2012, Google says it has logged thousands of miles on the streets of Mountain View, California, Google’s home.
Volvo Cars developing systems for Driver State Estimation; driver sensors
March 17, 2014
Volvo Cars is researching systems that can recognize and distinguish whether a driver is tired or inattentive. By placing a sensor on the dashboard to monitor aspects such as in which direction drivers are looking, how open their eyes are, as well as their head position and angle, it is possible to develop precise safety systems that detect a driver’s state and are able to adjust the car accordingly.
The analysis of the driver’s state, known as Driver State Estimation, in which driver sensors play an important role, is a field that may be key to self-driving cars in the future, Volvo suggests. The car will need to be able to determine for itself whether the driver is capable of taking control when the conditions for driving autonomously are no longer present. A driver sensor could be of assistance in this.
Volvo Car Group tests road-embedded magnets for accurate positioning of self-driving cars
March 11, 2014
Volvo Car Group has completed a research project using magnets embedded in the roadway to help the car determine its position. The research, which has been financed in strategic co-operation with the Swedish Transport Administration (Trafikverket), is a potential means of implementing self-driving vehicles.
Reliable and highly accurate positioning is one of the crucial issues in the development of self-driving cars. While established positioning technologies such as GPS and cameras have limitations in certain conditions, road-integrated magnets remain unaffected by physical obstacles and poor weather conditions. Accordingly, the use of road magnets has attracted some academic research, as well as a number of patents filed on different approaches.
Subaru debuts next-generation EyeSight system; three new ADAS technologies coming this year
January 24, 2014
[Image: The EyeSight system.]
Subaru of America, Inc. has introduced a new and improved version of its EyeSight driver assistance system. The new system now features color stereo cameras which deliver an approximately 40% longer and wider detection range; brake light detection; and full functionality when the speed differential between the EyeSight-equipped car and another vehicle is up to 30 mph (48 km/h), up from 19 mph (31 km/h) previously.
Also debuting in Subaru models later this year are three additional advanced driver assistance system (ADAS) technologies: blind spot detection; lane change assist; and rear cross traffic alert. These new systems will be introduced across Subaru’s product line-up starting in 2014.
Ford kicks off new automated driving research projects with MIT and Stanford University
January 22, 2014
[Image: Ford Fusion Hybrid automated research vehicle with four LiDAR sensors.]
Building on the capabilities of the automated Ford Fusion Hybrid research vehicle unveiled last month (earlier post), Ford is working with the Massachusetts Institute of Technology (MIT) and Stanford University to research and to develop solutions to some of the technical challenges surrounding automated driving.
The MIT research focuses on scenario planning to predict actions of other vehicles and pedestrians, while Stanford is exploring how a vehicle might maneuver to allow its sensors to peek around obstructions. Put another way, the purpose of the MIT project is to enhance the utilization of the line-of-sight data already acquired by the Fusion’s sensors to provide augmented predictive capability, especially for pedestrians. The purpose of the Stanford work is to enhance the acquisition of non-line-of-sight data.
TU München team develops new technique for accurate distance measurement by advanced driver assistance systems using cooperative transponders
January 21, 2014
[Image: Basic concept for range detection using cooperative transponders.]
As part of the “cooperative transponder” research project Ko-TAG (earlier post), researchers at the Technische Universität München (TUM) developed a new approach to distance measurement to enable advanced driver assistance systems (ADAS) in cars to pinpoint the location of pedestrians and cyclists even in non-line-of-sight situations—i.e., when they are hidden from the driver’s view.
In this scheme, pedestrians’ and cyclists’ cell phones serve as transponders. On-board positioning systems compute the projected trajectory of the transponders and initiate an emergency braking sequence if a pedestrian or cyclist moves into the path of the car.
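The trajectory check described above can be sketched as projecting both the car and the transponder forward in time and flagging a collision if they come too close within a short horizon. The constant-velocity motion model and all parameters below are illustrative simplifications, not the Ko-TAG system’s actual algorithm.

```python
def collision_predicted(ped_pos, ped_vel, car_pos, car_vel,
                        horizon_s=3.0, step_s=0.1, danger_radius_m=2.0):
    """Project straight-line trajectories of the car and a transponder
    (pedestrian/cyclist) and flag a collision if they come within the
    danger radius inside the time horizon. Parameters are illustrative."""
    steps = int(horizon_s / step_s) + 1
    for i in range(steps):
        t = i * step_s
        px = ped_pos[0] + ped_vel[0] * t
        py = ped_pos[1] + ped_vel[1] * t
        cx = car_pos[0] + car_vel[0] * t
        cy = car_pos[1] + car_vel[1] * t
        if ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 < danger_radius_m:
            return True  # projected paths intersect: trigger emergency braking
    return False
```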