Sensor fusion, machine learning, and “big data” feature in Ford R&D for advanced driver assistance
At Ford, Paul Mascarenas, vice president and chief technical officer, has been leading the team researching and developing new technologies for Ford vehicles, particularly in the area of driver assistance and mobile device connectivity. Mascarenas points to the new Fusion sedan as an example of “making the car smarter using attainable and affordable technology and thus helping create a better driver.”
However, he suggests that despite its “unprecedented” array of sensors for driver assist technologies, its machine learning techniques that deliver more electric-only driving on the hybrids, and its innovative graphical interfaces that coach drivers to be as fuel-efficient as possible, the Fusion only scratches the surface of what is possible.
With more than 145 actuators, 4,716 signals, and 74 sensors (including radar, sonar, cameras, accelerometers, temperature and even rain sensors), the 2013 Fusion can monitor the perimeter around the car and see into places that are not readily visible from the driver’s seat. These sensors produce more than 25 gigabytes of data per hour, which is analyzed by more than 70 on-board computers. Combined with signal information from the driver assist sensors, the actuators can alert the driver to potential dangers and actively assist with parking and lane keeping.
In the Fusion, we have sensors and actuators that act independently as part of the assist features. The next phase, currently in research, involves sensor fusion [earlier post, earlier post], where engineers learn how to more comprehensively characterize the environment by blending multiple signals, and add externally available information through cloud connectivity.—Paul Mascarenas
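Ford has not published the fusion algorithms behind this research, but the core idea of blending multiple signals can be illustrated with a toy inverse-variance fusion of two independent range estimates (say, radar and camera). All sensor names and numbers below are hypothetical:

```python
def fuse(estimates):
    """Fuse independent (value, variance) estimates by inverse-variance weighting.

    The fused variance is never larger than the best single sensor's,
    which is why blending radar and camera beats either sensor alone.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical readings: radar sees the obstacle at 20.0 m (variance 0.25),
# the camera estimates 21.0 m (variance 1.0).
fused_value, fused_var = fuse([(20.0, 0.25), (21.0, 1.0)])
# fused_value == 20.2, fused_var == 0.2: tighter than either input
```

The fused estimate leans toward the lower-noise radar while still using the camera, and its variance (0.2) beats the radar's own (0.25); production systems extend this idea with Kalman filters that also track motion over time.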
Mascarenas predicts that the top areas for car technology innovation in the coming years will include:
“Big data” analysis and intelligent decision making. Ford is researching the use of real-time sensor data—radar and camera-based—that can help evaluate external factors affecting driver attention, such as traffic congestion, and thus limit potential distractions such as an incoming phone call.
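The decision logic Ford uses here is not public, but the principle of limiting distractions under high driver workload can be sketched as a simple rule over normalized sensor-derived scores. The inputs, weights, and threshold below are illustrative assumptions, not Ford's:

```python
def should_defer_call(traffic_density, steering_activity, threshold=0.6):
    """Crude driver-workload estimate from normalized inputs (each 0..1).

    traffic_density could come from radar/camera object counts,
    steering_activity from the rate of steering corrections. When the
    combined workload exceeds the threshold, non-critical notifications
    such as an incoming phone call are held back.
    """
    workload = 0.5 * traffic_density + 0.5 * steering_activity
    return workload > threshold

# Heavy congestion with frequent steering corrections: hold the call.
assert should_defer_call(0.9, 0.8) is True
# Light traffic on a straight road: let it ring.
assert should_defer_call(0.1, 0.2) is False
```

A real system would learn the weights and threshold from data rather than hard-coding them, but the shape of the decision, sensors in, defer/allow out, is the same.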
Upgradeable, customizable hardware. Ford’s OpenXC research platform looks at the potential for open-source, community-driven innovation of plug-and-play hardware modules that provide infinite opportunities for rapid customization.
Seamless integration across cloud ecosystems. Ford SYNC has an open, agnostic platform strategy that has allowed for adoption and compatibility with the burgeoning mobile ecosystem; the next step is to do the same for the consumer shift toward cloud-based services.
Advanced machine learning. The new Fusion and C-MAX Energi plug-in hybrids utilize EV+, a feature that learns the typical locations of charging, such as home and office, and then automatically maximizes electric-only driving mode when nearing those locations. (Earlier post.)
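EV+'s internals are proprietary, but its observable behavior, recognizing that the car is near a habitual charging location and favoring electric-only mode there, can be sketched as a proximity check against learned charge spots. The coordinates and the 800 m radius below are made up for illustration:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def prefer_ev_mode(position, charge_spots, radius_m=800):
    """True when the car is near a location where it habitually charges."""
    return any(haversine_m(*position, *spot) < radius_m for spot in charge_spots)

home = (42.3149, -83.2130)    # hypothetical learned charge locations
office = (42.3223, -83.1763)
spots = [home, office]

assert prefer_ev_mode((42.3151, -83.2128), spots)      # a block from home
assert not prefer_ev_mode((42.4000, -83.0000), spots)  # far from both
```

In the real feature the charge-spot list would itself be learned by clustering the GPS positions where the car is repeatedly plugged in, rather than entered by hand.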
Biometrics. Ford is researching biometric sensors, such as those embedded in a car seat, to measure driver stress levels and enable a more personalized response from driver assist technologies, because skill levels, and thus stress, can vary from situation to situation.
Prediction. Ford researchers are looking at ways to predict driver behavior, such as a driver’s destination based on prior history, to help optimize and configure vehicle controls for improved performance such as better energy management.
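Ford has not described its prediction models, but "destination based on prior history" can be approximated with simple frequency counting over past trips, conditioned on where and roughly when a trip starts. The class, bucket size, and place names below are illustrative assumptions:

```python
from collections import Counter, defaultdict

class DestinationPredictor:
    """Guess the likely destination from (origin, time-of-day) context
    by frequency counting over recorded trips.
    """
    def __init__(self):
        self.history = defaultdict(Counter)

    def record_trip(self, origin, hour, destination):
        # Bucket the start hour into four 6-hour windows so that, e.g.,
        # all weekday-morning departures from home land in one context.
        self.history[(origin, hour // 6)][destination] += 1

    def predict(self, origin, hour):
        counts = self.history.get((origin, hour // 6))
        return counts.most_common(1)[0][0] if counts else None

p = DestinationPredictor()
for _ in range(5):
    p.record_trip("home", 8, "office")   # five morning commutes
p.record_trip("home", 8, "gym")          # one morning gym run

assert p.predict("home", 7) == "office"  # same 6-hour bucket as 8 a.m.
assert p.predict("gym", 18) is None      # no history for this context
```

Even this crude model is enough to pre-configure energy management for the expected route; a production system would layer on route-level prediction and confidence estimates before acting on a guess.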
Rapid data authentication. Ford sees significant potential in vehicle-to-vehicle communications and is actively researching the technology globally, including advanced Wi-Fi with rapid authentication capability so that cars can exchange information quickly and securely, helping drivers avoid potential collisions. (Earlier post.)
All of these areas of research are well within our reach. The key to readiness and implementation in Ford vehicles is ensuring the customer experience of these technology features trumps the technology itself.—Paul Mascarenas