UC Riverside's reinforcement-learning-based real-time energy management system can improve PHEV efficiency by almost 12%
Researchers at the University of California, Riverside’s Bourns College of Engineering have demonstrated that a new, data-driven, reinforcement-learning-based, real-time energy management system (EMS) can improve the efficiency of current plug-in hybrid electric vehicles (PHEVs) by almost 12% compared to the standard, binary mode control strategy. The UCR EMS optimizes the power-split control in real time while learning the optimal decisions from historical driving cycles.
Further, the ~12% improvement does not factor in charging opportunities. An 8% fuel saving—again, compared to the standard binary mode EMS—can be achieved when charging opportunities are considered, the researchers said in a paper describing the system in the journal Transportation Research Record.
A PHEV’s energy management system (EMS) controls the switch from all-electric (charge-depleting) mode to hybrid (charge-sustaining) mode. As new EMS strategies are developed, an important consideration is combining the power streams from both sources in the most energy-efficient way.
While not all plug-in hybrids work the same way, most start in all-electric mode, running on electricity until their battery pack is depleted and then switching to hybrid mode. Known as binary mode control, this EMS strategy is easy to apply, but isn’t the most efficient way to combine the two power sources, the UCR team said.
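The contrast between the two strategies can be sketched in code. The following is a minimal illustration, not the UCR system: the state-of-charge (SOC) floor, blend ratio, and linear SOC target are all illustrative assumptions.

```python
# Illustrative sketch (not the UCR system): binary mode control versus
# a blended discharge strategy for splitting PHEV power demand.
# soc_floor, the blend ratio, and the linear SOC target are assumptions.

def binary_mode_power_split(soc, power_demand_kw, soc_floor=0.25):
    """Binary mode control: all-electric until the battery floor is
    reached, then hybrid (engine-supplied) for the rest of the trip."""
    if soc > soc_floor:
        return {"battery_kw": power_demand_kw, "engine_kw": 0.0}
    return {"battery_kw": 0.0, "engine_kw": power_demand_kw}

def blended_power_split(soc, power_demand_kw, trip_fraction, soc_floor=0.25):
    """Blended discharge: spend battery energy gradually across the whole
    trip so the pack reaches its floor near the destination."""
    # Target SOC declines linearly with trip progress (an assumption;
    # the real target would be learned or optimized from trip data).
    soc_target = 1.0 - (1.0 - soc_floor) * trip_fraction
    battery_share = 0.8 if soc > soc_target else 0.2
    battery_kw = battery_share * power_demand_kw
    return {"battery_kw": battery_kw, "engine_kw": power_demand_kw - battery_kw}
```

Under binary control the engine supplies all power once the floor is hit, while the blended strategy keeps drawing some battery power throughout the trip.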
In lab tests, blended discharge strategies, in which power from the battery is used throughout the trip, have proven to be more efficient at minimizing fuel consumption and emissions, but until now they haven’t been a realistic option for real-world applications, said Xuewei Qi, a graduate student in the Bourns College of Engineering’s Center for Environmental Research and Technology (CE-CERT) who led the research. Qi is working with CE-CERT Director Matthew Barth, a professor of electrical and computer engineering.
Blended discharge strategies have the ability to be extremely energy efficient, but those proposed previously require upfront knowledge about the nature of the trip, road conditions and traffic information, which in reality is almost impossible to provide.—Xuewei Qi
While the UCR EMS does require trip-related information, it also gathers data in real time using onboard sensors and communications devices, rather than demanding it upfront. It is one of the first systems based on a machine learning technique called reinforcement learning (RL).
In comparison-based tests on a 20-mile commute in Southern California, the UCR EMS outperformed currently available binary mode systems, with average fuel savings of 11.9%. The system gets “smarter” the more it’s used and is not model- or driver-specific—it can be applied to any PHEV driven by any individual.
In our reinforcement learning system, the vehicle learns everything it needs to be energy efficient based on historical data. As more data are gathered and evaluated, the system becomes better at making decisions that will save on energy.—Xuewei Qi
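The learning loop Qi describes can be sketched with a toy Q-learning agent. This is not the published UCR algorithm: the discretized states (SOC bucket, power demand), the three battery-share actions, the fuel-penalty reward, and all hyperparameters are illustrative assumptions.

```python
import random
from collections import defaultdict

# Toy Q-learning sketch of the idea described above (not the published
# UCR algorithm): an agent replays a historical driving cycle and learns
# a power-split policy. States, actions, reward, and the battery model
# are all illustrative assumptions.

ACTIONS = [0.0, 0.5, 1.0]        # fraction of power demand drawn from the battery
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

def q_update(q, state, action, reward, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])

def choose_action(q, state):
    """Epsilon-greedy action selection over the battery-share actions."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(state, a)])

def train(driving_cycle, episodes=200, seed=0):
    """Replay a historical driving cycle repeatedly. Reward is the
    negative engine power share, so the agent learns to favor battery
    power while charge remains."""
    random.seed(seed)
    q = defaultdict(float)
    for _ in range(episodes):
        soc = 10                                   # discretized SOC buckets 0..10
        for i, demand in enumerate(driving_cycle): # discretized power demand per step
            state = (soc, demand)
            action = choose_action(q, state)
            battery = action * demand if soc > 0 else 0.0
            soc = max(0, soc - int(round(battery)))
            reward = -(demand - battery)           # penalize engine (fuel) use
            next_demand = driving_cycle[(i + 1) % len(driving_cycle)]
            q_update(q, state, action, reward, (soc, next_demand))
    return q
```

As the quote suggests, the table of learned values improves with every replayed cycle; the real system would additionally fold in live sensor and communications data at decision time.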
The next phase of the research will focus on creating a cloud-based network that enables PHEVs to work together for even better results.
Our current findings have shown how individual vehicles can learn from their historical driving behavior to operate in an energy efficient manner. The next step is to extend the proposed model to a cloud-based vehicle network where vehicles learn not only from themselves but also from each other. This will enable them to operate on even less fuel and will have a huge impact on the amount of greenhouse gases and other pollutants released.—Xuewei Qi
The work was done by Qi and Barth, together with Guoyuan Wu, assistant research engineer at CE-CERT; Kanok Boriboonsomsin, associate research engineer at CE-CERT; and Jeffrey Gonder, senior engineer at the National Renewable Energy Laboratory in Golden, Colo. The project was partially supported by the US Department of Transportation.
The UCR Office of Technology Commercialization has filed patents for the inventions.
Xuewei Qi, Guoyuan Wu, Kanok Boriboonsomsin, Matthew J. Barth, and Jeffrey Gonder (2016) “Data-Driven Reinforcement Learning–Based Real-Time Energy Management System for Plug-In Hybrid Electric Vehicles.” Transportation Research Record: Journal of the Transportation Research Board 2572: 1–8. doi: 10.3141/2572-01