UC Riverside reinforcement-learning-based real-time energy management system can improve PHEV efficiency by almost 12%
09 February 2016
Researchers at the University of California, Riverside’s Bourns College of Engineering have demonstrated that a new, data-driven, reinforcement-learning-based, real-time energy management system (EMS) can improve the efficiency of current plug-in hybrid electric vehicles (PHEVs) by almost 12% compared to the standard, binary mode control strategy. The UCR EMS optimizes the power-split control in real time while learning the optimal decisions from historical driving cycles.
Further, the ~12% improvement does not factor in charging opportunities. An 8% fuel saving—again, compared to the standard binary mode EMS—can be achieved when charging opportunities are considered, the researchers said in a paper describing the system in the journal Transportation Research Record.
A PHEV’s EMS controls the switch from all-electric (charge-depleting) mode to hybrid (charge-sustaining) mode. As new EMS devices are developed, an important consideration is combining the power streams from both sources in the most energy-efficient way.
While not all plug-in hybrids work the same way, most start in all-electric mode, running on electricity until their battery pack is depleted and then switching to hybrid mode. Known as binary mode control, this EMS strategy is easy to apply, but isn’t the most efficient way to combine the two power sources, the UCR team said.
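The binary mode strategy described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper’s implementation; the SOC threshold and the assumption that one source carries the full load in each mode are simplifications chosen for clarity.

```python
SOC_FLOOR = 0.25  # charge-sustaining threshold (illustrative value)

def binary_mode_split(soc: float, power_demand_kw: float):
    """Return (battery_kw, engine_kw) under binary mode control."""
    if soc > SOC_FLOOR:
        # charge-depleting: the battery supplies all demand
        return power_demand_kw, 0.0
    # charge-sustaining: the engine supplies all demand
    return 0.0, power_demand_kw

print(binary_mode_split(0.80, 30.0))  # high SOC → all-electric: (30.0, 0.0)
print(binary_mode_split(0.20, 30.0))  # depleted → hybrid: (0.0, 30.0)
```

The inefficiency the UCR team targets is visible here: the decision depends only on the current state of charge, never on what the rest of the trip looks like.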
In lab tests, blended discharge strategies, in which power from the battery is used throughout the trip, have proven to be more efficient at minimizing fuel consumption and emissions, but until now they haven’t been a realistic option for real-world applications, said Xuewei Qi, a graduate student in the Bourns College of Engineering’s Center for Environmental Research and Technology (CE-CERT) who led the research. Qi is working with CE-CERT Director Matthew Barth, a professor of electrical and computer engineering.
Blended discharge strategies have the ability to be extremely energy efficient, but those proposed previously require upfront knowledge about the nature of the trip, road conditions and traffic information, which in reality is almost impossible to provide.
—Xuewei Qi
While the UCR EMS does require trip-related information, it also gathers data in real time using onboard sensors and communications devices, rather than demanding it upfront. It is one of the first systems based on a machine learning technique called reinforcement learning (RL).
In comparison-based tests on a 20-mile commute in Southern California, the UCR EMS outperformed currently available binary mode systems, with average fuel savings of 11.9%. The system gets “smarter” the more it’s used and is not model- or driver-specific—it can be applied to any PHEV driven by any individual.
In our reinforcement learning system, the vehicle learns everything it needs to be energy efficient based on historical data. As more data are gathered and evaluated, the system becomes better at making decisions that will save on energy.
—Xuewei Qi
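The learning process Qi describes can be illustrated with a toy tabular Q-learning loop. Everything below is a hypothetical sketch, not the authors’ formulation: the state (a discretized SOC), the action set (the battery’s share of each power demand), the invented 30%-efficient fuel model, and all numeric constants are assumptions chosen to show the mechanism of learning a blended power-split policy from repeated (historical) driving cycles.

```python
import random

random.seed(0)

ACTIONS = [0.0, 0.5, 1.0]          # battery share of the power demand
N_SOC = 10                          # number of SOC buckets
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1  # learning rate, discount, exploration
Q = [[0.0] * len(ACTIONS) for _ in range(N_SOC)]

def soc_bucket(soc):
    """Discretize state of charge into N_SOC buckets."""
    return min(N_SOC - 1, int(soc * N_SOC))

def step(soc, action, demand):
    """Apply one power-split decision; return (new_soc, fuel_used)."""
    batt = min(ACTIONS[action] * demand, soc)  # battery can't go below empty
    fuel = (demand - batt) / 0.30              # engine ~30% efficient (invented)
    return soc - batt, fuel

def train(cycles=2000, trip=None, soc0=0.3):
    """Replay the same drive cycle many times, as if from historical data."""
    trip = trip or [0.02] * 20  # per-segment energy demand (invented units)
    for _ in range(cycles):
        soc = soc0
        for demand in trip:
            s = soc_bucket(soc)
            # epsilon-greedy: mostly exploit the best known action
            if random.random() < EPS:
                a = random.randrange(len(ACTIONS))
            else:
                a = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
            soc, fuel = step(soc, a, demand)
            # reward is negative fuel use; standard Q-learning update
            s2 = soc_bucket(soc)
            Q[s][a] += ALPHA * (-fuel + GAMMA * max(Q[s2]) - Q[s][a])

train()
```

The key property, matching the quote above, is that nothing trip-specific is supplied up front: the table `Q` simply improves as more driving cycles are replayed, and the same loop works for any vehicle or driver whose data it is fed.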
The next phase of the research will focus on creating a cloud-based network that enables PHEVs to work together for even better results.
Our current findings have shown how individual vehicles can learn from their historical driving behavior to operate in an energy efficient manner. The next step is to extend the proposed model to a cloud-based vehicle network where vehicles learn not only from themselves but also from each other. This will enable them to operate on even less fuel and will have a huge impact on the amount of greenhouse gases and other pollutants released.
—Xuewei Qi
The work was done by Qi and Barth, together with Guoyuan Wu, assistant research engineer at CE-CERT; Kanok Boriboonsomsin, associate research engineer at CE-CERT; and Jeffrey Gonder, senior engineer at the National Renewable Energy Laboratory in Golden, Colo. The project was partially supported by the US Department of Transportation.
The UCR Office of Technology Commercialization has filed patents for the inventions.
Resources
Xuewei Qi, Guoyuan Wu, Kanok Boriboonsomsin, Matthew J. Barth, and Jeffrey Gonder (2016) “Data-Driven Reinforcement Learning–Based Real-Time Energy Management System for Plug-In Hybrid Electric Vehicles.” Transportation Research Record: Journal of the Transportation Research Board 2572: 1–8. doi: 10.3141/2572-01
Probably a very low cost way to improve performance of future PHEVs, HEVs, BEVs and FCEVs?
Posted by: HarveyD | 09 February 2016 at 09:34 AM
Harvey, it ain't gonna improve BEVs, because there is no other power source to blend in.
Posted by: Davemart | 09 February 2016 at 09:53 AM
All electrified vehicles, with driver assistance program, could use less energy with the proper computer program?
In other words, computers can mitigate/stop bad/hotheaded driving habits from most human drivers.
Posted by: HarveyD | 09 February 2016 at 11:43 AM
Harvey:
Yeah, computers may reduce fuel use by, for instance, optimising for terrain.
That has not got much to do with this technique though, which is about optimising the balance between two different drive train sources.
Posted by: Davemart | 09 February 2016 at 12:59 PM
Sometimes ambiguous wording leads to something.
If the charging opportunities were whilst plugged in to the grid, there could be another efficiency gain from grid services and low spot pricing.
It wouldn't use less but would utilise it more effectively.
That is well enough understood. If not on the same ledger.
"Further, the ~12% improvement does not factor in charging opportunities. An 8% fuel saving—again, compared to the standard binary mode EMS—can be achieved when charging opportunities are considered,"
Posted by: Arnold | 09 February 2016 at 08:16 PM
So how's this work with congestion zones where you're charged if you don't run in EV?
Posted by: Harrod | 10 February 2016 at 01:31 AM
One has to wonder how anyone could tell, Harrod.
On the other hand, given the learning algorithms and GPS, it doesn't seem too difficult for the charge management algorithm to reserve battery power for the segments in the congestion zone.
Posted by: Engineer-Poet | 10 February 2016 at 06:48 AM
This concept seems like it might work for lower energy capacity parallel PHEVs such as the plug-in Prius, but it would not work for the higher energy capacity and higher power series PHEV Volt. It is obvious that the testing was not done with a Volt, as it would never have needed the ICE during a 20 mile commute. The paper should have stated what vehicles they used, but it seems to be a typical academic project that was not that well thought out.
Posted by: sd | 10 February 2016 at 12:20 PM