New machine learning method from Stanford, with Toyota researchers, could accelerate battery development for EVs
At every stage of the battery development process, new technologies must be tested for months or even years to determine how long they will last. Now, a team led by Stanford professors Stefano Ermon and William Chueh has developed a machine learning-based method that slashes these testing times by 98%.
The study is published in the journal Nature.
Here we develop and demonstrate a machine learning methodology to efficiently optimize a parameter space specifying the current and voltage profiles of six-step, ten-minute fast-charging protocols for maximizing battery cycle life, which can alleviate range anxiety for electric-vehicle users. We combine two key elements to reduce the optimization cost: an early-prediction model, which reduces the time per experiment by predicting the final cycle life using data from the first few cycles, and a Bayesian optimization algorithm, which reduces the number of experiments by balancing exploration and exploitation to efficiently probe the parameter space of charging protocols.
Using this methodology, we rapidly identify high-cycle-life charging protocols among 224 candidates in 16 days (compared with over 500 days using exhaustive search without early prediction), and subsequently validate the accuracy and efficiency of our optimization approach.—Attia et al.
Schematic of the closed-loop optimization (CLO) system. First, batteries are tested. The cycling data from the first 100 cycles (specifically, electrochemical measurements such as voltage and capacity) are used as input for an early outcome prediction of cycle life. These cycle-life predictions from a machine learning (ML) model are subsequently sent to a Bayesian optimization (BO) algorithm, which recommends the next protocols to test by balancing the competing demands of exploration (testing protocols with high uncertainty in estimated cycle life) and exploitation (testing protocols with high estimated cycle life).
This process iterates until the testing budget is exhausted. In this approach, early prediction reduces the number of cycles required per tested battery, while optimal experimental design reduces the number of experiments required. A small training dataset of batteries cycled to failure is used both to train the early outcome predictor and to set BO hyperparameters. In future work, design of battery materials and processes could also be integrated into this closed-loop system.—Attia et al.
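The closed loop described in the caption can be sketched in a few lines of Python. Everything here is a toy stand-in: `PROTOCOLS`, `HIDDEN_LIFE`, and `early_predicted_life` are hypothetical placeholders for real cells, real cycling data, and the paper's trained predictor, and the batch-selection rule is a simplified explore-then-exploit heuristic rather than the authors' actual BO algorithm.

```python
import random

random.seed(0)

# Hypothetical candidate charging profiles and their (unknown-in-practice)
# true mean cycle lives; in the real system these are physical cells.
PROTOCOLS = [f"protocol_{i}" for i in range(24)]
HIDDEN_LIFE = {p: random.gauss(800, 120) for p in PROTOCOLS}

def early_predicted_life(protocol):
    """Stand-in for the ML early-outcome predictor: a noisy estimate of
    final cycle life based on ~100 cycles of data."""
    return random.gauss(HIDDEN_LIFE[protocol], 60)

def closed_loop(budget=48, batch=4):
    """Sketch of the CLO loop: test a batch of protocols, predict their
    cycle lives early, update running estimates, and choose the next batch
    from untested protocols plus the current leaders."""
    est = {}                           # protocol -> running mean estimate
    n = {p: 0 for p in PROTOCOLS}      # protocol -> number of tests so far
    for _ in range(budget // batch):
        untested = [p for p in PROTOCOLS if n[p] == 0]
        chosen = untested[:batch]
        if len(chosen) < batch:        # exploit: re-test current leaders
            leaders = sorted(est, key=est.get, reverse=True)
            chosen += [p for p in leaders if p not in chosen][: batch - len(chosen)]
        for p in chosen:
            y = early_predicted_life(p)
            n[p] += 1
            est[p] = est.get(p, 0.0) + (y - est.get(p, 0.0)) / n[p]
    return max(est, key=est.get)       # best protocol by estimated cycle life

best = closed_loop()
print("best protocol found:", best)
```

The loop structure mirrors the figure: early prediction shortens each experiment, and the selection rule decides which experiments are worth running at all.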
The study was part of a larger collaboration among scientists from Stanford, MIT and the Toyota Research Institute that bridges foundational academic research and real-world industry applications. The goal: finding the best method for charging an EV battery in 10 minutes that maximizes the battery’s overall lifetime.
The researchers wrote a program that, based on only a few charging cycles, predicted how batteries would respond to different charging approaches. The software also decided in real time what charging approaches to focus on or ignore. By reducing both the length and number of trials, the researchers cut the testing process from almost two years to 16 days.
Although the researchers focused on the testing process for extreme fast charging, the method can be applied to many other problems that are holding back battery development for months or years, said Peter Attia, who co-led the study while he was a graduate student.
Designing ultra-fast-charging batteries is a major challenge, mainly because it is difficult to make them last. The intensity of the faster charge puts greater strain on the battery, which often causes it to fail early. To prevent this damage to the battery pack—a component that accounts for a large chunk of an electric car’s total cost—battery engineers must test an exhaustive series of charging methods to find the ones that work best.
The new research sought to optimize this process. At the outset, the team saw that fast-charging optimization amounted to many trial-and-error tests—something that is inefficient for humans, but the perfect problem for a machine.
The team used machine learning’s power to their advantage in two key ways. First, they used it to reduce the time per cycling experiment. In a previous study, the researchers found that instead of charging and discharging every battery until it failed—the usual way of testing a battery’s lifetime—they could predict how long a battery would last after only its first 100 charging cycles. This is because the machine learning system, after being trained on a few batteries cycled to failure, could find patterns in the early data that presaged how long a battery would last.
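As a concrete illustration of the early-prediction idea, the sketch below fits a simple linear model to features summarizing a battery's first 100 cycles and uses it to predict final cycle life. The features, weights, and data here are all synthetic stand-ins; the paper's actual predictor is a regularized model trained on real electrochemical features, which this toy does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: each row summarizes a battery's first 100 cycles
# (e.g., capacity-fade slope, spread of discharge-capacity differences),
# paired with a measured final cycle life from cycling to failure.
n_train = 40
features = rng.normal(size=(n_train, 3))
true_w = np.array([0.5, -1.2, 0.3])
log_cycle_life = features @ true_w + 7.0 + 0.05 * rng.normal(size=n_train)

# Fit a linear model on log cycle life as a proxy for the trained predictor.
X = np.column_stack([features, np.ones(n_train)])
w, *_ = np.linalg.lstsq(X, log_cycle_life, rcond=None)

# Predict for a new battery after only its first 100 cycles.
new_feats = rng.normal(size=3)
pred = np.exp(np.append(new_feats, 1.0) @ w)
print(f"predicted cycle life: {pred:.0f} cycles")
```

The payoff is the same as in the study: a prediction arrives after 100 cycles instead of waiting the hundreds or thousands of cycles it takes a cell to actually fail.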
Second, machine learning reduced the number of methods they had to test. Instead of testing every possible charging method equally, or relying on intuition, the computer learned from its experiences to quickly find the best protocols to test.
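One standard way a computer "learns from its experiences" to choose the next test is an upper-confidence-bound (UCB) rule, sketched below over a handful of hypothetical protocols. This is a generic bandit-style illustration of the exploration/exploitation trade-off, not the specific Bayesian optimization algorithm used in the paper, and the hidden cycle lives and noise levels are invented.

```python
import math
import random

def ucb_select(means, counts, t, c=100.0):
    """UCB rule over a discrete set of protocols: score each one by its
    estimated cycle life plus an exploration bonus that shrinks as the
    protocol accumulates tests. Returns the index to test next."""
    best_i, best_score = None, -math.inf
    for i, (m, n) in enumerate(zip(means, counts)):
        if n == 0:
            return i  # always try an untested protocol first
        score = m + c * math.sqrt(math.log(t) / n)
        if score > best_score:
            best_i, best_score = i, score
    return best_i

# Toy run: 5 candidate protocols with hidden mean cycle lives.
random.seed(1)
hidden = [700, 850, 900, 650, 800]
means, counts = [0.0] * 5, [0] * 5
for t in range(1, 201):
    i = ucb_select(means, counts, t)
    obs = random.gauss(hidden[i], 50)    # noisy early-predicted cycle life
    counts[i] += 1
    means[i] += (obs - means[i]) / counts[i]
print("most-tested protocol:", counts.index(max(counts)))
```

Clearly bad protocols are abandoned after a test or two, while testing effort concentrates on the strongest candidates, which is exactly how the study cut the number of experiments.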
By testing fewer methods for fewer cycles, the study’s authors quickly found an optimal ultra-fast-charging protocol for their battery. In addition to significantly speeding up the testing process, the computer’s solution was also better—and much more unusual—than what a battery scientist would likely have devised, said Ermon. Instead of charging at the highest current at the beginning of the charge, the algorithm’s solution uses the highest current in the middle of the charge.
The researchers said their approach could accelerate nearly every piece of the battery development pipeline: from designing the chemistry of a battery to determining its size and shape, to finding better systems for manufacturing and storage. This would have broad implications not only for electric vehicles but for other types of energy storage, a key requirement for making the switch to wind and solar power on a global scale.
This is a new way of doing battery development. Having data that you can share among a large number of people in academia and industry, and that is automatically analyzed, enables much faster innovation.—Patrick Herring, co-author and a scientist at the Toyota Research Institute
The study’s machine learning and data collection system will be made available for future battery scientists to freely use, Herring added. By using this system to optimize other parts of the process with machine learning, battery development—and the arrival of newer, better technologies—could accelerate by an order of magnitude or more, he said.
This work was supported by Stanford, the Toyota Research Institute, the National Science Foundation, the US Department of Energy and Microsoft.
Attia, P.M., Grover, A., Jin, N. et al. (2020) “Closed-loop optimization of fast-charging protocols for batteries with machine learning.” Nature 578, 397–402. doi: 10.1038/s41586-020-1994-5