New sensor tech for commercial Lithium-ion batteries could support >5x faster charging without compromising safety
Researchers at WMG at the University of Warwick (UK) have developed a method to assess the maximum current for commercial 18650 Li-ion batteries, using novel instrumentation methods enabling in operando measurements.
In an open-access paper in the journal Electrochimica Acta, they reported that the maximum charging current that could be safely applied to the evaluated high-energy cells is 6.7 times higher than the manufacturer-stated maximum. They developed a rapid-charging protocol that delivers a more than five-fold reduction in charging time without compromising the safety limits of the cells. The new technology could enable advances in battery materials science, flexible charging rates, and the thermal and electrical engineering of new battery materials and technologies, and it could aid the design of energy storage systems for high-performance applications.
The most commonly used charging strategy is constant-current constant-voltage (CC-CV), although alternative charging modes are being explored. Unfortunately, these attempts usually lack experimental in-situ thermal measurements or electrode-specific data. Without such data, pushing a cell toward its maximum performance limits greatly increases the risk of internal overheating and, ultimately, catastrophic thermal runaway.
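The CC-CV strategy can be sketched as a simple simulation loop: charge at a fixed current until the terminal voltage reaches its ceiling, then hold that voltage and let the current taper. The cell model below (linear open-circuit voltage, constant internal resistance) and all parameter values are illustrative assumptions, not data from the study.

```python
# Minimal CC-CV charging sketch for an idealised Li-ion cell.
# The toy cell model and every numeric value here are assumptions
# chosen for illustration only.

def cc_cv_charge(capacity_ah=2.5, i_cc=2.5, v_max=4.2, i_cut=0.05,
                 r_int=0.05, dt_s=1.0):
    """Charge at constant current until the terminal voltage hits v_max,
    then hold v_max and let the current taper until it falls below i_cut.
    Returns (total_time_h, final_soc)."""
    soc, t = 0.0, 0.0
    while soc < 1.0:
        ocv = 3.0 + 1.2 * soc            # toy open-circuit voltage model
        v_term = ocv + i_cc * r_int      # terminal voltage under full current
        if v_term < v_max:               # constant-current (CC) phase
            i = i_cc
        else:                            # constant-voltage (CV) phase
            i = (v_max - ocv) / r_int    # current tapers as OCV rises
            if i < i_cut:
                break                    # cut-off current reached
        soc += i * (dt_s / 3600.0) / capacity_ah
        t += dt_s
    return t / 3600.0, soc
```

Raising `i_cc` shortens the CC phase but, in a real cell, also raises the internal heat load and the risk of lithium plating, which is exactly the trade-off the WMG instrumentation is designed to probe.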
The commonly agreed five sources of heat generation in Li-ion cells—electrolyte, anode and cathode resistances (Re, Ra and Rc), and anode and cathode material entropy changes (ΔSa and ΔSc)—can be summarised as Joule (resistive) heating and exothermic reactions. Both phenomena are C-rate related and accelerate when a cell experiences heavy load. As the cell temperature increases, the reaction rates for the decomposition of the electrolyte increase, which can lead to electrolyte breakdown and gas formation, resulting in pressure build-up in the cell.
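The two heat-generation mechanisms above can be written as an irreversible Joule term, I²R over the lumped resistances, and a reversible entropic term, I·T·ΔS/(nF). The sketch below evaluates both; the resistance and entropy values are illustrative assumptions, not measurements from the paper.

```python
# Sketch of the two heat-generation terms: irreversible Joule heating
# across the lumped cell resistances (Re + Ra + Rc) and reversible
# entropic heat from the electrode reactions. All parameter values
# are illustrative assumptions.

FARADAY = 96485.0  # Faraday constant, C/mol

def heat_generation_w(i_a, t_k, r_ohm, d_s_j_per_mol_k, n_electrons=1):
    """Return (joule_w, entropic_w, total_w) for current i_a (A) at
    cell temperature t_k (K), lumped resistance r_ohm (ohm), and
    reaction entropy change d_s_j_per_mol_k (J/mol/K)."""
    joule = i_a ** 2 * r_ohm                                   # always >= 0
    # Reversible entropic heat: Q_rev = I * T * dS / (n F);
    # its sign depends on the electrode chemistry and current direction.
    entropic = i_a * t_k * d_s_j_per_mol_k / (n_electrons * FARADAY)
    return joule, entropic, joule + entropic
```

Because the Joule term scales with I² while the entropic term scales with I, doubling the current quadruples resistive heating but only doubles the reversible heat, which is why resistive heating dominates at the high C-rates relevant to fast charging.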
Additionally, if a cell is charged too fast, lithium metal can electroplate on the anode, which may grow in the form of dendrites and eventually pierce the separator, causing an internal short circuit and subsequent catastrophic failure. This is most pronounced in the case of high-energy cells which, while providing significantly more gravimetric and volumetric energy density than their high-power counterparts, suffer from significantly limited charge rates.—Amietszajew et al.
Overcoming these obstacles requires information about each electrode’s potential along with the surface and internal thermal load responses of the cell.
However, measuring internal temperature (and gaining data on each electrode’s potential) in a battery has proved either impossible or impractical without significantly affecting performance. Hence, manufacturers stipulate a maximum charging rate for batteries based on what they believe to be the crucial temperature and potential limits to avoid.
The researchers at WMG at the University of Warwick have been developing a novel instrumentation design that includes an in-situ reference electrode coupled with an optical fiber temperature sensor. This enables the measurement of each electrode’s potential, supplemented by the cell’s internal and external temperature profiles.
The thermal sensing method uses fiber Bragg Gratings (FBG)—an optical sensor which reflects a wavelength of light that shifts in response to variations in temperature and/or strain. The researchers threaded the fiber through an aluminum tube, forming a strain protection layer; an outer skin of fluorinated ethylene propylene heat-shrink added protection from the electrolyte.
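For a strain-isolated grating, the fractional Bragg wavelength shift is approximately linear in temperature: Δλ/λ_B = (α + ξ)·ΔT, where α is the thermal expansion coefficient and ξ the thermo-optic coefficient of the fibre. The sketch below inverts that relation; the coefficient values are typical figures for silica fibre and are assumptions, not calibration data from the paper.

```python
# Converting a measured FBG wavelength shift to a temperature change,
# assuming the aluminium tube fully decouples the grating from strain.
# Coefficient values are typical for silica fibre (assumed, not from
# the study's calibration).

ALPHA = 0.55e-6   # /K, thermal expansion coefficient of silica (assumed)
XI = 6.7e-6       # /K, thermo-optic coefficient of silica (assumed)

def fbg_delta_t(lambda_bragg_nm, delta_lambda_nm):
    """Temperature change (K) inferred from a Bragg wavelength shift,
    using delta_lambda / lambda_B = (ALPHA + XI) * delta_T."""
    return delta_lambda_nm / (lambda_bragg_nm * (ALPHA + XI))
```

At a typical 1550 nm Bragg wavelength these coefficients give a sensitivity of roughly 11 pm/K, so picometre-resolution interrogation corresponds to sub-degree temperature resolution inside the cell.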
|Schematics of the FBG sensing element embedded into a Li-ion cylindrical cell. Amietszajew et al.|
The result is a device that can have direct contact with all the key parts of the battery and withstand electrical, chemical and mechanical stress inflicted during the operation while still enabling precise temperature and potential readings.
This could bring huge benefits to areas such as motor racing, which would gain obvious advantages from being able to push the performance limits, but it also creates massive opportunities for consumers and energy storage providers. Faster charging always comes at the expense of overall battery life, but many consumers would welcome the ability to charge a vehicle battery quickly when short journey times are required and then to switch to standard charge periods at other times. Having that flexibility in charging strategies might even, further down the line, help consumers benefit from financial incentives from power companies seeking to balance grid supplies using vehicles connected to the grid.
This technology is ready to apply now to commercial batteries, but we would need to ensure that battery management systems on vehicles, and the charging infrastructure being put in place for electric vehicles, are able to accommodate variable charging rates that would include these new, more precisely tuned profiles and limits.—Dr. Tazdin Amietszajew, the WMG researcher who led this research
Co-author WMG Associate Professor Dr. Rohit Bhagat said that the team is confident that similar techniques can also be developed for use in pouch cells.
The work was carried out as part of AMPLIFII, a collaborative research project supported by Innovate UK and the UK Government's Office for Low Emission Vehicles (contract reference 102490). The project consortium includes the University of Warwick (coordinator), Alexander Dennis Limited, Ariel Motor Company Limited, Augean plc, Axion Consulting Limited, Delta Motorsport, HORIBA MIRA Limited, Jaguar Land Rover Limited, JCB Service, Potenza Technology Limited, Trackwise Designs Limited and the University of Oxford.