Keywords
Fading, Battery, Model predictive control, Lithium-ion battery, Degradation, State of charge, Voltage, Computer science, Charge cycle, Lithium, Automotive engineering, Control, Reliability engineering, Engineering, Simulation, Electrical engineering, Power (physics), Telecommunications, Automotive battery, Artificial intelligence, Physics, Operating systems, Quantum mechanics
Authors
Gyuyeong Hwang,Niranjan Sitapure,Jiyoung Moon,H. Lee,Sungwon Hwang,Joseph Kwon
Identifier
DOI:10.1016/j.cej.2022.134768
Abstract
Recently, given the high demand for electric vehicles, the implementation of a battery management system (BMS) for efficient energy use, safety, and state-of-health estimation has garnered significant attention. For a robust BMS, the battery model that supports monitoring and control of battery behaviors such as voltage, temperature, stress, and capacity fade must be highly accurate. Existing battery models such as the single-particle model (SPM) and pseudo-two-dimensional (P2D) models either show a mismatch with experiments or require long computation times, neither of which is conducive to fast BMS control. Furthermore, since existing enhanced SPMs, in conjunction with classical and even advanced control methodologies, can only elucidate empirically estimated inter-cycle capacity fade, they cannot be applied to intra-cycle control of battery charging. To address these concerns, in this work, a new battery model is constructed by integrating the enhanced SPM with first-principles chemical/mechanical degradation physics to accurately predict dynamic intra-cycle capacity fade. Subsequently, the proposed battery model is incorporated into a model predictive control (MPC) framework that manipulates the applied current to minimize capacity fade during battery charging. Overall, the developed framework (a) allowed accurate prediction of both inter-cycle and intra-cycle chemical/mechanical degradation, as well as the state of the battery (i.e., voltage, temperature, and mechanical stress); (b) enabled experimental model validation at different operating conditions; and (c) yielded a superior input current profile that minimized intra-cycle capacity fade compared with the traditional constant current-constant voltage (CC-CV) charging protocol.
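The control idea in the abstract, i.e. a receding-horizon optimizer that picks the charging current to trade off charge progress against degradation, can be illustrated with a deliberately simplified sketch. The real framework couples an enhanced SPM with chemical/mechanical degradation physics; here the battery is a toy coulomb-counting SOC model, the fade penalty is a hypothetical quadratic-in-current term, and every parameter (`capacity`, `k`, `fade_weight`, the candidate currents) is illustrative, not from the paper.

```python
# Toy MPC-style charging controller (illustrative sketch, NOT the
# paper's model): SOC follows simple coulomb counting, and an assumed
# degradation penalty grows with current^2 and with SOC.

def simulate_step(soc, current, dt=1.0, capacity=100.0):
    """Coulomb-counting SOC update; toy `capacity` in A*s, SOC capped at 1."""
    return min(1.0, soc + current * dt / capacity)

def degradation_cost(soc, current, k=1e-3):
    """Hypothetical per-step intra-cycle fade penalty."""
    return k * current**2 * (1.0 + soc)

def mpc_charge_current(soc, horizon=5, candidates=(0.5, 1.0, 2.0, 3.0),
                       fade_weight=1.5):
    """Pick the constant current over the horizon minimizing a cost that
    combines the remaining SOC gap and the accumulated fade penalty."""
    best_i, best_score = candidates[0], float("inf")
    for i in candidates:
        s, fade = soc, 0.0
        for _ in range(horizon):
            fade += degradation_cost(s, i)
            s = simulate_step(s, i)
        score = (1.0 - s) + fade_weight * fade
        if score < best_score:
            best_i, best_score = i, score
    return best_i
```

With these toy numbers the controller behaves CC-CV-like in spirit: at low SOC the charge-gap term dominates and a high current is chosen, while near full charge the SOC-dependent fade penalty dominates and the current tapers, which is the qualitative effect the paper's MPC achieves with a physics-based model.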