Authors
Ting Lu, Xiaoang Zhai, Sihui Chen, Yang Liu, Jiayu Wan, Guohua Liu, Xin Li
Source
Journal: Integration
[Elsevier]
Date: 2024-01-09
Volume/Issue: 96: 102136
Citations: 1
Identifier
DOI: 10.1016/j.vlsi.2023.102136
Abstract
Machine learning technologies have gained significant popularity in rechargeable battery research in recent years, and have been extensively adopted to construct data-driven solutions to multiple challenges for energy storage in embedded computing systems. An important application in this area is machine learning-based battery lifetime prediction, which formulates regression models to estimate the remaining lifetimes of batteries given measurement data collected during the testing process. Due to non-idealities in practical operations, these measurements are usually impacted by various types of interference, introducing noise on both the input variables and the regression labels. Therefore, existing works that focus solely on minimizing the regression error on the labels cannot adequately adapt to practical scenarios with noisy variables. To address this issue, this study adopts total least squares (TLS) to construct a regression model that achieves superior accuracy by simultaneously optimizing the estimation of both variables and labels. Furthermore, because collecting battery cycling data is expensive, the number of labeled data samples available for predictive modeling is often limited. This, in turn, can easily lead to overfitting, especially for TLS, which has a relatively larger set of problem unknowns to solve. To tackle this difficulty, the TLS method is combined with stepwise feature selection in this work. Our numerical experiments on public datasets for commercial lithium-ion batteries demonstrate that the proposed method can reduce the modeling error by up to 11.95% compared against classic baselines under noisy measurements.
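The abstract describes two core ingredients: a total least squares (TLS) fit, which accounts for noise in both the features and the labels, and greedy stepwise feature selection to curb overfitting when labeled samples are scarce. A minimal sketch of both ideas is given below. This is an illustration under our own assumptions (TLS via the SVD of the augmented matrix, forward selection scored by in-sample mean squared error), not the authors' exact implementation; all function names are hypothetical.

```python
import numpy as np

def tls_fit(X, y):
    """Total least squares via SVD of the augmented matrix [X | y].

    Unlike ordinary least squares, TLS models noise on both the input
    variables X and the labels y (errors-in-variables setting): the
    coefficient vector comes from the right singular vector associated
    with the smallest singular value of [X | y].
    """
    n, p = X.shape
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                      # direction of smallest singular value
    return -v[:p] / v[p]            # valid when the last component is nonzero

def forward_select_tls(X, y, max_features):
    """Greedy forward stepwise selection wrapped around the TLS fit.

    A simple guard against overfitting: starting from an empty set,
    repeatedly add the single feature that most reduces the residual,
    up to max_features. (Illustrative scoring; the paper's criterion
    may differ.)
    """
    selected = []
    remaining = list(range(X.shape[1]))
    best_beta = None
    while remaining and len(selected) < max_features:
        scores = []
        for j in remaining:
            cols = selected + [j]
            beta = tls_fit(X[:, cols], y)
            mse = np.mean((X[:, cols] @ beta - y) ** 2)
            scores.append((mse, j, beta))
        _, j, beta = min(scores, key=lambda t: t[0])
        selected.append(j)
        remaining.remove(j)
        best_beta = beta
    return selected, best_beta
```

On a synthetic errors-in-variables problem (noisy copies of the true features), this recovers the informative features and coefficients close to the ground truth, which is the behavior the paper exploits for noisy battery measurements.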