Yao Liu, Bojian Chen, Dong Wang, Lin Kong, Juanjuan Shi, Changqing Shen
Source
Journal: IEEE Transactions on Instrumentation and Measurement [Institute of Electrical and Electronics Engineers] | Date: 2023-01-01 | Volume 72, pp. 1-11 | Cited by: 16
Identifier
DOI: 10.1109/tim.2023.3265739
Abstract
Deep learning (DL)-based fault diagnosis models need sufficient fault information for each fault type to ensure high-precision diagnosis. In actual operating conditions, however, unexpected new fault types inevitably appear; these are called incremental fault types, or class increment. Traditional DL models require the costly collection of all previously known data for retraining, while training only on new fault data may lead to catastrophic forgetting of old tasks. To solve the problem of bearing diagnosis with incremental fault types, a lifelong learning method based on generative feature replay (LLMGFR) is proposed in this study. A feature distillation method is put forward to avoid forgetting in the feature extractor. A generator is trained to produce features of old tasks, and the generated features are mixed with real features of the current task, which effectively solves both the class-imbalance problem and catastrophic forgetting in the classifier. The incremental fault diagnosis cases show that LLMGFR can learn continually and adaptively in dynamic environments with incremental fault types.
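The two mechanisms the abstract names, feature distillation for the extractor and generative replay for the classifier, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the MSE distillation surrogate, the feature dimensions, and all function names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_distillation_loss(new_feats, old_feats):
    """Hypothetical surrogate for the paper's feature distillation term:
    penalize drift of the updated extractor's features away from the
    frozen old extractor's features on the same inputs."""
    return float(np.mean((new_feats - old_feats) ** 2))

def replay_batch(real_feats, real_labels, gen_feats, gen_labels):
    """Mix generator-produced old-task features with real current-task
    features so the classifier trains on a balanced batch, countering
    imbalance and forgetting (the generative feature replay step)."""
    feats = np.concatenate([real_feats, gen_feats], axis=0)
    labels = np.concatenate([real_labels, gen_labels], axis=0)
    idx = rng.permutation(len(labels))          # shuffle the mixed batch
    return feats[idx], labels[idx]

# Hypothetical shapes: 8 real samples of a new fault class (label 2),
# 8 generated samples replaying old classes 0 and 1, 16-D features.
real_x = rng.normal(size=(8, 16)); real_y = np.full(8, 2)
gen_x = rng.normal(size=(8, 16)); gen_y = rng.integers(0, 2, size=8)

batch_x, batch_y = replay_batch(real_x, real_y, gen_x, gen_y)
print(batch_x.shape)  # (16, 16): old and new classes in one batch
```

In the full method the generated features would come from a trained generator conditioned on old fault classes, and the distillation loss would be added to the classifier loss during each incremental training stage.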