Autoregressive model
Robustness (evolution)
Computer science
Transformer
Data mining
Artificial intelligence
Component (thermodynamics)
Machine learning
Reliability engineering
Engineering
Econometrics
Voltage
Mathematics
Physics
Biochemistry
Chemistry
Electrical engineering
Gene
Thermodynamics
Identifier
DOI: 10.1016/j.ress.2023.109306
Abstract
Predictive Maintenance (PdM) plays a pivotal role in safety management by planning necessary maintenance in advance to avoid future serious breakdowns. Predicting the Remaining Useful Life (RUL) from historical running data is an important task in PdM. One challenge of this task is capturing complex temporal and spatial patterns, especially in ultra-long sequences. Recent studies have demonstrated the superiority of the Transformer model in capturing long-term dependencies. However, in the field of PdM, the canonical Transformer is difficult to deploy due to its limited input length, neglect of local correlations, insensitivity to the input pattern, and high computational cost. To tackle this, a novel lightweight RUL prediction model called TCNASA is proposed, integrating a temporal convolutional network (TCN), an accumulative self-attention (ASA) layer, and an autoregressive component. It first uses the TCN to capture local correlations, prunes redundant short-term cases when matching pairs in the attention layers, accumulates global patterns through stacked self-attention layers, and finally integrates an autoregressive component to enhance robustness. Experimental results on several real-world PdM datasets verify the effectiveness and efficiency of the proposed TCNASA model.
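To make the described pipeline concrete, the following is a minimal PyTorch sketch of the overall structure named in the abstract: a causal TCN front-end for local correlations, stacked self-attention layers whose outputs are accumulated through residual connections, and a linear autoregressive branch added to the nonlinear prediction. The class names (`TCNASASketch`, `CausalConvBlock`), the layer sizes, the `ar_window` parameter, and the exact accumulation rule are illustrative assumptions; the paper's specific ASA pruning of redundant short-term attention pairs is not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvBlock(nn.Module):
    """Dilated causal 1-D convolution with a residual connection (generic TCN block)."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-pad only -> causal
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                                # x: (B, C, T)
        y = F.pad(x, (self.pad, 0))                      # pad past, never the future
        return self.act(self.conv(y)) + x                # residual keeps sequence length


class TCNASASketch(nn.Module):
    """Hypothetical sketch of a TCN + stacked self-attention + autoregressive RUL model."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_attn_layers=2, ar_window=16):
        super().__init__()
        self.embed = nn.Conv1d(n_features, d_model, kernel_size=1)
        self.tcn = nn.Sequential(                        # local correlations
            CausalConvBlock(d_model, dilation=1),
            CausalConvBlock(d_model, dilation=2),
            CausalConvBlock(d_model, dilation=4),
        )
        self.attn_layers = nn.ModuleList(
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in range(n_attn_layers)
        )
        self.norms = nn.ModuleList(nn.LayerNorm(d_model) for _ in range(n_attn_layers))
        self.head = nn.Linear(d_model, 1)                # nonlinear branch -> RUL
        self.ar_window = ar_window                       # assumed look-back for the AR branch
        self.ar = nn.Linear(ar_window, 1)                # linear autoregressive branch

    def forward(self, x):                                # x: (B, T, n_features), T >= ar_window
        h = self.tcn(self.embed(x.transpose(1, 2))).transpose(1, 2)   # (B, T, d_model)
        acc = h
        for attn, norm in zip(self.attn_layers, self.norms):
            out, _ = attn(acc, acc, acc)
            acc = norm(acc + out)                        # accumulate global patterns layer by layer
        nonlinear = self.head(acc[:, -1])                # summary of the last time step -> (B, 1)
        ar_in = x.mean(dim=2)[:, -self.ar_window:]       # crude sensor average over recent steps
        linear = self.ar(ar_in)                          # (B, 1)
        return nonlinear + linear                        # combined RUL estimate


if __name__ == "__main__":
    model = TCNASASketch(n_features=14)
    x = torch.randn(8, 120, 14)                          # 8 windows, 120 steps, 14 sensors
    print(model(x).shape)                                # torch.Size([8, 1])
```

The autoregressive branch here is a plain linear map over a recent window of averaged sensor readings, following the common practice of pairing a linear component with a nonlinear network for robustness; the actual feature fed to the paper's autoregressive component may differ.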