Encoder
Convolutional neural network
Computer science
Recurrent neural network
Exploit
Transformer
Artificial intelligence
Kernel (algebra)
Computation
Pattern recognition (psychology)
Machine learning
Deep learning
Artificial neural network
Algorithm
Engineering
Voltage
Operating system
Electrical engineering
Combinatorics
Computer security
Mathematics
Authors
Mo Yu, Qianhui Wu, Xiu Li, Biqing Huang
Identifier
DOI: 10.1007/s10845-021-01750-x
Abstract
Remaining Useful Life (RUL) estimation is a fundamental task in the prognostics and health management (PHM) of industrial equipment and systems. To this end, we propose a novel approach for RUL estimation in this paper, based on a deep neural architecture, motivated by the great success of such architectures in sequence learning. Specifically, we take the Transformer encoder as the backbone of our model to capture short- and long-term dependencies in a time sequence. Compared with convolutional neural network (CNN) based methods, the kernel size imposes no limit on the receptive field, which covers all time steps. Compared with recurrent neural network (RNN) based methods, our model is built on dot-product self-attention, enabling it to fully exploit parallel computation. Moreover, we propose a gated convolutional unit that helps the model incorporate local contexts at each time step, since the attention mechanism used in the Transformer encoder makes the output high-level features insensitive to local contexts. We conduct experiments on the C-MAPSS datasets and show that the performance of our model is superior or comparable to that of existing methods. We also carry out ablation studies and demonstrate the necessity and effectiveness of each component of the proposed model.
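The two ingredients named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the GLU-style sigmoid gating, the identity query/key/value projections, the tensor shapes, and the averaging "head" are all simplifying assumptions, chosen only to show how a gated convolution injects local context before self-attention gives every time step a full receptive field.

```python
import numpy as np

def gated_conv_unit(x, w_a, w_b):
    """GLU-style gated 1-D convolution over time (assumed design).
    x: (T, d) sequence; w_a, w_b: (k, d, d) kernels for the content
    and gate branches. Each output step mixes a local window of size k."""
    T, d = x.shape
    k = w_a.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))        # zero-pad the time axis
    out = np.zeros((T, d))
    for t in range(T):
        win = xp[t:t + k]                        # local context window
        a = np.einsum('kd,kde->e', win, w_a)     # content branch
        b = np.einsum('kd,kde->e', win, w_b)     # gate branch
        out[t] = a * (1.0 / (1.0 + np.exp(-b)))  # sigmoid gate
    return out

def self_attention(x):
    """Scaled dot-product self-attention with identity Q/K/V projections
    (for brevity): every step attends to all steps, so the receptive
    field is the whole sequence regardless of any kernel size."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)  # numerically stable softmax
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

rng = np.random.default_rng(0)
T, d, k = 30, 8, 3                  # window length, feature dim, kernel size
x = rng.normal(size=(T, d))         # one sensor window from a run-to-failure series
h = self_attention(gated_conv_unit(x,
                                   rng.normal(size=(k, d, d)) * 0.1,
                                   rng.normal(size=(k, d, d)) * 0.1))
rul = float(h.mean())               # stand-in scalar head for the RUL estimate
print(h.shape)                      # (30, 8): full receptive field at every step
```

Note that the attention step alone is permutation-insensitive within each window; the gated convolution is what reintroduces the local ordering information the abstract refers to.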