Computer science
Interpretability
Degradation (telecommunications)
Generalization
Data mining
Artificial intelligence
Machine learning
Mathematical analysis
Telecommunications
Mathematics
Authors
Zian Chen,Xiaohang Jin,Ziqian Kong,Feng Wang,Zhengguo Xu
Identifier
DOI:10.1016/j.engappai.2023.106956
Abstract
Data-driven methods routinely achieve promising results in remaining useful life prediction, but under a window-based end-to-end paradigm they suffer from unsatisfactory generalization and low interpretability, as a consequence of neglecting the diverse modes present in the full degradation processes of different entities. This article proposes a novel Transformer-based network that tackles the problem by integrating global and local information. During offline training, paired inputs containing full-life and piece data are constructed; using cross-attention between the encoder and the decoder, the consistent position of the piece data within the full life is then derived, which is directly associated with the degradation state. The designed paired inputs and model architecture ensure strong generalization, because predictions that consider global information adapt to diverse degradation modes. Further, the designed cross-attention discrepancy exploits prior knowledge of the consistent position so that similar degradation states are aligned more properly. This consistent position, visualized through the cross-attention distribution, represents the intuitive relationship between degradation level and monitoring data, thus providing inherent interpretability for the prediction process. Finally, predictions of the online monitoring piece data with respect to all historical full lives with different degradation modes are aggregated into the final prediction. Extensive experiments on a turbofan dataset and a bearing dataset show that our model provides competitive performance, especially under complicated working conditions and fault modes, achieving an average score reduction of 5.9% compared with the state-of-the-art method.
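The following is a minimal sketch, not the authors' implementation, of the core idea stated in the abstract: a short monitoring "piece" window cross-attends to an encoded historical full-life sequence, and the resulting attention distribution over full-life positions is read off as a soft "consistent position" that indicates the degradation state. The module name, dimensions, layer counts, and the final position-to-RUL mapping are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class PiecePositionCrossAttention(nn.Module):
    """Hypothetical sketch: locate a piece window inside a reference full life via cross-attention."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Separate Transformer encoders for the full-life sequence and the piece window (assumed design).
        self.full_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), num_layers=2
        )
        self.piece_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), num_layers=2
        )
        # Cross-attention: piece tokens act as queries over the full-life tokens.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, full_life: torch.Tensor, piece: torch.Tensor):
        # full_life: (B, T_full, d_model), piece: (B, T_piece, d_model); features already embedded.
        full_enc = self.full_encoder(full_life)
        piece_enc = self.piece_encoder(piece)
        _, attn = self.cross_attn(
            piece_enc, full_enc, full_enc, need_weights=True, average_attn_weights=True
        )
        # attn: (B, T_piece, T_full). Average over piece tokens, then take the expected
        # full-life index as a soft "consistent position" of the piece in the full life.
        attn_over_full = attn.mean(dim=1)  # (B, T_full)
        positions = torch.arange(full_life.size(1), device=full_life.device).float()
        consistent_pos = (attn_over_full * positions).sum(dim=-1)  # (B,)
        # Illustrative mapping from relative position to a RUL estimate w.r.t. this reference life.
        rel_pos = consistent_pos / (full_life.size(1) - 1)
        rul = (1.0 - rel_pos) * full_life.size(1)
        return rul, attn_over_full


# Usage sketch: one historical full life and one online monitoring window (randomly embedded here).
model = PiecePositionCrossAttention()
full_life = torch.randn(1, 200, 64)  # reference run-to-failure sequence
piece = torch.randn(1, 30, 64)       # current monitoring window
rul, attention = model(full_life, piece)
```

In the paper's scheme, such per-reference predictions would be aggregated over all historical full lives with different degradation modes, and the attention distribution itself can be visualized to inspect which degradation stage the current window is aligned with.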