Interpretability
Computer Science
Inference
Representation
Computation
Feature
Artificial Intelligence
Sequence
Machine Learning
Transformer
Feature Learning
Pattern Recognition
Data Mining
Algorithm
Engineering
Electrical Engineering
Authors
Lei Ren, Haiteng Wang, Gao Huang
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Pages: 1-11
Citations: 9
Identifiers
DOI: 10.1109/TNNLS.2023.3257038
Abstract
Representation learning-based remaining useful life (RUL) prediction plays a crucial role in improving the security and reducing the maintenance cost of complex systems. Despite their superior performance, the high computational cost of deep networks hinders deploying the models on low-compute platforms. A significant share of this cost comes from computing representations of long sequences. In contrast to most RUL prediction methods, which learn features at a fixed sequence length, we consider that each time series has its own characteristics and that the sequence length should be adjusted adaptively. Our motivation is that an "easy" sample with representative characteristics can be correctly predicted even when only a short feature representation is provided, while "hard" samples need the complete feature representation. Therefore, we focus on sequence length and propose a dynamic length transformer (DLformer) that can adaptively learn sequence representations of different lengths. Then, a feature reuse mechanism is developed to utilize previously learned features and reduce redundant computation. Finally, to achieve dynamic feature representation, a confidence strategy is designed to calculate the confidence level of the prediction results. Regarding interpretability, the dynamic architecture helps humans understand which parts of the model are activated. Experiments on multiple datasets show that DLformer can increase inference speed by up to 90% with less than 5% degradation in model accuracy.
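To make the mechanism concrete, below is a minimal PyTorch sketch of the dynamic-length idea as the abstract describes it: stages that attend over progressively longer sub-sequences, reuse of features encoded at shorter lengths, and a per-stage confidence head that triggers early exit. This is not the authors' implementation; all names here (DynamicLengthRUL, stage_lengths, conf_threshold) are illustrative, and the feature-reuse and confidence details are simplified assumptions.

    # Illustrative sketch only, not the paper's released code.
    import torch
    import torch.nn as nn


    class DynamicLengthRUL(nn.Module):
        def __init__(self, n_features, d_model=64, n_heads=4,
                     stage_lengths=(16, 32, 64), conf_threshold=0.9):
            super().__init__()
            self.stage_lengths = stage_lengths
            self.conf_threshold = conf_threshold
            self.embed = nn.Linear(n_features, d_model)
            # One encoder block per stage; stage i sees the first
            # stage_lengths[i] time steps of the input window.
            self.stages = nn.ModuleList([
                nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
                for _ in stage_lengths
            ])
            # Each stage gets its own RUL head and a confidence head.
            self.rul_heads = nn.ModuleList(
                [nn.Linear(d_model, 1) for _ in stage_lengths])
            self.conf_heads = nn.ModuleList(
                [nn.Linear(d_model, 1) for _ in stage_lengths])

        def forward(self, x):
            # x: (batch, seq_len, n_features); seq_len >= max(stage_lengths).
            h = self.embed(x)
            cached = None  # features reused from the previous, shorter stage
            for i, (length, block) in enumerate(
                    zip(self.stage_lengths, self.stages)):
                chunk = h[:, :length]
                if cached is not None:
                    # Feature reuse (simplified): keep tokens already encoded
                    # by earlier stages and append only the newly added time
                    # steps, instead of recomputing the whole prefix.
                    chunk = torch.cat(
                        [cached, chunk[:, cached.size(1):]], dim=1)
                feats = block(chunk)
                cached = feats
                pooled = feats.mean(dim=1)
                rul = self.rul_heads[i](pooled)
                conf = torch.sigmoid(self.conf_heads[i](pooled))
                # "Easy" samples exit at a short length once confidence is
                # high enough; "hard" samples fall through to the final,
                # full-length stage.
                if not self.training and conf.min() >= self.conf_threshold:
                    return rul, conf, length
            return rul, conf, self.stage_lengths[-1]


    if __name__ == "__main__":
        model = DynamicLengthRUL(n_features=14).eval()
        window = torch.randn(2, 64, 14)  # e.g. 14 sensor channels, 64 steps
        with torch.no_grad():
            rul, conf, used_len = model(window)
        print(rul.shape, conf.shape, used_len)

In this sketch the whole batch exits together once every sample's confidence clears the threshold; per-sample routing, which the easy/hard distinction suggests, would exit confident samples individually while hard ones continue to the full-length stage.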