Keywords
Transformer, Computer science, Distillation, Feature extraction, Encoder, Artificial intelligence, Data mining, Key (lock), Machine learning, Engineering, Chemistry, Computer security, Organic chemistry, Voltage, Electrical engineering, Operating system
Authors
Diju Liu, Yalin Wang, Chenliang Liu, Xiaofeng Yuan, Kai Wang
Source
Journal: IEEE Sensors Journal (Institute of Electrical and Electronics Engineers)
Date: 2023-12-04
Volume/Issue: 24 (2): 1792-1802
Citations: 1
Identifier
DOI: 10.1109/jsen.2023.3336789
Abstract
Multi-step ahead prediction of crucial quality indicators is the cornerstone of optimizing and controlling industrial processes, and accurate prediction over long horizons holds great potential for improving production performance. However, extracting features from long historical sequences remains a significant obstacle. Recent work has shown that transformer networks offer a promising solution to this challenge, yet the standard transformer lacks a sample-simplification mechanism: deep feature extraction is difficult and the computational cost is high, which limits its applicability in industrial processes. To overcome these obstacles and make transformer networks suitable for effective multi-step ahead prediction, this paper proposes a key sample location and distillation transformer network (KSLD-TNet). It first locates key samples with strong interactions using the attention score matrix; non-key samples are then filtered out layer by layer in the KSLD-TNet encoder-decoder structure. In this way, the number of input samples per layer decreases exponentially with depth, significantly reducing both the difficulty and the computational cost of deep feature extraction. An information storage structure is also designed to avoid information loss during sample distillation. Extensive experiments on two industrial process datasets demonstrate the effectiveness of the proposed method.
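Since the abstract describes but does not show the mechanism, below is a minimal PyTorch-style sketch of the two core ideas: ranking time steps by the attention they receive ("key sample location") and dropping low-scoring steps layer by layer while stashing them so nothing is lost (the "information storage structure"). This is an illustrative reconstruction under stated assumptions, not the authors' implementation; all names (`locate_key_samples`, `distill_layer`, `keep_ratio`, `store`) and the column-sum scoring rule are assumptions.

```python
import torch

def locate_key_samples(attn_scores: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    """Rank positions by the attention they receive and keep the top fraction.

    attn_scores: (batch, heads, seq_len, seq_len) softmax attention matrix.
    Returns time-ordered indices of the kept "key" samples, shape (batch, k).
    """
    # Averaging over heads and summing over queries gives, for each key
    # position, the total attention it receives -- a simple proxy for how
    # strongly it interacts with the rest of the sequence.
    interaction = attn_scores.mean(dim=1).sum(dim=1)      # (batch, seq_len)
    k = max(1, int(attn_scores.size(-1) * keep_ratio))
    idx = interaction.topk(k, dim=-1).indices             # (batch, k)
    return idx.sort(dim=-1).values                        # restore time order

def distill_layer(x: torch.Tensor, attn_scores: torch.Tensor,
                  store: list, keep_ratio: float = 0.5) -> torch.Tensor:
    """Keep only key samples for the next layer; stash the dropped ones in
    `store` so they can be re-injected later (a stand-in for the paper's
    information storage structure).

    x: (batch, seq_len, d_model) hidden states of the current layer.
    """
    idx = locate_key_samples(attn_scores, keep_ratio)
    mask = torch.zeros(x.shape[:2], dtype=torch.bool, device=x.device)
    batch_idx = torch.arange(x.size(0), device=x.device).unsqueeze(1)
    mask[batch_idx, idx] = True
    store.append(x[~mask].reshape(x.size(0), -1, x.size(-1)))  # non-key samples
    return x[mask].reshape(x.size(0), -1, x.size(-1))          # key samples only
```

With `keep_ratio = 0.5`, a stack of n such layers sees sequence lengths L, L/2, L/4, ..., so per-layer attention cost shrinks geometrically with depth, which is consistent with the abstract's claim that the number of input samples per layer falls exponentially.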