Transformer
Computer science
Artificial intelligence
Attention network
Data mining
Engineering
Machine learning
Pattern recognition (psychology)
Voltage
Electrical engineering
Authors
Jie Li,Yu Bao,Wenxin Liu,P.S. Ji,Lekang Wang,Zhongbing Wang
Source
Journal: Measurement
[Elsevier]
Date: 2023-10-13
Volume: 223, Article 113687
Cited by: 8
Identifier
DOI: 10.1016/j.measurement.2023.113687
Abstract
Due to the inherent shortcomings of traditional deep models, the Transformer model based on the self-attention mechanism has become popular in the field of fault diagnosis. The Transformer's self-attention mechanism offers an alternative way of thinking, in that it can directly associate every element of a signal with every other. However, it can only capture association information within a single sequence and struggles to model the information differences between samples. Therefore, this paper proposes a two-branch Twins attention, which for the first time uses cross-attention to focus on information associations between samples. Twins attention uses cross-attention to learn associations between samples while retaining the within-sequence associations learned by self-attention. The performance of the proposed model was validated on four popular bearing datasets. Compared with the original Transformer structure, the average accuracy on each dataset improved by 1.73%, reaching 99.42%, and the model also led in the noise experiments.
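Based only on the description in the abstract, the following is a minimal, hypothetical PyTorch sketch of a two-branch attention block in the spirit of Twins attention: one branch applies standard self-attention within each sample's sequence, while the other applies cross-attention across the samples in a batch. All class, function, and variable names here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a two-branch "Twins attention" block, reconstructed
# from the abstract's description only (not the authors' code).
import torch
import torch.nn as nn


class TwinsAttentionBlock(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Branch 1: self-attention over the token dimension of each sample.
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Branch 2: cross-attention over the batch dimension, so each sample
        # can attend to the other samples in the batch.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim), e.g. embedded vibration-signal patches.
        h = self.norm1(x)

        # Intra-sample branch: associations within each sequence.
        intra, _ = self.self_attn(h, h, h)

        # Inter-sample branch: transpose so attention mixes information
        # between samples at the same token position.
        h_t = h.transpose(0, 1)                    # (seq_len, batch, dim)
        inter, _ = self.cross_attn(h_t, h_t, h_t)  # attends across samples
        inter = inter.transpose(0, 1)              # back to (batch, seq_len, dim)

        # Fuse the two branches and add the residual connection.
        out = x + self.fuse(torch.cat([intra, inter], dim=-1))
        return self.norm2(out)


if __name__ == "__main__":
    block = TwinsAttentionBlock(dim=64)
    dummy = torch.randn(8, 32, 64)  # 8 samples, 32 tokens, 64-dim embeddings
    print(block(dummy).shape)       # torch.Size([8, 32, 64])
```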