Computer science
Embedding
Software deployment
Deep learning
Computer engineering
Artificial intelligence
Robustness (evolution)
Transformer
Reliability engineering
Machine learning
Engineering
Electrical engineering
Voltage
Operating system
Gene
Chemistry
Biochemistry
Authors
Hairui Fang, Jin Deng, Yaoxu Bai, Feng Bo, Sheng Li, Siyu Shao, Dongsheng Chen
Source
Journal: IEEE Transactions on Instrumentation and Measurement
[Institute of Electrical and Electronics Engineers]
Date: 2021-12-03
Volume/Issue: 71: 1-8
Citations: 84
Identifier
DOI: 10.1109/tim.2021.3132327
Abstract
As a rising star in the field of deep learning, the Transformer has achieved remarkable results across numerous tasks. Nonetheless, owing to safety considerations, complex environments, and deployment-cost limitations in actual industrial production, fault-diagnosis algorithms often face three challenges: limited samples, noise interference, and the need for lightweight models. These challenges impede the practical use of Transformers for fault diagnosis, since Transformers typically demand large numbers of samples and parameters. For this reason, this article proposes a lightweight Transformer based on convolutional embedding and linear self-attention (LSA), called CLFormer. By modifying the embedding module and the form of self-attention, the model is made lightweight (MFLOPs: 0.12; Params: 4.88 K) while preserving the high accuracy of the Transformer. Its effectiveness was demonstrated on a self-made dataset against four comparative models; in particular, with $6\times 4$ training samples per class, CLFormer achieves the highest average accuracy of 83.58% for signal-to-noise ratios (SNRs) from −8 to 8 dB across three types of noise. As the first attempt to use a Transformer for fault diagnosis of rotating machinery, this work provides a feasible strategy for fault-diagnosis research aimed at practical deployment.
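The abstract names the two architectural changes (a convolutional embedding module and a linear form of self-attention) but gives no implementation details. The sketch below is a minimal, hypothetical PyTorch illustration of those two ideas, not the authors' CLFormer: the layer sizes, the ELU+1 feature map, and all module names are assumptions chosen for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvEmbedding(nn.Module):
    """Strided 1-D convolution that turns a raw vibration signal into a token sequence."""
    def __init__(self, in_channels=1, embed_dim=32, kernel_size=16, stride=8):
        super().__init__()
        self.proj = nn.Conv1d(in_channels, embed_dim, kernel_size, stride=stride)

    def forward(self, x):              # x: (batch, channels, signal_length)
        x = self.proj(x)               # (batch, embed_dim, num_tokens)
        return x.transpose(1, 2)       # (batch, num_tokens, embed_dim)

class LinearSelfAttention(nn.Module):
    """Kernel-feature-map attention: O(N) in sequence length N,
    versus the O(N^2) softmax attention of the standard Transformer."""
    def __init__(self, dim):
        super().__init__()
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        self.out = nn.Linear(dim, dim)

    def forward(self, x):                                   # x: (batch, N, dim)
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        q, k = F.elu(q) + 1, F.elu(k) + 1                   # positive feature maps (assumption)
        kv = torch.einsum('bnd,bne->bde', k, v)             # sum_j phi(k_j) v_j^T
        norm = torch.einsum('bnd,bd->bn', q, k.sum(dim=1))  # phi(q_i)^T sum_j phi(k_j)
        attn = torch.einsum('bnd,bde->bne', q, kv) / (norm.unsqueeze(-1) + 1e-6)
        return self.out(attn)

# Toy usage: a batch of 1024-point signals, 10 hypothetical fault classes.
if __name__ == "__main__":
    signal = torch.randn(4, 1, 1024)
    tokens = ConvEmbedding()(signal)             # (4, 127, 32)
    feats = LinearSelfAttention(32)(tokens)      # (4, 127, 32)
    logits = nn.Linear(32, 10)(feats.mean(dim=1))
    print(logits.shape)                          # torch.Size([4, 10])
```

Factoring the attention through a kernel feature map avoids materializing the full token-by-token attention matrix, which is the general mechanism by which a linear self-attention keeps the compute budget small and is consistent with the low MFLOP and parameter counts reported in the abstract.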