Keywords
Transformer, Computer science, Encoder, Vibration, Feature learning, Artificial intelligence, Electronic engineering, Engineering, Voltage, Electrical engineering, Acoustics, Operating system, Physics
Authors
Yifei Ding, Minping Jia, Qiuhua Miao, Yudong Cao
Identifier
DOI:10.1016/j.ymssp.2021.108616
Abstract
The scope of data-driven fault diagnosis models is greatly extended through deep learning (DL). However, the classical convolutional and recurrent structures have defects in computational efficiency and feature representation, while the latest Transformer architecture based on the attention mechanism has not yet been applied in this field. To solve these problems, we propose a novel time-frequency Transformer (TFT) model inspired by the massive success of the vanilla Transformer in sequence processing. Specifically, we design a fresh tokenizer and encoder module to extract effective abstractions from the time-frequency representation (TFR) of vibration signals. On this basis, a new end-to-end fault diagnosis framework based on the time-frequency Transformer is presented in this paper. Through case studies on bearing experimental datasets, we construct the optimal Transformer structure and verify its fault diagnosis performance. The superiority of the proposed method is demonstrated in comparison with benchmark models and other state-of-the-art methods.
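The abstract describes a pipeline: compute a time-frequency representation (TFR) of a vibration signal, tokenize it into patch embeddings, and pass the tokens through an attention-based encoder before classification. The sketch below illustrates that flow in NumPy. It is a minimal toy illustration of the general idea, not the authors' TFT implementation; the STFT parameters, patch size, random projections, and the 4-class head are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stft_magnitude(x, win=64, hop=32):
    # Naive STFT magnitude: a simple time-frequency representation (TFR).
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T  # (freq, time)

def tokenize_tfr(tfr, patch=(8, 8), d_model=32):
    # Split the TFR into non-overlapping patches and linearly project
    # each patch to a d_model-dimensional token (random weights here).
    f, t = tfr.shape
    fp, tp = patch
    patches = [tfr[i:i + fp, j:j + tp].ravel()
               for i in range(0, f - fp + 1, fp)
               for j in range(0, t - tp + 1, tp)]
    P = np.stack(patches)                        # (n_tokens, fp*tp)
    W = rng.standard_normal((P.shape[1], d_model)) * 0.02
    return P @ W                                 # (n_tokens, d_model)

def self_attention(tokens):
    # Single-head scaled dot-product self-attention over the token set
    # (identity Q/K/V projections, for brevity).
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ tokens

# Toy "vibration" signal: a sinusoid plus noise standing in for bearing data.
signal = (np.sin(2 * np.pi * 50 * np.linspace(0, 1, 1024))
          + 0.1 * rng.standard_normal(1024))
tfr = stft_magnitude(signal)       # time-frequency representation
tokens = tokenize_tfr(tfr)         # patch tokens
encoded = self_attention(tokens)   # one encoder attention step
# Mean-pool tokens and score 4 hypothetical fault classes (random head).
logits = encoded.mean(axis=0) @ (rng.standard_normal((32, 4)) * 0.1)
probs = np.exp(logits) / np.exp(logits).sum()
print(tfr.shape, tokens.shape, probs.shape)
```

A real TFT would use learned projection weights, positional embeddings, multi-head attention with feed-forward sublayers, and a trained classification head; the structure of the data flow, however, matches what the abstract outlines.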