Discriminator
Computer science
Generator (circuit theory)
Distillation
Fault (geology)
Process (computing)
Generative adversarial network
Artificial intelligence
Algorithm
Computer engineering
Deep learning
Power (physics)
Programming language
Chemistry
Telecommunications
Physics
Organic chemistry
Quantum mechanics
Detector
Seismology
Geology
Authors
Hongyu Zhong, Samson S. Yu, Hieu Trinh, Rui Yuan, Yong Lv, Yanan Wang
Identifier
DOI:10.1088/1361-6501/ad0fd2
Abstract
Generative adversarial networks (GANs) have shown promise in the field of small-sample fault diagnosis. However, generating synthetic data with GANs is time-consuming, and synthetic data cannot fully replace real data. To expedite the GAN-based fault diagnosis process, this paper proposes a hybrid lightweight method for compressing GAN parameters. First, three modules are constructed based on the knowledge distillation GAN (KD-GAN) approach: a teacher generator, a teacher discriminator, and a student generator. The distillation operation is applied between the teacher generator and the student generator, while adversarial training is conducted between the teacher generator and the teacher discriminator. Furthermore, a joint loss function that combines the distillation loss and the adversarial loss is proposed to update the parameters of the student generator. Additionally, the proposed KD-GAN method is combined with deep transfer learning (DTL) and leverages real data to enhance the diagnostic model's performance. Two numerical experiments demonstrate that the proposed KD-GAN-DTL method outperforms other GAN-based fault diagnosis methods in terms of computational time and diagnostic accuracy.
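The joint loss described in the abstract can be sketched as a weighted sum of a distillation term (student generator output matched to teacher generator output) and an adversarial term. This is a minimal illustrative sketch only: the weight `alpha` and the specific loss forms (MSE distillation, non-saturating generator loss) are assumptions, not the paper's exact formulation.

```python
import math

def distillation_loss(student_out, teacher_out):
    # Assumed form: mean squared error between student and teacher
    # generator outputs on the same latent inputs.
    return sum((s - t) ** 2 for s, t in zip(student_out, teacher_out)) / len(student_out)

def adversarial_loss(disc_scores):
    # Assumed form: non-saturating generator loss -log D(G(z)),
    # where each discriminator score lies in (0, 1).
    return -sum(math.log(d) for d in disc_scores) / len(disc_scores)

def joint_loss(student_out, teacher_out, disc_scores, alpha=0.5):
    # Weighted combination used to update the student generator;
    # alpha balances distillation against adversarial feedback.
    return alpha * distillation_loss(student_out, teacher_out) \
         + (1 - alpha) * adversarial_loss(disc_scores)

# Example: student close to teacher, discriminator partially fooled.
loss = joint_loss([0.9, 1.1], [1.0, 1.0], [0.8, 0.9], alpha=0.7)
```

In a full training loop, gradients of this scalar with respect to the student generator's parameters would drive the update, while the teacher generator and discriminator are trained adversarially as usual.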