Perspective (graphics)
Wavelet
Sample (material)
Scattering
Acoustics
Structural engineering
Computer science
Engineering
Artificial intelligence
Optics
Physics
Thermodynamics
Authors
Li Wang, Wentao Mao, Yanna Zhang, Panpan Zeng, Zhidan Zhong
Identifier
DOI: 10.1177/1748006x241272827
Abstract
As a critical issue in prognostics and health management (PHM), health indicator (HI) construction aims to describe the degradation process of bearings and provides essential domain-knowledge support for early fault detection and remaining useful life prediction. In recent years, various deep neural networks with end-to-end modeling capability have been successfully applied to HI construction for rolling bearings. In small-sample environments, however, deep learning techniques may fail to extract degradation features well, which can leave the obtained HI sequence with insufficient trend and monotonicity characteristics. To address this concern, this paper proposes an HI construction method based on the wavelet scattering network (WSN) and makes an empirical evaluation from a frequency perspective. First, degradation features in different frequency bands are extracted from vibration signals using the WSN, expanding the feature space across different scales and orientations. Second, the frequency band with the optimal scale and orientation parameters is selected by calculating the dynamic time warping (DTW) distance between the feature sequence of each frequency band and the root mean square (RMS) sequence. With the feature subset from the selected frequency band, the HI sequence is built by means of principal component analysis (PCA). Experimental results on the IEEE PHM Challenge 2012 bearing dataset show that the proposed method works well with only a small amount of bearing whole-life data, yielding HI sequences with high monotonicity and correlation characteristics. More interestingly, the critical frequency band whose information decisively supports the HI construction can be identified, which improves interpretability in the frequency sense and enhances the credibility of the obtained HI sequence.
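The band-selection and fusion steps described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the wavelet scattering features are assumed to be precomputed elsewhere (per frequency band, one feature matrix over time), the function names `dtw_distance` and `build_hi` are hypothetical, and DTW is implemented in its classic quadratic form for clarity.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Accumulate cost along the cheapest warping path
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def build_hi(band_features, rms):
    """Select the band whose mean feature sequence best tracks the RMS trend
    (smallest DTW distance), then fuse that band's features into a 1-D HI
    via PCA (first principal component).

    band_features: dict mapping band id -> (T, d) feature array over time
    rms: (T,) RMS sequence of the same vibration signal
    """
    best_band = min(
        band_features,
        key=lambda k: dtw_distance(band_features[k].mean(axis=1), rms),
    )
    X = band_features[best_band]
    # PCA via SVD of the centered feature matrix; project onto the first PC
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    hi = Xc @ Vt[0]
    return best_band, hi
```

A quick synthetic check: feed in one band whose features follow a degradation-like trend and one band of pure noise; the trending band should be selected and a length-T HI returned.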