Stochastic Resonance
Authors
Cailiang Zhang, Zhihui Lai, Zhisheng Tu, Xu-Yun Hua, Yong Chen, Ronghua Zhu
Identifier
DOI:10.1016/j.apacoust.2023.109753
Abstract
Weak-signal detection methods based on stochastic resonance (SR) have been extensively studied for their ability to exploit noise energy to enhance weak signals. Among the various SR models, second-order tri-stable SR models have demonstrated superior output performance for weak-signal detection. To optimize the output of second-order tri-stable systems, a variety of methods have been proposed to optimize the system parameters. However, these methods typically require optimizing multiple parameters simultaneously, which increases computational cost and reduces the real-time efficiency of signal processing; such multi-parameter optimization cannot meet the demands of timely signal processing in big-data contexts. To address this challenge, this paper proposes two single-parameter-adjusting SR models, which attain ideal output performance by adjusting a single parameter of the SR system. Spectral amplification is derived as an indicator to quantitatively analyze the effects of the proposed models on the SR output. On this basis, the influence of the proposed models on the SR output under different potential-well parameters, noise intensities, signal frequencies, and damping ratios is fully investigated through numerical simulations. Finally, the proposed models are applied to an experimental signal containing a weak fault feature, and the results verify their feasibility for optimizing the SR output. These findings can guide the design of tri-stable SR models and support the application of SR-based signal processing in big-data contexts.
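As a rough illustration of the class of model the abstract describes, the sketch below numerically integrates a second-order tri-stable SR system of the generic form x'' = -γx' - U'(x) + A·cos(2πft) + √(2D)·ξ(t) and estimates spectral amplification from the output spectrum. The potential U(x) = a·x²/2 - b·x⁴/4 + c·x⁶/6, all parameter values, and the spectral-amplification estimator are illustrative assumptions for a standard tri-stable formulation, not the paper's exact models or parameters.

```python
import numpy as np

# Minimal sketch of a second-order tri-stable SR simulation (Euler-Maruyama).
# Potential form, parameters, and the spectral-amplification estimate are
# illustrative assumptions, not the paper's exact formulation.

def u_prime(x, a=0.5, b=2.0, c=1.0):
    """Gradient of an assumed tri-stable potential U(x) = a*x^2/2 - b*x^4/4 + c*x^6/6.

    With a=0.5, b=2, c=1 this has stable wells at x = 0 and x ~ +/-1.31.
    """
    return a * x - b * x**3 + c * x**5

def simulate_sr(A=0.1, f=0.01, D=0.3, gamma=0.5, dt=0.05, n_steps=200_000, seed=0):
    """Integrate x'' = -gamma*x' - U'(x) + A*cos(2*pi*f*t) + sqrt(2D)*xi(t)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_steps)
    v = 0.0
    t = np.arange(n_steps) * dt
    # Euler-Maruyama noise increments: sqrt(2*D*dt) * N(0, 1)
    noise = rng.standard_normal(n_steps) * np.sqrt(2.0 * D * dt)
    for i in range(1, n_steps):
        drive = A * np.cos(2.0 * np.pi * f * t[i - 1])
        v += dt * (-gamma * v - u_prime(x[i - 1]) + drive) + noise[i - 1]
        x[i] = x[i - 1] + dt * v  # semi-implicit Euler step for position
    return t, x

def spectral_amplification(t, x, A, f):
    """Ratio of output to input power at the drive frequency f."""
    dt = t[1] - t[0]
    spec = np.fft.rfft(x - x.mean())
    freqs = np.fft.rfftfreq(len(x), dt)
    k = np.argmin(np.abs(freqs - f))          # bin nearest the drive frequency
    out_amp = 2.0 * np.abs(spec[k]) / len(x)  # single-sided amplitude at f
    return (out_amp / A) ** 2

t, x = simulate_sr()
print("spectral amplification ~", spectral_amplification(t, x, A=0.1, f=0.01))
```

In this kind of setup, sweeping a single knob (for example the noise intensity D or the damping ratio γ while the other parameters stay fixed) and tracking spectral amplification is one simple way to see the single-parameter-adjustment idea in action: the indicator typically peaks at an intermediate value rather than growing monotonically.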