Fiducial marker
Minimum mean square error
Entropy (arrow of time)
Mathematics
Robustness (evolution)
Algorithm
Computer science
Mean squared error
Gaussian distribution
Fixed point
Artificial intelligence
Mathematical optimization
Statistics
Estimator
Quantum mechanics
Biochemistry
Gene
Physics
Mathematical analysis
Chemistry
Authors
Yuqing Xie,Yingsong Li,Yuantao Gu,Jiuwen Cao,Badong Chen
Identifier
DOI:10.1109/tsp.2020.3001404
Abstract
Compared with traditional learning criteria, such as minimum mean square error (MMSE), the minimum error entropy (MEE) criterion has received increasing attention in the domains of nonlinear and non-Gaussian signal processing and machine learning. Since the MEE criterion is shift-invariant, a bias must be added to achieve zero-mean error over training datasets. Thus, a modification of MEE called minimization of error entropy with fiducial points (MEEF) was proposed, which controls the bias of MEE in a more elegant and efficient way. In the present paper, we propose a fixed-point minimization of error entropy with fiducial points (MEEF-FP) as an alternative to the gradient-based MEEF for training a linear-in-parameters (LIP) model, owing to its fast convergence, robustness, and step-size-free operation. We also provide a sufficient condition that guarantees the convergence of the MEEF-FP algorithm. Moreover, we develop a recursive MEEF-FP (RMEEF-FP) for online adaptive learning with low complexity. Finally, illustrative examples are presented to show the excellent performance of the new methods.
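To make the idea concrete, the following is a minimal sketch of a fixed-point MEEF iteration for an LIP model. It is reconstructed from the standard MEEF formulation (pairwise Gaussian kernel over error differences, with a zero-error fiducial sample appended), not from the paper's own derivation; the Gaussian kernel, the bandwidth `sigma`, and the least-squares initialization are illustrative assumptions.

```python
import numpy as np

def meef_fp(X, y, sigma=1.0, n_iter=50, tol=1e-8):
    """Sketch of a fixed-point MEEF solver for a linear model y ~ X @ w.

    The MEEF cost is (up to a constant) the sum of Gaussian kernels
    G_sigma(e_i - e_j) over all error pairs, where the sample set is
    augmented with a fiducial point (x_0 = 0, y_0 = 0) so that e_0 = 0.
    Setting the gradient to zero yields the fixed-point update
        w <- A^{-1} b,
    with A = sum_ij K_ij (x_i - x_j)(x_i - x_j)^T and
         b = sum_ij K_ij (y_i - y_j)(x_i - x_j),
    where K_ij = G_sigma(e_i - e_j) is evaluated at the current w.
    """
    # Augment with the fiducial point so the zero error is anchored.
    Xa = np.vstack([np.zeros((1, X.shape[1])), X])
    ya = np.concatenate([[0.0], y])

    # Precompute pairwise input and target differences.
    dX = Xa[:, None, :] - Xa[None, :, :]   # (n, n, d)
    dy = ya[:, None] - ya[None, :]         # (n, n)

    # Ordinary least-squares initialization (an assumption, not from the paper).
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        e = ya - Xa @ w
        u = e[:, None] - e[None, :]                 # pairwise error differences
        K = np.exp(-u**2 / (2.0 * sigma**2))        # Gaussian kernel weights
        A = np.einsum('ij,ijk,ijl->kl', K, dX, dX)  # weighted scatter matrix
        b = np.einsum('ij,ij,ijk->k', K, dy, dX)    # weighted cross term
        w_new = np.linalg.solve(A, b)
        if np.linalg.norm(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    return w
```

Because the kernel weights K_ij shrink toward zero for pairs involving large (outlier) errors, the update naturally downweights contaminated samples, which is the source of the robustness the abstract mentions; the iteration also involves no step size, unlike a gradient-based MEEF update.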