Entropy (arrow of time)
Algorithms
Computer science
Gaussian distribution
Mathematics
Gaussian function
Mathematical optimization
Artificial intelligence
Quantum mechanics
Physics
Authors
Jiacheng He, Gang Wang, Bei Peng, Qing Sun, Zhenyu Feng, Kun Zhang
Identifiers
DOI:10.1016/j.jfranklin.2021.12.015
Abstract
Error entropy is a well-known learning criterion in information theoretic learning (ITL), and it has been successfully applied in robust signal processing and machine learning. To date, many robust learning algorithms have been devised based on the minimum error entropy (MEE) criterion, and the Gaussian kernel is typically used as the default kernel function in these algorithms, which is not always the best option. To further improve learning performance, this paper proposes two concepts that use a mixture of two Gaussian functions as the kernel function, called mixture error entropy and mixture quantized error entropy. We further propose two new recursive least-squares algorithms based on the mixture minimum error entropy (MMEE) and mixture quantized minimum error entropy (MQMEE) optimization criteria. The convergence behavior, steady-state mean-square performance, and computational complexity of the two proposed algorithms are analyzed. In addition, we explain why the mixture mechanism (mixture correntropy and mixture error entropy) can improve the performance of adaptive filtering algorithms. Simulation results show that the proposed recursive least-squares algorithms outperform other RLS-type algorithms, and their practicality is verified in an electroencephalography application.
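The mixture-kernel idea in the abstract can be illustrated with a short sketch. In ITL, the (second-order) error entropy criterion is usually optimized via the empirical information potential V = (1/N²) Σᵢ Σⱼ κ(eᵢ − eⱼ), where κ is the kernel; minimizing error entropy corresponds to maximizing V. The sketch below replaces the single Gaussian kernel with a convex mixture of two Gaussians of different bandwidths. The parameter names (`sigma1`, `sigma2`, `alpha`) are illustrative assumptions, not the paper's notation, and this is a minimal estimator, not the proposed recursive algorithm.

```python
import numpy as np

def mixture_kernel(e, sigma1=1.0, sigma2=4.0, alpha=0.5):
    """Mixture of two Gaussian kernels: alpha*G_sigma1(e) + (1-alpha)*G_sigma2(e).

    A narrow kernel (sigma1) keeps sensitivity near zero error, while a wide
    kernel (sigma2) retains gradient information for large (outlier) errors.
    Parameter names are illustrative, not the paper's notation.
    """
    g1 = np.exp(-e**2 / (2.0 * sigma1**2))
    g2 = np.exp(-e**2 / (2.0 * sigma2**2))
    return alpha * g1 + (1.0 - alpha) * g2

def mixture_information_potential(errors, **kernel_params):
    """Empirical information potential V = (1/N^2) sum_ij kappa(e_i - e_j).

    Larger V corresponds to lower (quadratic) error entropy, so an MEE-type
    learner would adapt its weights to maximize this quantity.
    """
    e = np.asarray(errors, dtype=float)
    pairwise_diff = e[:, None] - e[None, :]   # N x N matrix of e_i - e_j
    return float(mixture_kernel(pairwise_diff, **kernel_params).mean())
```

Concentrated errors give V close to 1 (low entropy), while spread-out errors lower V; an adaptive filter trained under an MEE-type criterion moves its weights in the direction that increases V.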