Outlier
Robustness (evolution)
Norm (philosophy)
Mathematics
Eigenvector
Pattern recognition (psychology)
Artificial intelligence
Classifier (UML)
Mahalanobis distance
Support vector machine
Computer science
Shrinkage
Mean squared error
Algorithm
Mathematical optimization
Statistics
Chemistry
Law
Physics
Gene
Quantum mechanics
Biochemistry
Political science
Authors
He Yan, Liyong Fu, Tian'an Zhang, Jun Hu, Qiaolin Ye, Yong Qi, Dong-Jun Yu
Identifier
DOI:10.1016/j.patcog.2022.108779
Abstract
Proximal support vector machine via generalized eigenvalues (GEPSVM) is one of the most successful methods for classification problems. However, GEPSVM is vulnerable to outliers, since it learns classifiers based on the squared L2-norm distance without a specific strategy for dealing with outliers. Motivated by existing studies that improve the robustness of GEPSVM via the L1-norm distance or the not-squared L2-norm distance, a novel GEPSVM formulation that minimizes the p-th order of the L2-norm distance is proposed, namely L2,p-GEPSVM. This formulation weakens the negative effects of both light and heavy outliers in the data. An iterative algorithm is designed to solve the general L2,p-norm distance minimization problem, and its convergence is rigorously proven. In addition, the parameters of L2,p-GEPSVM are adjusted to balance accuracy against training time, which is especially useful for larger datasets. Extensive results indicate that L2,p-GEPSVM improves classification performance and robustness in various experimental settings.
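The abstract describes an iterative algorithm for the L2,p-norm distance minimization behind each proximal plane. The following is a minimal sketch of one plausible such scheme, assuming the standard GEPSVM generalized-eigenvalue form and an iteratively reweighted update with hypothetical weights d_i proportional to r_i^(p-2); it is an illustration of the general idea, not the authors' reference implementation, and the function name, regularization, and stopping rule are this sketch's own choices.

```python
import numpy as np
from scipy.linalg import eigh

def l2p_gepsvm_plane(A, B, p=1.0, delta=1e-4, n_iter=30):
    """Fit one proximal plane w.x + b = 0 close to rows of A, far from rows of B.

    Sketch of an iteratively reweighted generalized-eigenvalue solver for an
    L2,p-style objective (hypothetical reconstruction, not the paper's code).
    Returns the augmented coefficient vector z = [w; b], normalized to unit length.
    """
    n = A.shape[1]
    Ea = np.hstack([A, np.ones((A.shape[0], 1))])  # augment with bias column
    Eb = np.hstack([B, np.ones((B.shape[0], 1))])
    z = np.ones(n + 1) / np.sqrt(n + 1)            # arbitrary initial guess
    for _ in range(n_iter):
        # distances |w.x_i + b| of each point to the current plane,
        # offset by delta so the p-2 power stays finite for p < 2
        ra = np.abs(Ea @ z) + delta
        rb = np.abs(Eb @ z) + delta
        # reweighting d_i = r_i^(p-2) turns sum_i r_i^p into weighted squares
        da = ra ** (p - 2)
        db = rb ** (p - 2)
        G = Ea.T @ (da[:, None] * Ea) + delta * np.eye(n + 1)
        H = Eb.T @ (db[:, None] * Eb) + delta * np.eye(n + 1)
        # minimize the weighted Rayleigh quotient z'Gz / z'Hz:
        # smallest generalized eigenvector of G z = lambda H z
        _, vecs = eigh(G, H)
        z = vecs[:, 0]
        z /= np.linalg.norm(z)
    return z
```

A binary classifier built this way fits one such plane per class and assigns a test point to the class whose plane it lies nearest to; with p = 2 the weights are constant and the iteration reduces to ordinary GEPSVM, while smaller p down-weights points far from the plane, which is the hedge against outliers the abstract refers to.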