Support vector machine
Symbol
Margin (machine learning)
Computer science
Artificial intelligence
Machine learning
Algorithm
Mathematics
Arithmetic
Authors
Huajun Wang, Yuan-Hai Shao, Shenglong Zhou, Ce Zhang, Naihua Xiu
Identifier
DOI: 10.1109/TPAMI.2021.3092177
Abstract
Support vector machines (SVMs) have drawn wide attention over the last two decades due to their extensive applications, and a vast body of work has developed optimization algorithms to solve SVMs with various soft-margin losses. Setting this paper apart, we aim to solve an ideal soft-margin loss SVM: the $L_{0/1}$ soft-margin loss SVM (dubbed $L_{0/1}$-SVM). Many of the existing (non)convex soft-margin losses can be viewed as surrogates of the $L_{0/1}$ soft-margin loss. Despite its discrete nature, we manage to establish the optimality theory for the $L_{0/1}$-SVM, including the existence of optimal solutions and their relationship to P-stationary points. These results not only enable us to deliver a rigorous definition of $L_{0/1}$ support vectors but also allow us to define a working set. Integrating such a working set, a fast alternating direction method of multipliers (ADMM) is then proposed, whose limit point is a locally optimal solution to the $L_{0/1}$-SVM. Finally, numerical experiments demonstrate that our proposed method outperforms some leading classification solvers from the SVM community in terms of computational speed and number of support vectors, and its advantage grows more evident as the data size increases.
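For orientation, the $L_{0/1}$ soft-margin loss counts each margin violation once rather than penalizing it by its magnitude, which is what makes the problem discrete. A minimal sketch of the model the abstract describes, assuming the standard soft-margin formulation with penalty parameter $C$ (the paper's exact notation may differ):

$$\min_{w,\,b}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{m} \ell_{0/1}\big(1 - y_i(w^\top x_i + b)\big), \qquad \ell_{0/1}(t)=\begin{cases}1, & t>0,\\ 0, & t\le 0.\end{cases}$$

The snippet below is a small NumPy sketch (not the paper's ADMM solver) that contrasts the $L_{0/1}$ loss with the hinge loss, one of the convex surrogates the abstract alludes to, evaluated on margin values $t = 1 - y_i(w^\top x_i + b)$:

# Hypothetical illustration only; function names are our own.
import numpy as np

def l01_loss(t):
    # L_{0/1}: counts each margin violation once, regardless of magnitude.
    return np.where(t > 0, 1.0, 0.0)

def hinge_loss(t):
    # Convex surrogate: penalizes violations linearly in their magnitude.
    return np.maximum(t, 0.0)

t = np.array([-0.5, 0.0, 0.3, 2.0])
print(l01_loss(t))    # [0. 0. 1. 1.]
print(hinge_loss(t))  # [0.  0.  0.3 2. ]

The comparison shows why the hinge loss is a surrogate: it upper-bounds the step function on $t \le 1$ and is convex, whereas $\ell_{0/1}$ is exactly the misclassification count but is discontinuous, which is why the paper develops a dedicated optimality theory and solver for it.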