Support vector machine
Computer science
Symbol
Computational complexity theory
Artificial intelligence
Classifier (UML)
Bottleneck
Mathematics
Machine learning
Convergence (economics)
Algorithm
Arithmetic
Economics
Embedded system
Economic growth
Authors
Huajun Wang, Zhibin Zhu, Yuan-Hai Shao
Source
Journal: IEEE Transactions on Systems, Man, and Cybernetics
[Institute of Electrical and Electronics Engineers]
Date: 2024-03-27
Volume/Issue: 54 (7): 4151-4163
Cited by: 4
Identifier
DOI: 10.1109/tsmc.2024.3375021
Abstract
Support vector machine (SVM) is a popular supervised machine learning classifier that has found extensive application in many fields, including the biological sciences, disease detection, health and clinical sciences, and cancer classification. However, the major challenge faced by SVM is its high computational complexity, which becomes a bottleneck for large-scale problems. To reduce the computational complexity, we design a novel truncated squared loss function and use it to build a new SVM model ($L_{\mathrm{tsl}}$-SVM), which is challenging to solve because of its nonconvex and nonsmooth characteristics. To solve $L_{\mathrm{tsl}}$-SVM, we introduce the concept of a proximal stationary point and use it to establish the model's optimality theory. Building on this theory, we then develop a fast alternating direction method of multipliers with low computational complexity for $L_{\mathrm{tsl}}$-SVM and prove that the proposed algorithm achieves global convergence. Finally, numerical experiments verify the superior performance of the developed method in terms of classification accuracy, number of support vectors, and computational speed compared with eight other leading solvers. For instance, on a real dataset with more than $10^{7}$ samples, the developed method takes only 18.89 s, significantly outperforming the other solvers, which require at least 589.8 s.
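The abstract does not give the explicit form of the truncated squared loss, so the sketch below is only an illustration under an assumed form: a squared hinge loss capped at a constant. The function name and the cap value are hypothetical rather than taken from the paper; the point is that truncation bounds the penalty any single sample (e.g., a gross outlier) can contribute, which is what keeps the number of support vectors and the per-iteration cost low.

```python
import numpy as np

def truncated_squared_loss(margins, cap=1.0):
    """Illustrative truncated squared hinge-type loss (assumed form, not the paper's exact definition).

    For margin m = y * f(x), the usual squared hinge loss is max(0, 1 - m)^2;
    truncating it at `cap` bounds the contribution of any single sample.
    """
    squared_hinge = np.maximum(0.0, 1.0 - margins) ** 2
    return np.minimum(squared_hinge, cap)

# Toy illustration: a well-classified sample, a marginal sample, and a gross outlier.
margins = np.array([1.5, 0.3, -4.0])        # y * f(x) for three samples
print(truncated_squared_loss(margins))      # [0.   0.49 1.  ]  -> outlier capped at 1.0
print(np.maximum(0.0, 1.0 - margins) ** 2)  # [0.   0.49 25. ]  -> untruncated loss grows to 25.0
```

Under this assumed form, samples whose loss sits on the flat (capped) region exert no pull on the decision boundary, which is consistent with the paper's reported reduction in support vectors; the exact mechanism in $L_{\mathrm{tsl}}$-SVM is given in the paper itself.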