Support vector machine
Robustness (evolution)
Outlier
Regular polygon
Generalization
Convex function
Mathematical optimization
Convex optimization
Mathematics
Proper convex function
Computer science
Convex combination
Robust regression
Algorithm
Artificial intelligence
Geometry
Chemistry
Mathematical analysis
Gene
Biochemistry
Identifier
DOI: 10.1080/10556788.2011.557725
Abstract
Classical support vector machines are constructed from convex loss functions. Recently, support vector machines with non-convex loss functions have attracted much attention for their superiority to the classical ones in generalization accuracy and robustness. In this paper, we propose a non-convex loss function to construct a robust support vector regression (SVR). The introduced non-convex loss function includes several truncated loss functions as special cases. The resulting optimization problem is a difference-of-convex-functions program. We employ the concave–convex procedure and develop a Newton-type algorithm to solve it, which both retains the sparseness of SVR and suppresses outliers in the training samples. Experiments on both synthetic and real-world benchmark data sets confirm the robustness and effectiveness of the proposed method.
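To make the difference-of-convex structure mentioned in the abstract concrete, the sketch below is our own illustration rather than the paper's formulation: the symbols t, ε, C and the feature map φ are assumptions, and the bias term is omitted. It shows how a truncated ε-insensitive loss splits into two convex terms, and how the concave–convex procedure linearizes the subtracted term so that each iteration reduces to a convex SVR-type problem.

```latex
% Minimal sketch (amsmath assumed); notation t, eps, C, phi is illustrative only.
% A truncated eps-insensitive loss written as a difference of two convex functions:
\[
  L_t(u) \;=\; \min\bigl(|u|_\varepsilon,\, t\bigr)
        \;=\; |u|_\varepsilon \;-\; \max\bigl(|u|_\varepsilon - t,\, 0\bigr),
  \qquad |u|_\varepsilon := \max\bigl(|u| - \varepsilon,\, 0\bigr).
\]
% The training objective is then a DC program, min_w f(w) - g(w), with
\[
  f(w) = \tfrac{1}{2}\lVert w\rVert^2
       + C\sum_{i} \bigl|y_i - \langle w, \phi(x_i)\rangle\bigr|_\varepsilon,
  \qquad
  g(w) = C\sum_{i} \max\Bigl(\bigl|y_i - \langle w, \phi(x_i)\rangle\bigr|_\varepsilon - t,\; 0\Bigr).
\]
% The concave-convex procedure replaces g by its linearization at the current
% iterate, so every step is a convex SVR-type subproblem:
\[
  w^{(k+1)} \in \operatorname*{arg\,min}_{w}\; f(w) - \bigl\langle v^{(k)},\, w \bigr\rangle,
  \qquad v^{(k)} \in \partial g\bigl(w^{(k)}\bigr).
\]
```

Because each subproblem keeps the ε-insensitive SVR form, the usual support-vector sparseness carries over, which is consistent with the sparseness-preserving and outlier-suppressing behaviour described in the abstract.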