Keywords: weighting, model selection, support vector machine, estimator, computer science, hinge loss, covariate, cross-validation, classifier, machine learning, selection, learning theory (stability), multiple models, data mining, artificial intelligence, mathematics, mathematical optimization, algorithms, statistics, radiology, medicine
Authors
Jiahui Zou, Chaoxia Yuan, Xinyu Zhang, Guohua Zou, Alan T. K. Wan
Identifier
DOI:10.1007/s11222-023-10284-6
Abstract
Support vector classification (SVC) is a well-known statistical technique for classification problems in machine learning and other fields. An important question for SVC is the selection of covariates (or features) for the model. Many studies have considered model selection methods. As is well-known, selecting one winning model over others can entail considerable instability in predictive performance due to model selection uncertainties. This paper advocates model averaging as an alternative approach, where estimates obtained from different models are combined in a weighted average. We propose a model weighting scheme and provide the theoretical underpinning for the proposed method. In particular, we prove that our proposed method yields a model average estimator that achieves the smallest hinge risk among all feasible combinations asymptotically. To remedy the computational burden due to a large number of feasible models, we propose a screening step to eliminate the uninformative features before combining the models. Results from real data applications and a simulation study show that the proposed method generally yields more accurate estimates than existing methods.
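The idea in the abstract can be illustrated with a minimal sketch: fit a support vector classifier on each candidate covariate subset, then pick simplex weights that minimize the hold-out hinge loss of the weighted decision function. This is not the authors' exact estimator (their weighting scheme and screening step are more involved); the nested feature subsets and the hold-out split here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Synthetic binary classification data; labels recoded to {-1, +1} for the hinge loss.
X, y = make_classification(n_samples=400, n_features=6, n_informative=3,
                           random_state=0)
y_pm = 2 * y - 1
X_tr, X_val, y_tr, y_val = train_test_split(X, y_pm, test_size=0.5,
                                            random_state=0)

# Candidate models: nested covariate subsets (an illustrative choice, not the
# paper's candidate set). A screening step would prune this list first.
subsets = [list(range(k)) for k in range(1, X.shape[1] + 1)]
M = len(subsets)

# Hold-out decision values, one column per candidate model.
cols = []
for S in subsets:
    clf = LinearSVC(C=1.0, max_iter=10000).fit(X_tr[:, S], y_tr)
    cols.append(clf.decision_function(X_val[:, S]))
D = np.column_stack(cols)

def hinge_risk(w):
    """Mean hinge loss of the weighted decision function on the hold-out set."""
    f = D @ w
    return np.mean(np.maximum(0.0, 1.0 - y_val * f))

# Minimize the (convex) hold-out hinge risk over the probability simplex.
res = minimize(hinge_risk, np.full(M, 1.0 / M), bounds=[(0.0, 1.0)] * M,
               constraints=({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},))
w_hat = res.x
print("weights:", np.round(w_hat, 3))
print("averaged hinge risk:", round(hinge_risk(w_hat), 4))
```

Because each single model is a vertex of the simplex, the optimized average can never do worse (in hold-out hinge risk) than the best individual candidate, which mirrors the asymptotic optimality claim in the abstract.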