Feature (linguistics)
Weighting
Particle swarm optimization
Support vector machine
Feature vector
Kernel (algebra)
Pattern recognition (psychology)
Generalization
Computer science
Artificial intelligence
Constraint (computer-aided design)
Mathematics
Algorithm
Philosophy
Mathematical analysis
Radiology
Combinatorics
Medicine
Linguistics
Geometry
Authors
Minghua Xie, Lili Xie, Peidong Zhu
Abstract
Support vector regression (SVR) is a powerful kernel-based method that has been successfully applied to regression problems. Feature-weighted SVR algorithms take the contribution of each input feature to the model output into account; however, model performance depends on the chosen feature weights, and training such models can be time-consuming. This paper proposes an efficient feature-weighted SVR. First, a value constraint for each weight is derived from the maximal information coefficient, which measures the relationship between each input feature and the output. Then, a constrained particle swarm optimization (PSO) algorithm is employed to optimize the feature weights and the hyperparameters simultaneously. Finally, the optimal weights are used to modify the kernel function. Simulation experiments were conducted on four synthetic datasets and seven real datasets with the proposed model, classical SVR, and several state-of-the-art feature-weighted SVR models. The results show that the proposed method achieves superior generalization ability within an acceptable training time.
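The sketch below illustrates the general idea described in the abstract: score each feature's relevance to the output, derive per-feature weight bounds from those scores, and let a constrained PSO search the weights and SVR hyperparameters jointly, with the weights applied to the kernel by rescaling the inputs. It is a minimal illustration, not the authors' exact method: `mutual_info_regression` stands in for the maximal information coefficient, and the weight-bound scheme, PSO settings, and dataset are illustrative assumptions.

```python
# Minimal sketch of MIC-constrained, feature-weighted SVR tuned by PSO.
# Assumptions: mutual_info_regression replaces the maximal information
# coefficient; the bound scheme and PSO constants are illustrative only.
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=300, n_features=8, noise=0.5, random_state=0)
X = StandardScaler().fit_transform(X)
n_feat = X.shape[1]

# Step 1: relevance of each input feature to the output (MIC stand-in).
relevance = mutual_info_regression(X, y, random_state=0)
relevance = relevance / relevance.max()            # scale to [0, 1]

# Step 2: per-feature weight bounds derived from the relevance scores
# (hypothetical constraint: more relevant features get a higher floor).
w_lo, w_hi = 0.5 * relevance, np.ones_like(relevance)

# Particle layout: [w_1 .. w_d, log10(C), log10(gamma)].
lo = np.concatenate([w_lo, [-1.0, -3.0]])
hi = np.concatenate([w_hi, [3.0, 1.0]])

def fitness(p):
    """Cross-validated MSE of SVR on feature-weighted inputs."""
    w, C, gamma = p[:n_feat], 10 ** p[-2], 10 ** p[-1]
    # Scaling columns by w before an RBF kernel is equivalent to using a
    # feature-weighted RBF kernel with weights w**2.
    scores = cross_val_score(SVR(C=C, gamma=gamma), X * w, y,
                             cv=3, scoring="neg_mean_squared_error")
    return -scores.mean()

# Step 3: a plain PSO, with clamping to enforce the weight constraints.
n_particles, n_iter = 20, 30
pos = rng.uniform(lo, hi, size=(n_particles, lo.size))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)               # respect the bounds
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

# Step 4: train the final model with the optimal weights and hyperparameters.
w_opt, C_opt, g_opt = gbest[:n_feat], 10 ** gbest[-2], 10 ** gbest[-1]
final_model = SVR(C=C_opt, gamma=g_opt).fit(X * w_opt, y)
print("optimal feature weights:", np.round(w_opt, 3))
```

Rescaling the inputs rather than writing a custom kernel keeps the example short; for an RBF kernel the two are mathematically equivalent, which is why the weighted kernel can be realized this way.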