Support vector machine
Hyperparameter
Computer science
Kernel (algebra)
Machine learning
Artificial intelligence
Generalization
Ensemble learning
Radial basis function kernel
Relevance vector machine
Kernel method
Ensemble forecasting
Regression
Polynomial kernel
Least-squares support vector machine
Random forest
Mathematics
Statistics
Combinatorics
Mathematical analysis
Authors
Anderson Ara, Mateus Maia, Francisco Louzada, Samuel Macêdo
Identifiers
DOI: 10.1016/j.eswa.2022.117107
Abstract
A central objective of machine learning techniques is to reduce generalization error. For support vector models, the main challenges are choosing an appropriate kernel function and estimating its hyperparameters, procedures usually carried out through testing and tuning processes that demand substantial computation. Ensemble methods, in contrast, offer an effective way to combine several models into one with greater predictive capacity. In this paper, we propose a new ensemble method for support vector regression, namely regression random machines. The proposed method eliminates the need to select a single best kernel during tuning: it draws from a random mixture of kernel functions and combines the resulting models in a weighted bagging ensemble that accounts for the strength and agreement of the individual models. The results demonstrate good predictive performance, with lower generalization error than both the single and bagged versions of support vector models with different kernels. The usefulness of the proposed method is illustrated by simulation studies over eight artificial scenarios and twenty-seven real-world applications.
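The core idea described in the abstract — bagging support vector regressors whose kernels are drawn at random, then weighting the models by their individual strength — can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the kernel sampling probabilities and the strength weights are simplified here (inverse validation MSE and inverse out-of-bag MSE, respectively), and the dataset is synthetic.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=300, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

kernels = ["linear", "poly", "rbf", "sigmoid"]

# Step 1 (simplified): assign each kernel a sampling probability
# proportional to its accuracy on a held-out split (inverse MSE here).
X_fit, X_val, y_fit, y_val = train_test_split(X_train, y_train, random_state=1)
inv_mse = np.array([
    1.0 / mean_squared_error(y_val, SVR(kernel=k).fit(X_fit, y_fit).predict(X_val))
    for k in kernels
])
probs = inv_mse / inv_mse.sum()

# Step 2: bagging -- each bootstrap replicate gets a randomly drawn kernel,
# and each fitted model is weighted by its out-of-bag accuracy (a stand-in
# for the paper's strength/agreement weighting).
B = 25
n = len(X_train)
models, weights = [], []
for _ in range(B):
    idx = rng.integers(0, n, size=n)               # bootstrap sample
    oob = np.setdiff1d(np.arange(n), idx)          # out-of-bag indices
    kernel = rng.choice(kernels, p=probs)          # random kernel draw
    m = SVR(kernel=kernel).fit(X_train[idx], y_train[idx])
    oob_mse = mean_squared_error(y_train[oob], m.predict(X_train[oob]))
    models.append(m)
    weights.append(1.0 / oob_mse)

# Weighted average of the ensemble members' predictions.
weights = np.array(weights) / np.sum(weights)
y_pred = sum(w * m.predict(X_test) for w, m in zip(weights, models))
print("ensemble test MSE:", mean_squared_error(y_test, y_pred))
```

Because no single kernel is ever selected, the usual kernel-choice tuning loop disappears; kernels that validate poorly are simply drawn less often, and weak bootstrap models contribute less to the final weighted prediction.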