Gradient boosting
Computer science
Boosting (machine learning)
Artificial intelligence
Machine learning
Computation
Random forest
Ensemble learning
Generalization
Algorithm
Scalability
Training set
Data mining
Support vector machine
Gradient method
Pattern recognition (psychology)
Decision tree
Generalization error
Statistical classification
Optics (focus)
Authors
Candice Bentéjac,Anna Csörgő,Gonzalo Martínez-Muñoz
Identifier
DOI:10.1007/s10462-020-09896-5
Abstract
XGBoost is a scalable ensemble technique based on gradient boosting that has proven to be a reliable and efficient machine learning challenge solver. This work proposes a practical analysis of how this novel technique works in terms of training speed, generalization performance and parameter setup. In addition, a comprehensive comparison between XGBoost, random forests and gradient boosting has been performed using carefully tuned models as well as the default settings. The results of this comparison may indicate that XGBoost is not necessarily the best choice under all circumstances. Finally, an extensive analysis of the XGBoost parameter tuning process is carried out.
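As a minimal sketch of the kind of default-settings comparison the abstract describes (not the authors' actual experimental code), the snippet below cross-validates scikit-learn's random forest and gradient boosting implementations on a small benchmark dataset; XGBoost's `xgboost.XGBClassifier` could be benchmarked the same way, and the dataset choice here is purely illustrative.

```python
# Hypothetical default-settings comparison in the spirit of the paper's study.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Both models use their library defaults, mirroring the "default settings" arm
# of the comparison; a tuned arm would grid-search depth, learning rate, etc.
models = {
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Mean 5-fold cross-validated accuracy per model.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

A tuned comparison, as in the paper, would wrap each model in a hyperparameter search (e.g. `GridSearchCV`) before scoring, so that default and tuned results can be contrasted.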