Computer science
Machine learning
Support vector machine
Sampling (signal processing)
Artificial intelligence
Data mining
Data set
Bayesian probability
Regularization
Computer vision
Filter (signal processing)
Authors
Ping Zhang,Yiqiao Jia,Youlin Shang
Identifier
DOI:10.1177/15501329221106935
Abstract
As a new and efficient ensemble learning algorithm, XGBoost has been widely applied for its many advantages, but its classification performance is often unsatisfactory when the data are imbalanced. To address this problem, the regularization term of XGBoost is optimized, and a classification algorithm based on mixed sampling and ensemble learning is proposed. The main idea is to combine SVM-SMOTE over-sampling and EasyEnsemble under-sampling for data processing, and then obtain the final XGBoost-based model by training and ensembling. At the same time, the optimal parameters are automatically searched and tuned through a Bayesian optimization algorithm to realize classification prediction. In the experimental stage, the G-mean and area under the curve (AUC) values are used as evaluation metrics to compare and analyze the classification performance of different sampling methods and algorithm models. The experimental results on the public data set also verify the feasibility and effectiveness of the proposed algorithm.
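The pipeline described in the abstract (SVM-SMOTE over-sampling, EasyEnsemble-style under-sampling, an XGBoost model, and Bayesian hyperparameter search evaluated by G-mean and AUC) can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the synthetic data, the search space, the single round of random under-sampling in place of the full EasyEnsemble procedure, and the use of Optuna's TPE sampler as the Bayesian optimizer are all assumptions.

```python
# Minimal sketch (not the authors' implementation): SVM-SMOTE over-sampling
# combined with random under-sampling, an XGBoost classifier, and a TPE-based
# hyperparameter search scored by AUC, with G-mean and AUC reported at the end.
import optuna
import xgboost as xgb
from imblearn.metrics import geometric_mean_score
from imblearn.over_sampling import SVMSMOTE
from imblearn.under_sampling import RandomUnderSampler
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split

# Imbalanced toy data (roughly 9:1 majority-to-minority), purely illustrative.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Mixed sampling: SVM-SMOTE synthesizes minority samples near the SVM decision
# boundary, then random under-sampling trims the majority class (the paper's
# EasyEnsemble step would instead repeat this over several majority subsets
# and ensemble the resulting models).
X_res, y_res = SVMSMOTE(random_state=0).fit_resample(X_tr, y_tr)
X_res, y_res = RandomUnderSampler(random_state=0).fit_resample(X_res, y_res)

def objective(trial):
    # Hypothetical search space; the paper's actual ranges are not given here.
    params = {
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 100, 500),
        "reg_lambda": trial.suggest_float("reg_lambda", 0.1, 10.0, log=True),
    }
    model = xgb.XGBClassifier(eval_metric="logloss", **params)
    # Cross-validated AUC on the resampled training data guides the search.
    return cross_val_score(model, X_res, y_res, scoring="roc_auc", cv=3).mean()

# Optuna's default TPE sampler stands in for the Bayesian optimization
# algorithm mentioned in the abstract.
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)

best = xgb.XGBClassifier(eval_metric="logloss", **study.best_params)
best.fit(X_res, y_res)
print("G-mean:", geometric_mean_score(y_te, best.predict(X_te)))
print("AUC   :", roc_auc_score(y_te, best.predict_proba(X_te)[:, 1]))
```

Replacing the single under-sampling pass with several independently under-sampled majority subsets, each paired with its own XGBoost model and then averaged, would be closer to the EasyEnsemble idea described in the abstract.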