Computer science
Artificial intelligence
Random forest
Machine learning
Metric (unit)
AdaBoost
Cascade
Deep learning
Algorithm
Support vector machine
Operations management
Chromatography
Economics
Chemistry
Authors
Lev V. Utkin,Alexander A. Konstantinov,Viacheslav S. Chukanov,Anna A. Meldo
Identifier
DOI:10.1142/s0219622020500236
Abstract
A new adaptive weighted deep forest algorithm, which can be viewed as a modification of the confidence screening mechanism, is proposed. The main idea underlying the algorithm is the adaptive weighting of every training instance at each cascade level of the deep forest. The confidence screening mechanism for the deep forest proposed by Pang et al. strictly removes instances from the training and testing processes, in accordance with the obtained random forest class probability distributions, in order to simplify the whole algorithm. This strict removal may leave a very small number of training instances at the later levels of the deep forest cascade. The presented modification is more flexible: it assigns weights to instances in order to differentiate their use in building decision trees at every level of the deep forest cascade, which overcomes the main disadvantage of the confidence screening mechanism. The proposed modification is similar to the AdaBoost algorithm to some extent. Numerical experiments illustrate that the proposed modification outperforms the original deep forest. It is also shown how the proposed algorithm can be extended to solve transfer learning and distance metric learning problems.
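The abstract describes the weighting idea only at a high level. Below is a minimal, hypothetical Python sketch of how per-instance weights could replace the strict removal used by confidence screening at a single cascade level; the function name `weighted_cascade_level` and the exponential weight-update rule are illustrative assumptions, not the authors' exact formulas.

```python
# Illustrative sketch (assumed details, not the paper's exact algorithm):
# instead of removing high-confidence instances, every instance keeps a
# weight that reflects how confidently the previous cascade level classified it.
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def weighted_cascade_level(X, y, weights, n_forests=2):
    """One cascade level with per-instance weights (hypothetical update rule)."""
    classes = np.unique(y)
    y_idx = np.searchsorted(classes, y)          # map labels to probability columns
    prob_features = []
    avg_prob = np.zeros((X.shape[0], classes.size))
    for k in range(n_forests):
        rf = RandomForestClassifier(n_estimators=100, random_state=k)
        rf.fit(X, y, sample_weight=weights)      # weights steer tree building
        probs = rf.predict_proba(X)              # columns follow sorted classes
        prob_features.append(probs)
        avg_prob += probs / n_forests

    # Confidence of the correct class for each training instance.
    conf = avg_prob[np.arange(len(y)), y_idx]

    # AdaBoost-like update (assumed form): emphasize instances the level is
    # unsure about, shrink the influence of confidently classified ones.
    new_weights = weights * np.exp(1.0 - conf)
    new_weights /= new_weights.sum()

    # As in the deep forest cascade, class-probability vectors are appended
    # to the original features and passed to the next level.
    X_next = np.hstack([X] + prob_features)
    return X_next, new_weights


# Usage sketch: iterate levels, re-weighting instead of discarding instances.
# X, y = ...                                  # training data
# w = np.full(len(y), 1.0 / len(y))
# for level in range(3):
#     X, w = weighted_cascade_level(X, y, w)
```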