Feature selection
Computer science
Dimensionality reduction
Pattern recognition (psychology)
Artificial intelligence
Curse of dimensionality
Feature (linguistics)
Statistical classification
Stability (learning theory)
Data mining
Random forest
k-nearest neighbors algorithm
Feature extraction
Algorithm
Selection (genetic algorithm)
Machine learning
Philosophy
Linguistics
Authors
Zhi Wang, Yan Zhang, Zhichao Chen, Huan Yang, Yaxin Sun, Jianmin Kang, Yong Yang, Xiaojun Liang
Identifier
DOI:10.1109/igarss.2016.7729190
Abstract
In classification, a large number of features often makes it difficult to select appropriate classification features. In such situations, feature selection or dimensionality reduction methods play an important role. ReliefF is one of the most successful filter-based feature selection methods. This paper addresses some shortcomings of the ReliefF algorithm: to counter the poor stability caused by the random selection of neighbor samples, feature weights are averaged over multiple random selections, which reduces the volatility of the algorithm; in addition, a redundancy analysis step is added to eliminate redundant features. Experimental results show that the improved ReliefF algorithm can effectively construct classification feature sets and achieve better classification accuracy.
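The abstract does not give implementation details, but a minimal Python sketch of the two ideas it describes (averaging ReliefF weights over several random sampling runs to stabilize the ranking, then pruning features that are highly correlated with a better-ranked one) might look like the following. The function names, the Manhattan distance, the correlation threshold `corr_thresh`, and all parameter defaults are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: not the authors' implementation of the improved ReliefF.
import numpy as np

def relieff_weights(X, y, n_samples=100, k=10, rng=None):
    """One ReliefF pass: score features by how well they separate
    near-hits (same class) from near-misses (other classes)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)          # scale so feature diffs are comparable
    span[span == 0] = 1.0
    classes, counts = np.unique(y, return_counts=True)
    priors = dict(zip(classes, counts / n))
    w = np.zeros(d)
    for i in rng.choice(n, size=min(n_samples, n), replace=False):
        xi, ci = X[i], y[i]
        dist = np.abs((X - xi) / span).sum(axis=1)  # Manhattan distance (assumption)
        dist[i] = np.inf
        # k nearest hits: penalize features that differ within the same class
        hit_idx = np.where(y == ci)[0]
        hits = hit_idx[np.argsort(dist[hit_idx])[:k]]
        w -= np.abs((X[hits] - xi) / span).mean(axis=0) / n_samples
        # k nearest misses from each other class, weighted by class prior
        for c in classes:
            if c == ci:
                continue
            miss_idx = np.where(y == c)[0]
            misses = miss_idx[np.argsort(dist[miss_idx])[:k]]
            coef = priors[c] / (1.0 - priors[ci])
            w += coef * np.abs((X[misses] - xi) / span).mean(axis=0) / n_samples
    return w

def averaged_relieff(X, y, n_runs=10, corr_thresh=0.9, **kw):
    """Average weights over several random ReliefF runs, then drop any
    feature highly correlated with a better-ranked kept feature."""
    w = np.mean([relieff_weights(X, y, rng=r, **kw) for r in range(n_runs)], axis=0)
    order = np.argsort(w)[::-1]                    # best features first
    corr = np.abs(np.corrcoef(X, rowvar=False))    # pairwise feature correlations
    keep = []
    for f in order:
        if all(corr[f, g] < corr_thresh for g in keep):
            keep.append(f)
    return w, keep
```

As a usage sketch, `averaged_relieff(X, y)` returns the averaged weights and the indices of the retained (non-redundant) features in ranking order; the correlation-based pruning stands in for the paper's redundancy analysis, whose exact criterion is not stated in the abstract.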