Topics
Crossover, Computer science, Differential evolution, Feature selection, Maxima and minima, Selection (genetic algorithm), Pattern recognition (psychology), Algorithm, Artificial intelligence, Key (lock), Evolutionary algorithm, Binary classification, Feature (linguistics), Data mining, Binary number, Support vector machine, Mathematics, Machine learning, Philosophy, Mathematical analysis, Arithmetic, Linguistics, Computer security
Authors
Omid Tarkhaneh, Thanh Thi Nguyen, Samaneh Mazaheri
Identifier
DOI:10.1016/j.ins.2021.02.061
Abstract
Classification problems typically involve a large number of features, but not all of them contribute to classification performance. These redundant features make classification time consuming and often degrade accuracy. Feature selection methods have been proposed to reduce the number of features, minimize computational cost, and maximize classification accuracy. As wrapper-based approaches, evolutionary algorithms have been widely applied to feature subset selection. However, some of them become trapped in local minima, especially as the number of features grows, while others are computationally inefficient. This paper proposes a Modified Differential Evolution (DE) approach to Feature Selection (MDEFS) that uses two new mutation strategies to strike a workable balance between exploration and exploitation and to keep classification performance within an acceptable range with respect to both the number of selected features and accuracy. The standard DE crossover and its key control parameters are also modified to further enhance the proposed algorithm's capabilities. The proposed method is compared with several state-of-the-art methods on standard datasets from the UCI repository, and the experimental results demonstrate the superiority of the proposed approach.
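The abstract describes a wrapper-based scheme: candidate feature subsets are evolved with DE mutation and crossover and scored by a classifier's accuracy together with the subset size. The paper's specific mutation strategies and parameter modifications are not reproduced here; the sketch below is a minimal, generic binary DE wrapper (standard DE/rand/1 mutation and binomial crossover, sigmoid binarization, SVM fitness on a UCI-style dataset), with the dataset and all parameter values chosen as illustrative assumptions rather than the paper's setup.

```python
# Minimal sketch of wrapper-based feature selection with a binary-coded
# differential evolution loop. This is NOT the paper's MDEFS algorithm:
# it uses standard DE/rand/1 mutation and binomial crossover, a sigmoid
# binarization, and an SVM fitness; dataset and parameters are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)    # stand-in for a UCI dataset
n_features = X.shape[1]
pop_size, generations, F, CR = 20, 30, 0.5, 0.9
alpha = 0.99                                  # weight of accuracy vs. subset size


def to_mask(vector):
    """Map a real-valued DE vector to a boolean feature mask."""
    return 1.0 / (1.0 + np.exp(-vector)) > 0.5


def fitness(mask):
    """Reward cross-validated SVM accuracy, penalize large feature subsets."""
    if not mask.any():
        return 0.0
    acc = cross_val_score(SVC(), X[:, mask], y, cv=3).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / n_features)


pop = rng.uniform(-1.0, 1.0, size=(pop_size, n_features))
scores = np.array([fitness(to_mask(ind)) for ind in pop])

for _ in range(generations):
    for i in range(pop_size):
        others = [j for j in range(pop_size) if j != i]
        a, b, c = pop[rng.choice(others, size=3, replace=False)]
        mutant = a + F * (b - c)                  # DE/rand/1 mutation
        cross = rng.random(n_features) < CR       # binomial crossover mask
        cross[rng.integers(n_features)] = True    # keep at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        trial_score = fitness(to_mask(trial))
        if trial_score >= scores[i]:              # greedy one-to-one selection
            pop[i], scores[i] = trial, trial_score

best = to_mask(pop[scores.argmax()])
print(f"selected {best.sum()} of {n_features} features, fitness {scores.max():.3f}")
```

The weight alpha trades classification accuracy against subset size, the same two objectives the abstract says MDEFS balances, though MDEFS does so through its own mutation strategies and modified crossover and control parameters rather than this generic scheme.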