Feature Selection
Computer science
Artificial intelligence
Selection (genetic algorithm)
Pattern recognition (psychology)
Feature (linguistics)
Data mining
Mathematics
Machine learning
Linguistics
Philosophy
Identifier
DOI:10.1016/j.neucom.2024.127319
Abstract
In hierarchical classification learning, hierarchical feature selection plays an important role in overcoming the curse of dimensionality. Existing hierarchical feature selection algorithms based on the granular computing framework all use three basic search strategies to find similar and dissimilar classes, and compute the importance of features with respect to the global label for feature selection. However, existing methods based on the sibling strategy perform feature selection only at the fine-grained level, overlooking the fact that fine-grained classes are themselves progressively separated out of coarse-grained ones. These methods therefore miss the features hidden beneath the coarse granularity, yielding a top-heavy feature subset and losing many important features. To address this, this paper proposes a Hierarchical Feature Selection Based on Neighborhood Interclass Spacing From Fine to Coarse (HFSNIS) algorithm, which shifts feature selection to the coarse-grained hierarchy. The HFSNIS framework works as follows: First, each fine-grained leaf node is coarsened, from fine to coarse, up to the coarsest granularity hierarchy, where its non-root ancestor node is located. Next, similar and dissimilar nearest neighbors are searched for at this coarsest granularity hierarchy. Finally, features are filtered using the Neighborhood Interclass Spacing model to obtain a feature subset. By relying on this Coarsest Search Strategy (CSS), the HFSNIS algorithm can recover features that were previously ignored at the fine-grained hierarchy, resulting in a better feature subset. In experiments, the proposed algorithm outperforms seven state-of-the-art feature selection algorithms on six datasets.
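The abstract's three steps (coarsen each leaf to its coarsest non-root ancestor, search for nearest same-class and other-class neighbors at that granularity, score features by neighborhood interclass spacing) can be sketched roughly as below. This is a minimal illustration, not the paper's actual method: the function names, the L1 distance, and the Relief-style hit/miss scoring are all assumptions standing in for the Neighborhood Interclass Spacing model.

```python
import numpy as np

def coarsest_ancestor(label, parent, root):
    """Climb the class hierarchy until reaching the coarsest non-root ancestor."""
    node = label
    while parent[node] != root:
        node = parent[node]
    return node

def interclass_spacing_scores(X, y_coarse):
    """Relief-style sketch: reward features where the nearest other-class
    neighbor (miss) is farther than the nearest same-class neighbor (hit),
    with neighbors found at the coarsest granularity."""
    n, d = X.shape
    scores = np.zeros(d)
    for i in range(n):
        diffs = np.abs(X - X[i])             # per-feature distance to sample i
        dist = diffs.sum(axis=1)             # L1 distance (an assumption)
        dist[i] = np.inf                     # exclude the sample itself
        same = (y_coarse == y_coarse[i])
        hit = np.argmin(np.where(same, dist, np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, dist, np.inf))  # nearest other-class
        scores += diffs[miss] - diffs[hit]   # interclass minus intraclass spacing
    return scores / n

# Tiny hypothetical hierarchy: root -> {A, B}, A -> {a1, a2}, B -> {b1}
parent = {"root": None, "A": "root", "B": "root",
          "a1": "A", "a2": "A", "b1": "B"}
leaves = ["a1", "a2", "a1", "b1", "b1", "b1"]
y_coarse = np.array([coarsest_ancestor(l, parent, "root") for l in leaves])

# Feature 0 separates coarse classes A and B; feature 1 is noise.
X = np.array([[0.0, 0.5], [0.1, 0.4], [0.2, 0.6],
              [1.0, 0.5], [1.1, 0.45], [0.9, 0.55]])
scores = interclass_spacing_scores(X, y_coarse)
```

In this toy example the discriminative feature receives a higher score than the noise feature, mirroring the idea that searching at the coarsest granularity exposes features that separate coarse classes.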