Journal: IEEE Transactions on Artificial Intelligence [Institute of Electrical and Electronics Engineers]  Date: 2023-01-01  Pages: 1-14
Identifier
DOI:10.1109/tai.2023.3296092
Abstract
The K-Nearest Neighbor (KNN) algorithm classifies unlabeled samples according to the parameter $k$, a user-defined constant that usually depends on prior knowledge. The selection of $k$ is crucial, as the size of the sample neighborhood affects the classification accuracy. To tackle this issue, we introduce the adaptive KNN (AKNN), which constructs a decision tree to assign different $k$-values to different samples. In AKNN, we use sample label information to calculate the weights between samples. Furthermore, to extend AKNN to the multi-view scenario, we propose a method named multi-view adaptive KNN (MVAKNN), which integrates the information from each individual view using Dempster-Shafer (D-S) theory. We conduct experiments on three benchmark multi-view image datasets, and the results show that MVAKNN achieves desirable classification accuracy, outperforming several single-view and multi-view methods. Experiments with Gaussian noise demonstrate the robustness of the proposed method. The code for our paper is available at https://github.com/zzfan3/Multi-view-Adaptive-KNN .
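To make the two ideas in the abstract concrete, the sketch below shows (a) per-view KNN evidence expressed as mass functions over the class hypotheses and (b) their fusion with Dempster's rule. This is a minimal illustration under simplifying assumptions, not the paper's implementation: the function names are invented, the per-view neighborhood sizes `ks` are fixed stand-ins for the per-sample adaptive $k$ that the paper derives from a decision tree, the label-based sample weighting is omitted, and all mass is placed on singleton classes. The authors' actual code is in the linked repository.

```python
# Illustrative sketch only (assumed helpers, not the paper's AKNN/MVAKNN code).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def knn_class_masses(X_train, y_train, X_test, k, n_classes):
    """Per-view evidence: KNN posterior probabilities used as D-S masses
    over singleton class hypotheses (no mass on ignorance; a simplification)."""
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    proba = clf.predict_proba(X_test)            # shape: (n_test, n_seen_classes)
    masses = np.zeros((X_test.shape[0], n_classes))
    masses[:, clf.classes_] = proba              # align columns to global class ids
    return masses

def dempster_combine(m1, m2, eps=1e-12):
    """Dempster's rule restricted to singleton focal elements:
    element-wise product, renormalized by (1 - conflict)."""
    joint = m1 * m2
    norm = joint.sum(axis=1, keepdims=True)      # equals 1 - conflict
    return joint / np.maximum(norm, eps)

def multi_view_knn_fusion(views_train, y_train, views_test, ks, n_classes):
    """Fuse per-view KNN evidence with Dempster's rule and predict by max mass.
    `ks` holds one (fixed) neighborhood size per view."""
    fused = None
    for (X_tr, X_te), k in zip(zip(views_train, views_test), ks):
        m = knn_class_masses(X_tr, y_train, X_te, k, n_classes)
        fused = m if fused is None else dempster_combine(fused, m)
    return fused.argmax(axis=1)
```

In this simplified setting, Dempster's rule reduces to multiplying the per-view class masses and renormalizing, so views that agree reinforce each other while conflicting evidence is discounted by the normalization term.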