Computer science
Artificial intelligence
Pattern recognition
Convolutional neural network
Graph
Hyperspectral imaging
Kernel (convolution)
Fusion
Pixel
Feature extraction
Theoretical computer science
Mathematics
Combinatorics
Electrical engineering
Engineering
Authors
Hao Zhou,Fulin Luo,Huiping Zhuang,Zhenyu Weng,Xiuwen Gong,Zhiping Lin
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume/Pages: 61, 1-14
Citations: 67
Identifier
DOI: 10.1109/tgrs.2023.3265879
Abstract
Convolutional neural networks (CNNs) have made good progress in hyperspectral image (HSI) classification. Meanwhile, graph convolutional networks (GCNs) have also attracted considerable attention by using unlabeled data and broadly, explicitly exploiting correlations between adjacent parcels. However, a CNN with a fixed square convolution kernel is not flexible enough to handle irregular patterns, a GCN that relies on superpixels to reduce the number of nodes loses pixel-level features, and the features from either network alone are incomplete. In this paper, to make good use of the advantages of CNN and GCN, we propose a novel multiple-feature fusion model, termed attention multi-hop graph and multi-scale convolutional fusion network (AMGCFN), which consists of two sub-networks, a multi-scale fully convolutional network and a multi-hop GCN, to extract multi-level information from HSI. Specifically, the multi-scale fully convolutional network comprehensively captures pixel-level features with different kernel sizes, and a multi-head attention fusion module fuses the resulting multi-scale pixel-level features. The multi-hop GCN systematically aggregates multi-hop contextual information by applying multi-hop graphs at different layers to transform the relationships between nodes, and another multi-head attention fusion module combines the multi-hop features. Finally, we design a cross-attention fusion module to adaptively fuse the features of the two sub-networks. AMGCFN makes full use of multi-scale convolution and multi-hop graph features, which is conducive to learning multi-level contextual semantic features. Experimental results on three benchmark HSI datasets show that AMGCFN outperforms several state-of-the-art methods.
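To make the dual-branch design concrete, below is a minimal PyTorch-style sketch of the three ingredients the abstract describes: a multi-scale pixel-level CNN branch, a multi-hop graph branch, and a cross-attention fusion step. All module names, tensor shapes, hyperparameters, and the simplified scale/hop attention weights are illustrative assumptions, not the authors' released AMGCFN implementation.

```python
# Minimal sketch of a dual-branch CNN/GCN fusion classifier in the spirit of
# AMGCFN. Shapes, names, and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleCNN(nn.Module):
    """Pixel-level branch: parallel convolutions with different kernel sizes."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes]
        )
        # Learnable weights over scales; a stand-in for the paper's
        # multi-head attention fusion module.
        self.scale_attn = nn.Parameter(torch.zeros(len(kernel_sizes)))

    def forward(self, x):                                   # x: (B, C, H, W)
        feats = torch.stack([F.relu(b(x)) for b in self.branches])  # (S, B, F, H, W)
        w = torch.softmax(self.scale_attn, dim=0).view(-1, 1, 1, 1, 1)
        return (w * feats).sum(dim=0)                        # fused pixel-level features


class MultiHopGCN(nn.Module):
    """Superpixel-level branch: k-hop propagation A^k X W per hop setting."""
    def __init__(self, in_dim, out_dim, hops=(1, 2, 3)):
        super().__init__()
        self.hops = hops
        self.lins = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in hops])
        self.hop_attn = nn.Parameter(torch.zeros(len(hops)))

    def forward(self, x, adj):                               # x: (N, D), adj: normalized (N, N)
        outs = []
        for k, lin in zip(self.hops, self.lins):
            h = x
            for _ in range(k):                               # aggregate k-hop context
                h = adj @ h
            outs.append(F.relu(lin(h)))
        w = torch.softmax(self.hop_attn, dim=0).view(-1, 1, 1)
        return (w * torch.stack(outs)).sum(dim=0)            # fused multi-hop features


class CrossAttentionFusion(nn.Module):
    """Adaptively fuse pixel-level (CNN) and graph-level (GCN) features."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, cnn_feat, gcn_feat):                   # (B, N_pix, dim), (B, N_node, dim)
        # Pixel features query the graph features; the residual keeps pixel detail.
        fused, _ = self.attn(cnn_feat, gcn_feat, gcn_feat)
        return cnn_feat + fused
```

In the full model, superpixel-level graph features would be mapped back to pixels (e.g., via a pixel-to-superpixel assignment matrix) before cross attention, and a classification head would then predict the land-cover class of each pixel; those steps are omitted here for brevity.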