Hyperspectral imaging
Computer science
Artificial intelligence
Pattern recognition
Convolutional neural network
Feature extraction
Transformer
Engineering
Electrical engineering
Authors
Wenxuan Wang, Leiming Liu, Tianxiang Zhang, Jiachen Shen, Jing Wang, Jiangyun Li
Source
Journal: International Journal of Applied Earth Observation and Geoinformation
Date: 2022-09-01
Volume/Issue: 113: 103005
Citations: 22
Identifier
DOI: 10.1016/j.jag.2022.103005
Abstract
In recent years, convolutional neural networks have dominated downstream tasks on hyperspectral remote sensing images owing to their strong local feature extraction capability. However, convolution operations cannot effectively capture long-range dependencies, and repeatedly stacking convolutional layers to build a hierarchical structure can only alleviate, not fully solve, this problem. Meanwhile, the emergence of the Transformer addresses this limitation and provides an opportunity to capture long-distance dependencies between tokens. Although the Transformer has recently been introduced into the HSI classification field, most related works exploit only a single kind of spatial or spectral information and neglect to explore the optimal fusion method for these two different-level features. Therefore, to fully exploit the abundant spatial information and spectral correlations in HSIs in a highly effective and efficient way, we present the first attempt to explore the Transformer architecture in a dual-branch manner and propose a novel bilateral classification network named Hyper-ES2T. In addition, the Aggregated Feature Enhancement Module is proposed for effective feature aggregation and further spatial–spectral feature enhancement. Furthermore, to tackle the high computational cost of the vanilla self-attention block in the Transformer, we design the Efficient Multi-Head Self-Attention block, pursuing a trade-off between model accuracy and efficiency. The proposed Hyper-ES2T reaches new state-of-the-art performance and outperforms previous methods by a significant margin on four benchmark datasets for HSI classification, which demonstrates the powerful generalization ability and superior feature representation capability of our Hyper-ES2T.
We anticipate that this work provides novel insight into designing Transformer-based network architectures with superior performance and high model efficiency, which may inspire further research in this direction of the HSI processing field. The source code will be available at https://github.com/Wenxuan-1119/Hyper-ES2T.
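The abstract does not spell out how the Efficient Multi-Head Self-Attention block reduces cost. One common way to make self-attention cheaper (used, e.g., in spatial-reduction attention) is to downsample the key and value tokens before computing attention, shrinking the attention matrix from n×n to n×(n/r). The sketch below is a minimal NumPy illustration of that general idea, not the paper's actual block; the pooling scheme and random projection weights are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def efficient_mhsa(x, num_heads=4, reduction=4, seed=0):
    """Multi-head self-attention with spatially reduced keys/values.

    x: (n_tokens, dim). Keys/values are average-pooled over groups of
    `reduction` tokens, so the attention matrix is n x (n/reduction)
    instead of n x n -- the efficiency idea assumed here.
    """
    n, d = x.shape
    dh = d // num_heads
    rng = np.random.default_rng(seed)
    # random projections stand in for learned weight matrices
    wq, wk, wv, wo = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4))
    # pool groups of `reduction` tokens to shrink the key/value sequence
    kv = x.reshape(n // reduction, reduction, d).mean(axis=1)
    # project and split into heads: (heads, tokens, dh)
    q = (x @ wq).reshape(n, num_heads, dh).transpose(1, 0, 2)
    k = (kv @ wk).reshape(-1, num_heads, dh).transpose(1, 0, 2)
    v = (kv @ wv).reshape(-1, num_heads, dh).transpose(1, 0, 2)
    # scaled dot-product attention: (heads, n, n/reduction)
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(dh))
    out = (attn @ v).transpose(1, 0, 2).reshape(n, d)
    return out @ wo

tokens = np.random.default_rng(1).standard_normal((64, 32))
y = efficient_mhsa(tokens)
print(y.shape)  # (64, 32)
```

With reduction=4 the attention matrix has 64×16 entries instead of 64×64, a 4× saving that grows with sequence length; the output shape is unchanged, so such a block can drop into a standard Transformer layer.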