
A Multilevel Multimodal Fusion Transformer for Remote Sensing Semantic Segmentation

Keywords: computer science, segmentation, fusion, transformer, remote sensing, image segmentation, artificial intelligence, computer vision, pattern recognition (psychology), geology, engineering, electrical engineering, philosophy, linguistics, voltage
Authors
Xianping Ma,Xiaokang Zhang,Man-On Pun,Ming Liu
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing [Institute of Electrical and Electronics Engineers]
Volume 62, pp. 1-15; Cited by: 44
Identifier
DOI:10.1109/tgrs.2024.3373033
Abstract

Accurate semantic segmentation of remote sensing data plays a crucial role in the success of geoscience research and applications. Recently, multimodal fusion-based segmentation models have attracted much attention due to their outstanding performance compared with conventional single-modal techniques. However, most of these models perform their fusion operation using convolutional neural networks (CNNs) or the vision transformer (ViT), resulting in insufficient local-global contextual modeling and representative capabilities. In this work, a multilevel multimodal fusion scheme called FTransUNet is proposed to provide a robust and effective multimodal fusion backbone for semantic segmentation by integrating both CNN and ViT into one unified fusion framework. First, shallow-level features are extracted and fused through convolutional layers and shallow-level feature fusion (SFF) modules. After that, deep-level features characterizing semantic information and spatial relationships are extracted and fused by a well-designed Fusion ViT (FViT). It applies Adaptively Mutually Boosted Attention (Ada-MBA) layers and Self-Attention (SA) layers alternately in a three-stage scheme to learn cross-modality representations of high inter-class separability and low intra-class variations. Specifically, the proposed Ada-MBA computes SA and Cross-Attention (CA) in parallel to enhance intra- and cross-modality contextual information simultaneously while steering the attention distribution towards semantic-aware regions. As a result, FTransUNet fuses shallow-level and deep-level features in a multilevel manner, taking full advantage of CNN and transformer to accurately characterize local details and global semantics, respectively. Extensive experiments confirm the superior performance of the proposed FTransUNet compared with other multimodal fusion approaches on two fine-resolution remote sensing datasets, namely ISPRS Vaihingen and Potsdam. The source code for this work is available at https://github.com/sstary/SSRS.
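
The abstract describes the Ada-MBA mechanism only at a high level. Below is a minimal, illustrative PyTorch sketch of an Ada-MBA-style layer, assuming two token streams of equal dimension (for example, optical imagery and a DSM branch) and a simple learnable scalar to combine the parallel self-attention and cross-attention outputs. The module and parameter names (AdaMBASketch, alpha_a, alpha_b) are assumptions for illustration only and do not reproduce the authors' implementation, which is available at https://github.com/sstary/SSRS.

# Illustrative sketch, not the authors' implementation: for two modality
# token streams, self-attention (intra-modality) and cross-attention
# (cross-modality) are computed in parallel and adaptively combined.
import torch
import torch.nn as nn


class AdaMBASketch(nn.Module):
    """Parallel self- and cross-attention over two modality token streams (hypothetical)."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.self_attn_a = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.self_attn_b = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.cross_attn_a = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.cross_attn_b = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Learnable scalars that adaptively weight self- vs. cross-attention;
        # this is a simplifying assumption, not the paper's exact boosting rule.
        self.alpha_a = nn.Parameter(torch.tensor(0.5))
        self.alpha_b = nn.Parameter(torch.tensor(0.5))
        self.norm_a = nn.LayerNorm(dim)
        self.norm_b = nn.LayerNorm(dim)

    def forward(self, tok_a: torch.Tensor, tok_b: torch.Tensor):
        # tok_a, tok_b: (batch, tokens, dim) token sequences from the two modalities.
        sa_a, _ = self.self_attn_a(tok_a, tok_a, tok_a)   # intra-modality context
        sa_b, _ = self.self_attn_b(tok_b, tok_b, tok_b)
        ca_a, _ = self.cross_attn_a(tok_a, tok_b, tok_b)  # cross-modality context
        ca_b, _ = self.cross_attn_b(tok_b, tok_a, tok_a)
        out_a = self.norm_a(tok_a + self.alpha_a * sa_a + (1 - self.alpha_a) * ca_a)
        out_b = self.norm_b(tok_b + self.alpha_b * sa_b + (1 - self.alpha_b) * ca_b)
        return out_a, out_b


if __name__ == "__main__":
    layer = AdaMBASketch(dim=256, num_heads=8)
    optical = torch.randn(2, 196, 256)   # e.g. tokens from the optical branch
    dsm = torch.randn(2, 196, 256)       # e.g. tokens from the DSM branch
    fused_a, fused_b = layer(optical, dsm)
    print(fused_a.shape, fused_b.shape)  # torch.Size([2, 196, 256]) each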