
Model-Informed Multistage Unsupervised Network for Hyperspectral Image Super-Resolution

Keywords: Hyperspectral imaging, Interpretability, Computer science, Deep learning, Artificial intelligence, Generalization, Image (mathematics), Multispectral image, Pattern recognition, Data mining, Mathematics, Mathematical analysis
Authors
Jiaxin Li,Ke Zheng,Lianru Gao,Li Ni,Min Huang,Jocelyn Chanussot
Source
期刊:IEEE Transactions on Geoscience and Remote Sensing [Institute of Electrical and Electronics Engineers]
Volume: 62, Pages: 1-17. Cited by: 191
Identifier
DOI:10.1109/tgrs.2024.3391014
Abstract

By fusing a low-resolution hyperspectral image (LrHSI) with an auxiliary high-resolution multispectral image (HrMSI), hyperspectral image super-resolution (HISR) can generate a high-resolution hyperspectral image (HrHSI) economically. Despite the promising performance achieved by deep learning (DL), two challenges remain. First, most DL-based methods rely heavily on large-scale training triplets, which limits their generalization and practicality in real-world scenarios. Second, existing methods pursue higher performance by assembling complex structures from off-the-shelf components while ignoring the information inherent in the degradation model, leading to insufficient integration of domain knowledge and lower interpretability. To address these drawbacks, we propose a model-informed multistage unsupervised network, M2U-Net for short, which leverages both the deep image prior (DIP) and degradation-model information. M2U-Net is built on a three-stage scheme: degradation information learning (DIL), initialized image establishment (IIE), and deep image generation (DIG). The first stage exploits the deep information of the degradation model via a tiny network whose parameters and outputs serve as guidance for the following two stages. Instead of feeding uninformed noise as input to stage three, the IIE stage establishes an initialized input with expressive HrHSI-relevant information via a spectral mapping learning network, thus facilitating the extraction of prior information and further magnifying the potential of DIP for high-quality reconstruction. Finally, we propose a dual U-shape network as a powerful regularizer to capture image statistics, in which two U-Nets are coupled by a cross-attention guidance (CAG) module to separately perform spatial feature extraction and final image generation. The CAG module incorporates abundant spatial information into the reconstruction process and hence guides the network toward a more plausible generation. Extensive experiments demonstrate the effectiveness of the proposed M2U-Net in terms of quantitative evaluation and visual quality. The code will be available at https://github.com/JiaxinLiCAS.
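The CAG module described above lets spatial features from one U-Net guide image generation in the other, which builds on the standard cross-attention pattern: queries come from the generation branch, keys and values from the spatial branch. The following is a minimal NumPy sketch of that generic pattern, not the authors' implementation (their code is at the linked repository); all names (`cross_attention`, `gen_feat`, `spat_feat`, `d_k`) are hypothetical.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def cross_attention(gen_feat, spat_feat, d_k=16, seed=0):
    """Generic cross-attention: spatial-branch tokens (keys/values)
    guide generation-branch tokens (queries).

    gen_feat:  (N, C) tokens from the image-generation branch
    spat_feat: (M, C) tokens from the spatial-extraction branch
    Returns:   (N, C) attended features, same shape as the queries.
    """
    rng = np.random.default_rng(seed)  # fixed random projections for the sketch
    C = gen_feat.shape[1]
    Wq = rng.standard_normal((C, d_k)) / np.sqrt(C)
    Wk = rng.standard_normal((C, d_k)) / np.sqrt(C)
    Wv = rng.standard_normal((C, C)) / np.sqrt(C)
    Q, K, V = gen_feat @ Wq, spat_feat @ Wk, spat_feat @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))  # (N, M) guidance weights
    return attn @ V                          # (N, C) guided features


# Toy usage: 64 generation tokens attend over 100 spatial tokens.
out = cross_attention(np.ones((64, 32)), np.ones((100, 32)))
print(out.shape)  # (64, 32)
```

In a trained module the projection matrices would be learned parameters and the tokens would be flattened feature maps; the sketch only shows how the spatial branch's values are mixed into the generation branch under attention weights.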