
Learning a 3D-CNN and Transformer prior for hyperspectral image super-resolution

Keywords: Hyperspectral imaging · Super-resolution · Prior · Convolutional neural network · 3D-CNN · Transformer · Deep learning · Regularization · Pattern recognition · Source code
Authors
Qing Ma, Junjun Jiang, Xianming Liu, Jiayi Ma
Source
Journal: Information Fusion (Elsevier)
Volume 100, Article 101907 · Cited by: 6
Identifier
DOI: 10.1016/j.inffus.2023.101907
Abstract

To address the ill-posed problem of hyperspectral image super-resolution (HSISR), a commonly employed technique is to design a regularization term based on the prior information of hyperspectral images (HSIs) to effectively constrain the objective function. Traditional model-based methods that rely on manually crafted priors are insufficient to fully characterize the properties of HSIs. Learning-based methods usually use a convolutional neural network (CNN) to learn the implicit priors of HSIs. However, the learning ability of a CNN is limited: it considers only the spatial characteristics of HSIs while ignoring the spectral characteristics, and convolution is not effective at modeling long-range dependencies, leaving considerable room for improvement. In this paper, we propose a novel HSISR method that leverages the Transformer architecture instead of a CNN to learn the prior of HSIs. Specifically, we employ the proximal gradient algorithm to solve the HSISR model and simulate the iterative solution process with an unfolding network. The self-attention layer of the Transformer enables global spatial interaction, while a 3D-CNN is added behind the Transformer layers to better capture the spatio-spectral correlation of HSIs. Both quantitative and visual results on three widely used HSI datasets and a real-world dataset demonstrate that the proposed method achieves a considerable gain over all mainstream algorithms, including the most competitive conventional methods and recently proposed deep learning-based methods. The source code and trained models are publicly available at https://github.com/qingma2016/3DT-Net.
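To make the unfolding scheme described above concrete, the following is a minimal PyTorch sketch (not the authors' released code) of one proximal-gradient stage: a gradient step on a data-fidelity term followed by a learned prior built from spatial self-attention and a trailing 3D convolution. The degradation operator (bilinear down/up-sampling), the learnable step size, and all module names and hyperparameters here are illustrative assumptions; the official implementation is at the repository linked above.

```python
# Sketch of one deep-unfolding stage for HSISR, assuming a bilinear
# degradation model and a residual Transformer + 3D-CNN prior.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransformerPrior(nn.Module):
    """Learned prior: global spatial self-attention, then a 3D conv
    over (band, height, width) for spatio-spectral correlation."""

    def __init__(self, bands: int, dim: int = 64, heads: int = 4):
        super().__init__()
        self.embed = nn.Conv2d(bands, dim, kernel_size=3, padding=1)
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.proj = nn.Conv2d(dim, bands, kernel_size=3, padding=1)
        self.conv3d = nn.Conv3d(1, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        feat = self.embed(x)                              # (B, D, H, W)
        tokens = self.norm(feat.flatten(2).transpose(1, 2))  # (B, H*W, D)
        attn_out, _ = self.attn(tokens, tokens, tokens)   # global spatial interaction
        feat = attn_out.transpose(1, 2).reshape(b, -1, h, w)
        res = self.proj(feat)                             # back to band space
        res = self.conv3d(res.unsqueeze(1)).squeeze(1)    # spatio-spectral refinement
        return x + res                                    # residual prior update


class UnfoldingStage(nn.Module):
    """One proximal-gradient iteration: data-term gradient step,
    then the learned prior acting as the proximal mapping."""

    def __init__(self, bands: int, scale: int):
        super().__init__()
        self.scale = scale
        self.step = nn.Parameter(torch.tensor(0.5))       # learnable step size (assumption)
        self.prior = TransformerPrior(bands)

    def forward(self, x: torch.Tensor, y_lr: torch.Tensor) -> torch.Tensor:
        # Gradient of 0.5 * ||D(x) - y||^2 with D approximated by bilinear
        # downsampling and D^T by bilinear upsampling (sketch assumption).
        down = F.interpolate(x, scale_factor=1 / self.scale,
                             mode="bilinear", align_corners=False)
        grad = F.interpolate(down - y_lr, scale_factor=self.scale,
                             mode="bilinear", align_corners=False)
        x = x - self.step * grad
        return self.prior(x)


if __name__ == "__main__":
    bands, scale = 31, 4
    y_lr = torch.randn(1, bands, 16, 16)                  # low-resolution HSI
    x = F.interpolate(y_lr, scale_factor=scale,
                      mode="bilinear", align_corners=False)  # initial estimate
    stages = nn.ModuleList(UnfoldingStage(bands, scale) for _ in range(3))
    for stage in stages:                                  # unrolled iterations
        x = stage(x, y_lr)
    print(x.shape)                                        # torch.Size([1, 31, 64, 64])
```

Each unrolled stage shares the same structure but carries its own parameters, so the network learns both the step sizes and the prior end-to-end rather than hand-crafting a regularizer.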