RKformer: Runge-Kutta Transformer with Random-Connection Attention for Infrared Small Target Detection

Authors
Mingjin Zhang,Haichen Bai,Jing Zhang,Rui Zhang,Chaoyue Wang,Jie Guo,Xinbo Gao
Identifier
DOI:10.1145/3503161.3547817
Abstract

Infrared small target detection (IRSTD) refers to segmenting small targets from infrared images, which is of great significance in practical applications. However, due to the small scale of the targets as well as noise and clutter in the background, current deep neural network-based methods struggle to extract features with discriminative semantics while preserving fine details. In this paper, we address this problem by proposing a novel RKformer model with an encoder-decoder structure, in which four specifically designed Runge-Kutta transformer (RKT) blocks are stacked sequentially in the encoder. Technically, it has three key designs. First, we adopt a parallel encoder block (PEB) of transformer and convolution branches to exploit their respective strengths in long-range dependency modeling and locality modeling, extracting semantics while preserving details. Second, we propose a novel random-connection attention (RCA) block, which has a reservoir structure that learns sparse attention via random connections during training. RCA encourages the target to attend to a few relevant positions instead of all the large-area background pixels, yielding more informative attention scores; it has fewer parameters and computations than the original self-attention in the transformer while performing better. Third, inspired by neural ordinary differential equations (ODEs), we stack two PEBs with several residual connections as the basic encoder block to implement the Runge-Kutta method for solving ODEs, which effectively enhances features and suppresses noise. Experiments on the public NUAA-SIRST and IRSTD-1k datasets demonstrate the superiority of RKformer over state-of-the-art methods.
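The random-connection attention idea described above can be illustrated with a toy sketch. This is not the paper's RCA block (the paper learns its sparse reservoir connections during training); it is a minimal NumPy illustration, under the assumption that sparsity takes the form of a fixed random mask restricting each query to a small subset of key positions. The function name, `keep_prob`, and the self-connection rule are all illustrative choices, not from the paper.

```python
import numpy as np

def random_connection_attention(q, k, v, keep_prob=0.25, seed=0):
    """Toy sparse attention: each query attends only to a random
    subset of key positions, so the softmax runs over a few
    positions instead of every background pixel.
    q, k, v: arrays of shape (n, d)."""
    rng = np.random.default_rng(seed)
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)
    # Random connection mask; always keep the self-connection so
    # every row of the softmax has at least one finite score.
    mask = rng.random((n, n)) < keep_prob
    mask |= np.eye(n, dtype=bool)
    scores = np.where(mask, scores, -np.inf)
    # Numerically stable softmax over the surviving connections.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v
```

With `keep_prob=0.25`, each query mixes roughly a quarter of the value rows, which mirrors the abstract's point that attending to sparse relevant positions needs fewer computations than full self-attention.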
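The third design point rests on the known analogy between residual networks and ODE solvers: a residual block x + f(x) is one Euler step, and a second-order Runge-Kutta (Heun) step combines two evaluations of f through residual connections, which is the numerical pattern behind stacking two PEBs. The sketch below shows the plain Heun step on a scalar ODE; it is a hedged illustration of the analogy, not the paper's RKT block, and `heun_step` is a hypothetical name.

```python
import math

def heun_step(f, x, h):
    """One Heun (RK2) step: two evaluations of f combined by
    residual connections, analogous to stacking two residual
    blocks that share the step update."""
    k1 = f(x)
    k2 = f(x + h * k1)  # second evaluation sees the Euler-updated state
    return x + 0.5 * h * (k1 + k2)

# Example ODE: dx/dt = -x, exact solution x(t) = exp(-t).
x = 1.0
for _ in range(10):
    x = heun_step(lambda v: -v, x, 0.1)
# x ≈ 0.3686, close to exp(-1) ≈ 0.3679
```

The extra evaluation makes the update second-order accurate, which is the numerical intuition behind the abstract's claim that the Runge-Kutta structure enhances features and suppresses noise compared with a single residual (Euler) step.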