RKformer: Runge-Kutta Transformer with Random-Connection Attention for Infrared Small Target Detection

Authors
Mingjin Zhang,Haichen Bai,Jing Zhang,Rui Zhang,Chaoyue Wang,Jie Guo,Xinbo Gao
Identifier
DOI:10.1145/3503161.3547817
Abstract

Infrared small target detection (IRSTD) refers to segmenting small targets from infrared images, which is of great significance in practical applications. However, due to the small scale of targets as well as noise and clutter in the background, current deep neural network-based methods struggle to extract features with discriminative semantics while preserving fine details. In this paper, we address this problem by proposing a novel RKformer model with an encoder-decoder structure, where four specifically designed Runge-Kutta transformer (RKT) blocks are stacked sequentially in the encoder. Technically, it has three key designs. First, we adopt a parallel encoder block (PEB) that combines a transformer and a convolution branch to exploit their respective advantages in long-range dependency modeling and locality modeling for extracting semantics and preserving details. Second, we propose a novel random-connection attention (RCA) block, which has a reservoir structure to learn sparse attention via random connections during training. RCA encourages the target to attend to sparse relevant positions instead of all the large-area background pixels, resulting in more informative attention scores. It has fewer parameters and computations than the original self-attention in the transformer while performing better. Third, inspired by neural ordinary differential equations (ODEs), we stack two PEBs with several residual connections as the basic encoder block to implement the Runge-Kutta method for solving ODEs, which effectively enhances features and suppresses noise. Experiments on the public NUAA-SIRST and IRSTD-1k datasets demonstrate the superiority of the RKformer over state-of-the-art methods.
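To make the three designs described above more concrete, the following is a minimal PyTorch sketch and not the authors' implementation: the PEB is reduced to a standard multi-head self-attention branch plus a convolution branch (the paper's random-connection attention with its reservoir structure is replaced by ordinary self-attention as a placeholder), and the RKT block combines two PEBs with residual connections as a second-order Runge-Kutta (Heun-style) update. The module names, layer sizes, and the fusion by summation used here are assumptions for illustration.

# Minimal sketch of the PEB / RKT ideas from the abstract (assumptions noted inline).
import torch
import torch.nn as nn


class PEB(nn.Module):
    """Parallel encoder block: a transformer branch (long-range dependencies)
    alongside a convolution branch (local details)."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(channels)
        # Placeholder for the paper's RCA block (sparse, randomly connected
        # attention); standard multi-head self-attention is used here instead.
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Transformer branch on flattened spatial tokens.
        tokens = self.norm(x.flatten(2).transpose(1, 2))        # (B, H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        attn_out = attn_out.transpose(1, 2).reshape(b, c, h, w)
        # Convolution branch on the original feature map.
        conv_out = self.conv(x)
        # Fuse the two branches (simple sum; the paper may fuse differently).
        return attn_out + conv_out


class RKTBlock(nn.Module):
    """Runge-Kutta transformer block: two PEBs with residual connections,
    read as a second-order Runge-Kutta step
    x_{n+1} = x_n + (k1 + k2) / 2, with k1 = f(x_n), k2 = f(x_n + k1)."""

    def __init__(self, channels: int):
        super().__init__()
        self.peb1 = PEB(channels)
        self.peb2 = PEB(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k1 = self.peb1(x)
        k2 = self.peb2(x + k1)
        return x + 0.5 * (k1 + k2)


if __name__ == "__main__":
    feats = torch.randn(2, 32, 64, 64)   # toy infrared feature map
    block = RKTBlock(channels=32)
    print(block(feats).shape)            # torch.Size([2, 32, 64, 64])

In this reading, each PEB plays the role of the vector field f in x' = f(x): a plain residual block corresponds to the one-evaluation Euler step x_n + f(x_n), while the RKT block averages two evaluations, which is one way the ODE view can smooth the feature update and suppress noise.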