RKformer: Runge-Kutta Transformer with Random-Connection Attention for Infrared Small Target Detection

Authors
Mingjin Zhang,Haichen Bai,Jing Zhang,Rui Zhang,Chaoyue Wang,Jie Guo,Xinbo Gao
Identifier
DOI:10.1145/3503161.3547817
Abstract

Infrared small target detection (IRSTD) refers to segmenting small targets from infrared images, which is of great significance in practical applications. However, due to the small scale of targets as well as noise and clutter in the background, current deep neural network-based methods struggle to extract features with discriminative semantics while preserving fine details. In this paper, we address this problem by proposing a novel RKformer model with an encoder-decoder structure, where four specifically designed Runge-Kutta transformer (RKT) blocks are stacked sequentially in the encoder. Technically, it has three key designs. First, we adopt a parallel encoder block (PEB) of transformer and convolution to exploit their respective advantages in long-range dependency modeling and locality modeling, extracting semantics while preserving details. Second, we propose a novel random-connection attention (RCA) block, which has a reservoir structure that learns sparse attention via random connections during training. RCA encourages the target to attend to a sparse set of relevant positions instead of all the large-area background pixels, resulting in more informative attention scores. It has fewer parameters and computations than the original self-attention in the transformer while performing better. Third, inspired by neural ordinary differential equations (ODEs), we stack two PEBs with several residual connections as the basic encoder block to implement the Runge-Kutta method for solving ODEs, which effectively enhances features and suppresses noise. Experiments on the public NUAA-SIRST and IRSTD-1k datasets demonstrate the superiority of RKformer over state-of-the-art methods.
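The third design point (stacking two PEBs with residual connections to realize a Runge-Kutta step) can be illustrated with a minimal numerical sketch. The snippet below implements a second-order Runge-Kutta (Heun) update, assuming each slope evaluation corresponds to one PEB; the block names `peb1`/`peb2`, the step size `h`, and the exact residual wiring are illustrative assumptions, not the paper's verbatim architecture.

```python
import numpy as np

def heun_block(x, peb1, peb2, h=1.0):
    """Second-order Runge-Kutta (Heun) residual update from two blocks.

    x      : input feature array
    peb1/2 : callables standing in for the two parallel encoder blocks
             (hypothetical stand-ins; the actual PEB combines
             transformer attention and convolution)
    h      : ODE solver step size (assumed hyperparameter)
    """
    k1 = peb1(x)                        # slope at the current state
    k2 = peb2(x + h * k1)               # slope after a full Euler step
    return x + (h / 2.0) * (k1 + k2)    # averaged residual update

# Toy usage with a linear "block" f(x) = -0.5 x, whose ODE solution decays.
f = lambda x: -0.5 * x
x0 = np.ones(4)
x1 = heun_block(x0, f, f, h=1.0)  # each entry becomes 0.625
```

With `h = 1` and identical blocks, the update reduces to `x + (F(x) + F(x + F(x))) / 2`, i.e. an ordinary residual connection plus a correction term, which is how higher-order ODE solvers generalize ResNet-style skips.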