Sparse self-attention transformer for image inpainting

Keywords: image inpainting · transformer · computer science · artificial intelligence · computer vision · pattern recognition · image (mathematics)
Authors
Wenli Huang, Ye Deng, S. Hui, Yang Wu, Sanping Zhou, Jinjun Wang
Source
Journal: Pattern Recognition [Elsevier BV]
Volume 145, Article 109897 · Cited by: 42
Identifier
DOI: 10.1016/j.patcog.2023.109897
Abstract

Learning-based image inpainting methods have made remarkable progress in recent years. Nevertheless, these methods still suffer from issues such as blurring, artifacts, and inconsistent content. One of the main causes of these problems is the use of vanilla convolution kernels, which have limited receptive fields and spatially invariant kernel coefficients. In contrast, the multi-headed attention in the transformer can effectively model non-local relations among input features by generating adaptive attention scores. This paper therefore explores the feasibility of employing the transformer model for the image inpainting task. However, multi-headed attention transformer blocks pose a significant challenge due to their overwhelming computational cost. To address this issue, we propose a novel U-Net style transformer-based network for the inpainting task, called the sparse self-attention transformer (Spa-former). The Spa-former retains the long-range modeling capacity of transformer blocks while reducing the computational burden. It incorporates a new channel attention approximation algorithm that reduces attention calculation to linear complexity, and it replaces the canonical softmax function with the ReLU function to generate a sparse attention map that effectively excludes irrelevant features. As a result, the Spa-former achieves effective long-range feature modeling with fewer parameters and lower computational cost. Our empirical results on challenging benchmarks demonstrate the superior performance of the proposed Spa-former over state-of-the-art approaches.
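The two mechanisms named in the abstract lend themselves to a compact illustration: computing attention between channels keeps the attention map at (channels × channels) size per head, so the cost grows linearly with the number of pixels rather than quadratically, and replacing softmax with ReLU zeroes out negative correlations, leaving a sparse map. The following is a minimal PyTorch sketch of those two ideas only, not the authors' released implementation; the module name, head count, and the row-normalization applied after ReLU are our assumptions.

```python
# Minimal sketch of channel-wise ("transposed") attention with ReLU-induced
# sparsity, as described in the abstract. Names and hyperparameters are
# illustrative assumptions, not the paper's official code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseChannelAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Conv2d(dim, dim * 3, kernel_size=1, bias=False)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1, bias=False)
        # Learnable per-head temperature, a common choice for channel attention.
        self.temperature = nn.Parameter(torch.ones(num_heads, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)

        # Reshape to (batch, heads, channels_per_head, spatial). Attention is
        # taken between channels, so the map is (c/heads x c/heads) per head
        # and the cost is linear in h*w instead of quadratic.
        def heads(t):
            return t.reshape(b, self.num_heads, c // self.num_heads, h * w)

        q, k, v = heads(q), heads(k), heads(v)
        q = F.normalize(q, dim=-1)
        k = F.normalize(k, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * self.temperature

        # ReLU in place of softmax: negative (irrelevant) correlations are
        # zeroed, yielding a sparse attention map. The row-normalization that
        # rescales the surviving weights is an assumption of this sketch.
        attn = F.relu(attn)
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-6)

        out = (attn @ v).reshape(b, c, h, w)
        return self.proj(out)

x = torch.randn(1, 64, 32, 32)
print(SparseChannelAttention(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```

Note the contrast with canonical spatial self-attention, whose (h*w × h*w) map makes high-resolution inpainting prohibitively expensive; here the map size depends only on the channel count.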