Authors
Yan-Ting Hu, Yuanfei Huang, Kaibing Zhang
Identifier
DOI:10.1016/j.knosys.2023.110718
Abstract
Efficient image super-resolution (SR), preferred in resource-constrained scenarios, aims at not only higher super-resolving accuracy but also lower computational complexity. Given the perception capability of deep networks, efficiently and effectively obtaining a large receptive field is a key principle for this task. In this paper, we therefore integrate a multi-scale receptive-field design with an information distillation structure and an attention mechanism, and develop a lightweight Multi-Scale Information Distillation (MSID) network. Specifically, we design a multi-scale feature distillation (MSFD) block that employs multi-scale convolutions with different kernel sizes in the feature distillation connection, which effectively distills information from multiple receptive fields at low computational cost for better feature refinement. Moreover, we construct a scalable large kernel attention (SLKA) block by scaling the attentive fields across network layers, which provides a large and scalable receptive field in attention to discriminatively enhance the distilled features. Extensive quantitative and qualitative evaluations on benchmark datasets validate the effectiveness of each proposed component and demonstrate the superiority of our MSID network over state-of-the-art efficient SR methods. The code is available at https://github.com/YuanfeiHuang/MSID.
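To make the two proposed components concrete, below is a minimal PyTorch sketch of what an MSFD-style block and an SLKA-style block could look like, based only on the abstract's description. All channel counts, kernel sizes, the distillation split, and the dilation-scaling rule are illustrative assumptions, not the authors' configuration; the actual implementation is in the linked repository.

```python
import torch
import torch.nn as nn


class MSFDBlock(nn.Module):
    """Sketch of a multi-scale feature distillation block.

    At each step, a cheap 1x1 conv 'distills' part of the features while
    a larger conv refines the rest; successive steps use different kernel
    sizes (3, 5, 7 here, assumed) to gather multiple receptive fields.
    """

    def __init__(self, channels=48, kernels=(3, 5, 7)):
        super().__init__()
        self.d = channels // 2  # distilled channels per step (assumed ratio)
        self.distill = nn.ModuleList(
            [nn.Conv2d(channels, self.d, 1) for _ in kernels]
        )
        self.refine = nn.ModuleList(
            [nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernels]
        )
        self.act = nn.LeakyReLU(0.05, inplace=True)
        # fuse the concatenated distilled features plus the final refined map
        self.fuse = nn.Conv2d(self.d * len(kernels) + channels, channels, 1)

    def forward(self, x):
        distilled, feat = [], x
        for distill, refine in zip(self.distill, self.refine):
            distilled.append(self.act(distill(feat)))  # cheap distilled branch
            feat = self.act(refine(feat))              # multi-scale refinement
        out = self.fuse(torch.cat(distilled + [feat], dim=1))
        return out + x  # residual connection


class SLKABlock(nn.Module):
    """Sketch of a scalable large kernel attention block.

    Uses the common large-kernel-attention decomposition (depthwise conv,
    depthwise dilated conv, pointwise conv) and grows the dilation with
    the layer index so deeper layers attend over wider fields. The
    scaling rule below is an assumption for illustration.
    """

    def __init__(self, channels=48, layer_index=0):
        super().__init__()
        dilation = 2 + layer_index  # assumed per-layer scaling rule
        self.local = nn.Conv2d(channels, channels, 5, padding=2,
                               groups=channels)
        self.spread = nn.Conv2d(channels, channels, 7,
                                padding=3 * dilation, dilation=dilation,
                                groups=channels)
        self.point = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        attn = self.point(self.spread(self.local(x)))
        return x * attn  # attentive reweighting of the distilled features


if __name__ == "__main__":
    # Shape check: both blocks preserve spatial size and channel count.
    x = torch.randn(1, 48, 32, 32)
    body = nn.Sequential(MSFDBlock(48), SLKABlock(48, layer_index=2))
    print(body(x).shape)  # torch.Size([1, 48, 32, 32])
```

In this reading, the 1x1 distillation branches keep the per-step cost low while the growing refinement kernels supply the multi-scale receptive fields, and the dilated attention map reweights the fused features channel- and position-wise; a full SR network would stack several such pairs and end with a pixel-shuffle upsampler.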