Authors
Xiaoqiang Zhou, Huaibo Huang, Zilei Wang, Ran He
Identifier
DOI: 10.1109/TMM.2024.3352400
Abstract
Many recent image restoration methods use a Transformer as the backbone network and redesign the Transformer blocks. In contrast, we explore the parameter-sharing mechanism over Transformer blocks and propose a dynamic recursive process to address the image super-resolution task efficiently. We first present a Recursive Image Super-resolution Transformer (RIST). By sharing the weights across different blocks, a plain forward pass through the whole Transformer network can be folded into recursive iterations through a single Transformer block. Such a parameter-sharing based recursive process not only reduces the model size greatly, but also enables restoring images progressively. Features in the recursive process are modeled as a sequence and propagated with a temporal attention network. Furthermore, by analyzing the prediction variation across different iterations in RIST, we design a dynamic recursive process that allocates adaptive computation costs to different samples. Specifically, a quality assessment network estimates the restoration quality and terminates the recursive process dynamically. We propose a relativistic learning strategy that simplifies the objective from absolute image quality assessment to relativistic quality comparison. The proposed Recursive Image Super-resolution Transformer with Relativistic Assessment (RISTRA) greatly reduces the model size through the parameter-sharing mechanism, and achieves an instance-wise dynamic restoration process as well. Extensive experiments on several image super-resolution benchmarks show the superiority of our approach over state-of-the-art counterparts.
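The core idea of the abstract, folding a deep network into repeated applications of one shared block and stopping per sample once a relativistic quality comparison shows no further gain, can be illustrated with a minimal toy sketch. This is a hypothetical illustration, not the authors' implementation: `shared_block`, `quality`, and the scalar "image" are stand-ins, and the real method uses a learned Transformer block and a learned assessment network.

```python
# Toy sketch of a parameter-shared recursive process with dynamic termination
# (hypothetical; stands in for RISTRA's Transformer block and quality network).

def shared_block(x, weight=0.5):
    """One parameter-shared refinement step: the SAME function (weights)
    is applied at every iteration, pulling x toward a fixed target."""
    target = 1.0  # stand-in for the clean, high-resolution image
    return x + weight * (target - x)

def quality(x):
    """Toy quality score: higher is better (negative distance to target)."""
    return -abs(1.0 - x)

def dynamic_recursive_restore(x, max_iters=10, tol=1e-3):
    """Iterate the shared block; a relativistic comparison (is the next
    iterate better than the current one?) terminates the recursion
    per sample instead of running a fixed depth."""
    steps = 0
    for i in range(max_iters):
        steps = i + 1
        nxt = shared_block(x)
        if quality(nxt) - quality(x) < tol:  # no meaningful gain: stop early
            break
        x = nxt
    return x, steps

restored, steps = dynamic_recursive_restore(0.0)
```

Easy samples would trigger the stopping condition after few iterations, while harder samples consume more of the iteration budget, which is the instance-wise adaptive computation cost the abstract describes.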