Resolution (logic)
Binary number
Image (mathematics)
Computer science
Artificial intelligence
Computer vision
Mathematics
Arithmetic
Authors
Jingwei Xin,Nannan Wang,Xinrui Jiang,Jie Li,Xiaoyu Wang,Xinbo Gao
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Pages: 1-15
Identifier
DOI:10.1109/tnnls.2024.3438432
Abstract
Binary neural networks (BNNs) are an effective approach to reducing the memory usage and computational complexity of full-precision convolutional neural networks (CNNs) and have been widely used in deep learning. However, BNNs have different properties from real-valued models, making it difficult to draw on the experience of CNN design when developing BNNs. In this article, we study the application of binary networks to the single-image super-resolution (SISR) task, in which the network is trained to restore the original high-resolution (HR) images. In general, the distribution of features in an SISR network is more complex than in recognition models, since it must preserve abundant image information, e.g., texture, color, and details. To enhance the representation ability of BNNs, we explore a novel activation-rectified inference (ARI) module that achieves a more complete representation of features by combining observations from different quantization perspectives. The activations are divided into several parts with different quantization intervals and are inferred independently. This allows the binary activations to retain more image detail and yield finer inference. In addition, we propose an adaptive approximation estimator (AAE) that gradually learns the accurate gradient estimation interval in each layer to alleviate the optimization difficulty. Experiments conducted on several benchmarks show that our approach learns a binary SISR model with performance superior to state-of-the-art methods. The code will be released at https://github.com/jwxintt/Rectified-BSR.
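To make the binarization and gradient-estimation ideas in the abstract concrete, here is a minimal, hedged sketch of the standard machinery such methods build on: a sign-based forward binarization and a straight-through estimator (STE) whose pass-through interval could, in principle, be learned per layer. The function names and the fixed `interval` parameter are illustrative assumptions, not the paper's actual ARI/AAE implementation.

```python
def binarize(x: float) -> float:
    """Forward pass: map a real-valued activation to {-1, +1} via sign.

    This is the common BNN forward rule; the paper's ARI module goes
    further by splitting activations into several quantization intervals
    and inferring each part independently (not reproduced here).
    """
    return 1.0 if x >= 0.0 else -1.0


def ste_grad(x: float, interval: float = 1.0) -> float:
    """Backward pass: straight-through estimator.

    The gradient of sign() is zero almost everywhere, so training
    substitutes an identity gradient inside [-interval, +interval] and
    zero outside. Making `interval` a learnable per-layer quantity
    loosely mirrors the abstract's idea of an adaptive gradient
    estimation interval (a hypothetical simplification here).
    """
    return 1.0 if -interval <= x <= interval else 0.0
```

For example, `binarize(0.3)` gives `1.0` while `ste_grad(2.0, interval=1.0)` gives `0.0`, i.e., activations far outside the clipping interval receive no gradient, which is exactly the optimization difficulty an adaptive interval is meant to ease.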