Computer science
Convolutional neural network
Artificial intelligence
Artificial neural network
Image (mathematics)
Knowledge transfer
Distillation
Deep learning
Machine learning
Computer vision
Knowledge management
Authors
Ziwen Li, Yuehuan Wang, Jinpu Zhang
Identifier
DOI: 10.1016/j.neucom.2022.10.083
Abstract
Low-light image enhancement aims to improve the quality of images captured under poor lighting conditions, a problem of real-world importance. Methods based on convolutional neural networks (CNNs) currently achieve state-of-the-art performance and have become the mainstream of research. However, most CNN-based methods gain performance by increasing the width and depth of the network, which demands substantial computing resources. In this paper, we propose a knowledge distillation method for low-light image enhancement. The method adopts a teacher-student framework in which the teacher network transfers its rich knowledge to the student network: the student learns image enhancement under the supervision of ground-truth images and, simultaneously, under the guidance of the teacher. Knowledge transfer between the two networks is accomplished by a distillation loss based on attention maps. We design a gradient-guided low-light image enhancement network that is divided into an enhancement branch and a gradient branch, where the enhancement branch is trained under the guidance of the gradient branch to better preserve structural information. The teacher and student networks share a similar structure but differ in model size; the teacher has more parameters and stronger learning capability than the student. With the help of knowledge distillation, our approach improves the performance of the student network without increasing the computational burden at test time. Qualitative and quantitative experimental results demonstrate the superiority of our method over state-of-the-art methods.
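As a hedged illustration of the attention-map distillation loss described in the abstract, the PyTorch sketch below collapses intermediate feature tensors into normalized spatial attention maps and penalizes the distance between matched teacher and student maps. The abstract does not give the exact formulation, so this uses a common choice (channel-wise mean of squared activations); the names `attention_map`, `distillation_loss`, `student`, `teacher`, and the weight `lam` are all hypothetical.

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    """Collapse a feature tensor (N, C, H, W) into a spatial attention
    map (N, H*W): mean of squared activations over channels, then
    L2-normalized per sample. One common attention definition; the
    paper's exact form is not specified in the abstract."""
    att = feat.pow(2).mean(dim=1)        # (N, H, W)
    att = att.flatten(start_dim=1)       # (N, H*W)
    return F.normalize(att, p=2, dim=1)  # unit norm per sample

def distillation_loss(student_feats, teacher_feats):
    """Sum of squared distances between student and teacher attention
    maps over matched intermediate layers. Student features are resized
    to the teacher's spatial resolution when the sizes differ."""
    loss = 0.0
    for s, t in zip(student_feats, teacher_feats):
        if s.shape[-2:] != t.shape[-2:]:
            s = F.interpolate(s, size=t.shape[-2:], mode="bilinear",
                              align_corners=False)
        loss = loss + (attention_map(s) - attention_map(t)).pow(2).mean()
    return loss

# Hypothetical training step: the student is supervised by the
# ground-truth image and guided by the frozen teacher, matching the
# two loss terms described in the abstract.
# enhanced, s_feats = student(low_light)   # student forward pass
# with torch.no_grad():
#     _, t_feats = teacher(low_light)      # teacher provides guidance only
# loss = F.l1_loss(enhanced, ground_truth) \
#        + lam * distillation_loss(s_feats, t_feats)
```

Because the distillation term touches only the training loss, the student's architecture and inference cost are unchanged, which is consistent with the abstract's claim that performance improves without extra computational burden at test time.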