Topics
Distillation; Computer science; Noise reduction; Artificial intelligence; Convolutional neural network; Image (mathematics); Image denoising; Image compression; Artificial neural network; Pattern recognition (psychology); Machine learning; Image processing; Organic chemistry; Chemistry
Authors
Wenshu Chen, Liyuan Peng, Yujie Huang, Minge Jing, Xiaoyang Zeng
Identifier
DOI:10.1109/asicon52560.2021.9620364
Abstract
In recent years, algorithms based on convolutional neural networks (CNNs) have shown great advantages in image denoising. However, existing state-of-the-art (SOTA) algorithms are too computationally complex to be deployed on embedded devices, such as mobile devices. Knowledge distillation is an effective model compression method, but research on knowledge distillation has focused mainly on high-level vision tasks, such as image classification, with little attention to low-level vision tasks, such as image denoising. To address these problems, we propose a novel knowledge distillation method for U-Net-based image denoising algorithms. The experimental results show that the performance of the compressed model is comparable to that of the original model under quadruple compression.
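The core idea the abstract describes, distilling a large denoiser into a smaller one, can be sketched as follows. This is a minimal illustrative example, not the paper's method: the tiny plain-conv networks stand in for the actual U-Net teacher and student, and the weighted loss combining the clean target with the teacher's output is one common form of a distillation objective, assumed here for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a denoising network; the paper uses U-Net-based
# models, but a small conv stack keeps the sketch self-contained.
class Denoiser(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

teacher = Denoiser(channels=32).eval()  # assumed pretrained in practice
student = Denoiser(channels=8)          # compressed model with fewer channels

noisy = torch.randn(2, 1, 16, 16)       # toy noisy batch
clean = torch.zeros_like(noisy)         # toy ground-truth clean images

with torch.no_grad():
    t_out = teacher(noisy)              # teacher's denoised prediction
s_out = student(noisy)                  # student's denoised prediction

# Assumed distillation objective: supervise the student with both the clean
# target and the teacher's output, mixed by a weight alpha.
alpha = 0.5
loss = alpha * nn.functional.mse_loss(s_out, clean) \
     + (1 - alpha) * nn.functional.mse_loss(s_out, t_out)
loss.backward()                         # gradients flow only into the student
```

In a real training loop this loss would be minimized over a denoising dataset; the names `Denoiser`, `alpha`, and the toy tensors are all illustrative assumptions.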