Pruning
Computer Science
Image Compression
Compression (Physics)
Image (Mathematics)
Data Compression
Artificial Intelligence
Computer Vision
Image Processing
Horticulture
Materials Science
Composite Materials
Biology
Authors
A-Li Luo,Houjun Sun,Jinming Liu,Fangzheng Lin,Jiro Katto
Identifier
DOI:10.1109/vcip59821.2023.10402683
Abstract
Learned Image Compression (LIC), which uses neural networks to compress images, has grown rapidly in recent years. Hyperprior-module-based LIC models have surpassed classical codecs in performance. However, LIC models are too heavy, in both computation and parameter count, to deploy on edge devices. To address this, prior work has applied structural pruning to LIC models, but these approaches either cause a noticeable performance drop or neglect the appropriate pruning threshold for each LIC model, leaving their pruning results sub-optimal. This paper proposes Pruning Threshold Searching on the hyperprior module for LIC models of different qualities. Our method removes most parameters and calculations while matching the performance of the models before pruning. We remove at least 49.8% of parameters and 28.5% of calculations for the Channel-Wise-Context-Model-based models, and 29.1% of parameters for the Cheng-2020 models.
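The paper itself does not spell out the search procedure in this abstract, but the general idea of structural pruning with a per-model threshold search can be sketched as follows. This is a minimal illustration under assumed simplifications: channels are scored by L1 norm of their convolution filters, a toy `eval_fn` stands in for the model's rate-distortion evaluation, and the search simply picks the largest candidate threshold whose pruned model stays within a tolerance of the unpruned baseline. None of these names or choices come from the paper.

```python
import numpy as np

def channel_l1_norms(weight):
    # weight: (out_channels, in_channels, kH, kW) conv tensor;
    # score each output channel by the L1 norm of its filter.
    return np.abs(weight).reshape(weight.shape[0], -1).sum(axis=1)

def prune_mask(weight, threshold):
    # Keep only channels whose score reaches the threshold.
    return channel_l1_norms(weight) >= threshold

def search_threshold(weight, eval_fn, baseline, tol, candidates):
    # Try thresholds from small to large; remember the largest one
    # (i.e. the most aggressive pruning) whose evaluated quality
    # stays within `tol` of the unpruned baseline.
    best = 0.0
    for t in sorted(candidates):
        mask = prune_mask(weight, t)
        if mask.sum() == 0:          # refuse to prune everything
            break
        if baseline - eval_fn(mask) <= tol:
            best = t
    return best

# Toy setup: 4 output channels with L1 norms 0.4, 1.0, 2.0, 3.0,
# and a stand-in metric that is just the fraction of kept channels.
weight = np.zeros((4, 1, 1, 1))
weight[:, 0, 0, 0] = [0.4, 1.0, 2.0, 3.0]
eval_fn = lambda mask: mask.mean()
best = search_threshold(weight, eval_fn, baseline=1.0, tol=0.3,
                        candidates=[0.5, 1.5, 2.5])
```

With this toy metric, only the threshold 0.5 (which drops one of four channels) stays within the tolerance, so the search returns 0.5. In the paper's setting the evaluation would instead measure rate-distortion performance of the pruned hyperprior module, and the threshold is searched separately for each quality level of the LIC model.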