Computer Science
Pruning
Convolutional Neural Network
Artificial Intelligence
Computation
Inference
Pattern Recognition (Psychology)
Algorithm
Agronomy
Biology
Authors
Yiheng Lu,Ziyu Guan,Wei Zhao,Maoguo Gong,Wenxiao Wang,Kai Sheng
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2023-09-13
Volume/Issue: 11 (4): 6972-6991
Identifier
DOI:10.1109/jiot.2023.3314820
Abstract
Convolutional neural networks (CNNs) are used extensively across the Internet of Things (IoT), for example in mobile phones, surveillance, and satellites. However, deploying CNNs is difficult because the structure of hand-designed networks is complicated. We therefore propose a sensitiveness-based network pruning framework (SNPF) that reduces the size of the original network to save computation resources. SNPF evaluates the importance of each convolutional layer by how well inference accuracy recovers when extra noise is added to the original model, and then removes filters according to each layer's degree of sensitiveness. Compared with previous weight-norm-based pruning methods such as "l1-norm", "BatchNorm-Pruning", and "Taylor-Pruning", SNPF is robust to parameter updates, which avoids inconsistent filter evaluations when the parameters of the pre-trained model are not fully optimized. In other words, SNPF can prune the network at an early training stage to save computation resources. We test our method on three prevalent models (VGG-16, ResNet-18, and ResNet-50) and a customized Conv-4 with four convolutional layers, evaluated on CIFAR-10, CIFAR-100, ImageNet, and MNIST, respectively. Impressively, even when VGG-16 is trained for only 50 epochs, we obtain the same layer-importance evaluation as when the model is fully trained. We also achieve pruning results comparable to those of previous weight-oriented methods on the other three models.
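The layer-sensitivity idea described in the abstract can be sketched as follows. This is a toy illustration, not the authors' SNPF implementation: the two-layer random network, the Gaussian noise scale, and the trial count are all arbitrary stand-ins, and a real pipeline would score actual convolutional layers of a partially trained CNN on a validation set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data and a toy 2-layer network
# (a stand-in for a CNN's convolutional layers).
X = rng.normal(size=(200, 8))
y = (X @ rng.normal(size=(8,)) > 0).astype(int)
W1 = rng.normal(size=(8, 16)) * 0.5
W2 = rng.normal(size=(16, 2)) * 0.5

def accuracy(W1, W2):
    """Forward pass (ReLU MLP) and classification accuracy on X, y."""
    hidden = np.maximum(X @ W1, 0)
    logits = hidden @ W2
    return (logits.argmax(axis=1) == y).mean()

def layer_sensitivity(layers, noise_std=0.5, trials=20):
    """Score each layer by the mean accuracy drop when only that
    layer's weights are perturbed with Gaussian noise."""
    base = accuracy(*layers)
    scores = []
    for i in range(len(layers)):
        drops = []
        for _ in range(trials):
            noisy = list(layers)
            noisy[i] = layers[i] + rng.normal(scale=noise_std,
                                              size=layers[i].shape)
            drops.append(base - accuracy(*noisy))
        scores.append(float(np.mean(drops)))
    return scores

scores = layer_sensitivity((W1, W2))
print("per-layer sensitivity:", scores)
```

A full pruning pipeline along these lines would then assign higher pruning ratios (i.e., remove more filters) to the layers with the lowest sensitivity scores, since their accuracy is least affected by perturbation.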