FVW: Finding Valuable Weight on Deep Neural Network for Model Pruning

Computer science, Deep learning, Pruning, Artificial intelligence, Artificial neural network, Machine learning, Process (computing), Software deployment, Consistency (knowledge base), Deep neural network, Inference, Software engineering, Agronomy, Biology, Operating system
Authors
Zhiyu Zhu,Huaming Chen,Zhibo Jin,Xinyi Wang,J. Z. Zhang,Minhui Xue,Qinghua Lu,Jun Shen,Kim‐Kwang Raymond Choo
Identifier
DOI:10.1145/3583780.3614889
Abstract

The rapid development of deep learning has demonstrated its potential for deployment in many intelligent service systems. However, some issues such as optimisation (e.g., how to reduce deployment resource costs and further improve detection speed), especially in scenarios where limited resources are available, remain challenging to address. In this paper, we aim to delve into the principles of deep neural networks, focusing on the importance of network neurons. The goal is to identify the neurons that exert minimal impact on model performance, thereby aiding the process of model pruning. In this work, we have thoroughly considered the deep learning model pruning process both with and without a fine-tuning step, ensuring consistency of model performance. To achieve our objectives, we propose a methodology that employs adversarial attack methods to explore deep neural network parameters. This approach is combined with an innovative attribution algorithm to analyse the level of involvement of network neurons. In our experiments, our approach can effectively quantify the importance of network neurons. We extend the evaluation through comprehensive experiments conducted on a range of datasets, including CIFAR-10, CIFAR-100 and Caltech101. The results demonstrate that our method consistently achieves state-of-the-art performance over many existing methods. We anticipate that this work will help to reduce the heavy training and inference costs of deep neural network models, making lightweight deep-learning-enhanced services and systems possible. The source code is available at https://github.com/LMBTough/FVW.
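The abstract describes scoring neurons by their contribution to model performance and pruning those with minimal impact. The FVW algorithm itself is not detailed here; as a minimal illustration of the general idea of attribution-based importance scoring for pruning, the sketch below uses a first-order (gradient × weight) Taylor proxy on a toy linear model. All function names and the toy setup are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def importance_scores(weights, grads):
    """First-order Taylor proxy: |w * dL/dw| estimates how much the
    loss would change if the weight were set to zero."""
    return np.abs(weights * grads)

def prune_by_importance(weights, grads, sparsity):
    """Zero out the fraction `sparsity` of weights with the lowest scores."""
    scores = importance_scores(weights, grads)
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # k-th smallest score becomes the pruning threshold
    threshold = np.partition(scores.ravel(), k - 1)[k - 1]
    mask = scores > threshold
    return weights * mask

# Toy example: a single linear layer y = W x with squared-error loss.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)
target = rng.normal(size=4)
y = W @ x
grad_W = np.outer(2 * (y - target), x)   # dL/dW for L = ||Wx - t||^2

pruned = prune_by_importance(W, grad_W, sparsity=0.5)
print(int((pruned == 0).sum()))  # 16 of 32 weights removed
```

In practice (and in the paper), pruning is followed by an optional fine-tuning step to recover any lost accuracy; a plain magnitude or first-order criterion like this one serves only as a baseline against which attribution-based methods are compared.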