Simulated annealing
Computer science
Artificial neural network
Mathematical optimization
Heuristic
Pruning
Edge device
Backpropagation
Computational complexity theory
Algorithm
Artificial intelligence
Mathematics
Agronomy
Biology
Authors
Chun Lin Kuo, Ercan E. Kuruoğlu, Wai Kin Chan
Source
Journal: Entropy (MDPI AG)
Date: 2022-02-28
Volume/Issue: 24(3): 348
Citations: 9
Abstract
A critical problem in large neural networks is over-parameterization: the large number of weight parameters limits their use on edge devices due to prohibitive computational power and memory/storage requirements. To make neural networks practical on edge devices and in real-time industrial applications, they need to be compressed in advance. Since edge devices cannot train or access trained networks when internet resources are scarce, preloading smaller networks is essential. Various works in the literature have shown that redundant branches can be pruned strategically from a fully connected network without sacrificing performance significantly. However, the majority of these methodologies require high computational resources, because they integrate weight training via the back-propagation algorithm during network compression. In this work, we draw attention to optimizing the network structure so that performance is preserved despite aggressive pruning. The structure optimization is performed using the simulated annealing algorithm alone, without back-propagation for branch weight training. As a heuristic, non-convex optimization method, simulated annealing provides a near-globally optimal solution to this NP-hard problem for a given branch-pruning percentage. Our simulation results show that simulated annealing can significantly reduce the complexity of a fully connected network while maintaining its performance, without the help of back-propagation.
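To make the procedure described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' code) of simulated annealing over a binary pruning mask at a fixed sparsity level. The toy linear layer, mean-squared-error loss, swap-based neighbor move, and geometric cooling schedule are all illustrative assumptions, not the paper's exact setup; the key property it does share with the paper is that the mask is searched without any back-propagation or weight retraining.

```python
# A minimal sketch of simulated-annealing-based pruning: search a binary
# mask over fixed weights, never retraining them. Toy setup is assumed.
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected layer with fixed random weights (never retrained).
n_in, n_out, n_samples = 16, 4, 200
W = rng.normal(size=(n_in, n_out))
X = rng.normal(size=(n_samples, n_in))
y = X @ W + 0.1 * rng.normal(size=(n_samples, n_out))  # targets from the dense net

def loss(mask):
    """Mean squared error of the pruned network; no back-propagation used."""
    pred = X @ (W * mask)
    return np.mean((pred - y) ** 2)

def random_mask(prune_frac):
    """Binary mask that prunes a fixed fraction of branches."""
    mask = np.ones(W.size)
    off = rng.choice(W.size, size=int(prune_frac * W.size), replace=False)
    mask[off] = 0.0
    return mask.reshape(W.shape)

def neighbor(mask):
    """Swap one kept branch with one pruned branch, preserving sparsity."""
    flat = mask.flatten()
    on = rng.choice(np.flatnonzero(flat == 1.0))
    off = rng.choice(np.flatnonzero(flat == 0.0))
    flat[on], flat[off] = 0.0, 1.0
    return flat.reshape(mask.shape)

def anneal(prune_frac=0.7, T=1.0, cooling=0.995, steps=5000):
    mask = random_mask(prune_frac)
    cost = loss(mask)
    best_mask, best_cost = mask, cost
    for _ in range(steps):
        cand = neighbor(mask)
        cand_cost = loss(cand)
        # Metropolis acceptance: always take improvements, and occasionally
        # accept worse masks to escape local minima of this non-convex search.
        if cand_cost < cost or rng.random() < np.exp((cost - cand_cost) / T):
            mask, cost = cand, cand_cost
            if cost < best_cost:
                best_mask, best_cost = mask, cost
        T *= cooling  # geometric cooling schedule (an assumption)
    return best_mask, best_cost

best_mask, final_cost = anneal()
print(f"dense MSE: {loss(np.ones_like(W)):.4f}, 70%-pruned MSE: {final_cost:.4f}")
```

In this sketch the annealer only ever evaluates the pruned network's loss, so the per-step cost is a forward pass rather than a gradient computation, which is the efficiency argument the abstract makes against back-propagation-based pruning.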