Concepts
Convolution (computer science), Redundancy (engineering), Convolutional neural network, Computer science, Separable space, Computation, Feature (linguistics), Pattern recognition (psychology), Artificial intelligence, Algorithm, Theoretical computer science, Artificial neural network, Mathematics, Mathematical analysis, Linguistics, Philosophy, Operating system
Authors
Yangyang Zhu, Luofeng Xie, Zhenzhen Xie, Min Yin, Guofu Yin
Identifier
DOI:10.1016/j.patcog.2023.109589
Abstract
Because of limited computation resources, convolutional neural networks (CNNs) are difficult to deploy on mobile devices. To overcome this issue, many methods have successively reduced the number of parameters in CNNs by removing redundancy among feature maps. We observe similarity between feature maps at the same layer, but not complete consistency. Intuitively, the difference between similar feature maps is an essential ingredient for the success of CNNs. Therefore, we propose a flexible and separable convolution (FSConv) that, from a different perspective, embraces redundancy while requiring less computation, and implicitly clusters feature maps without introducing similarity measurements. Our proposed model extracts intrinsic information from the representative part of each cluster through ordinary convolution and reveals tiny hidden details in the redundant part through groupwise/depthwise convolution. Experimental results demonstrate that FSConv-equipped networks consistently outperform previous state-of-the-art CNN compression algorithms. Code is available at https://github.com/Clarkxielf/FSConv-Flexible-and-Separable-Convolution-for-Convolutional-Neural-Networks-Compression.
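The split described in the abstract can be sketched as follows. This is a minimal illustrative PyTorch module, not the authors' exact FSConv: the channel split ratio, the pointwise projection after the depthwise branch, and all layer names are assumptions made for clarity. It shows the general pattern of applying an expensive ordinary convolution to a "representative" channel group and a cheap depthwise convolution to the remaining "redundant" group.

```python
import torch
import torch.nn as nn


class SplitConvSketch(nn.Module):
    """Illustrative sketch (not the paper's exact FSConv): split input
    channels into a representative part (ordinary convolution) and a
    redundant part (depthwise convolution + 1x1 projection)."""

    def __init__(self, in_ch: int, out_ch: int, ratio: float = 0.5, k: int = 3):
        super().__init__()
        # Hypothetical fixed split ratio; the paper's clustering is implicit.
        self.rep_in = max(1, int(in_ch * ratio))
        self.red_in = in_ch - self.rep_in
        rep_out = max(1, int(out_ch * ratio))
        red_out = out_ch - rep_out
        # Ordinary convolution extracts intrinsic information
        # from the representative channels.
        self.rep_conv = nn.Conv2d(self.rep_in, rep_out, k, padding=k // 2)
        # Depthwise convolution cheaply refines the redundant channels...
        self.red_conv = nn.Conv2d(self.red_in, self.red_in, k,
                                  padding=k // 2, groups=self.red_in)
        # ...followed by a 1x1 projection to reach the target channel count.
        self.red_proj = nn.Conv2d(self.red_in, red_out, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x_rep, x_red = torch.split(x, [self.rep_in, self.red_in], dim=1)
        y_rep = self.rep_conv(x_rep)
        y_red = self.red_proj(self.red_conv(x_red))
        return torch.cat([y_rep, y_red], dim=1)


def n_params(m: nn.Module) -> int:
    return sum(p.numel() for p in m.parameters())


split = SplitConvSketch(16, 32)
dense = nn.Conv2d(16, 32, 3, padding=1)
y = split(torch.randn(1, 16, 8, 8))
print(tuple(y.shape))                 # (1, 32, 8, 8)
print(n_params(split), n_params(dense))  # the split variant uses far fewer parameters
```

Even in this toy configuration the split module needs roughly a third of the parameters of the dense 3×3 convolution, which is the kind of saving the abstract's "less computation" claim refers to.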