Keywords: Computer Science; Artificial Intelligence; Pattern Recognition (psychology); Residual; Convolutional Neural Network; Field (mathematics); Fuse (electrical); Gating; Residual Neural Network; Feature (linguistics); Algorithm; Mathematics; Philosophy; Engineering; Electrical Engineering; Biology; Physiology; Pure Mathematics; Linguistics
Authors
Jun Miao, Shaowu Xu, Baixian Zou, Yuanhua Qiao
Identifier
DOI:10.1007/s11042-021-10802-6
Abstract
CNN (Convolutional Neural Network) is a hot topic in the field of pattern recognition, especially in image recognition, and ResNet (Residual Network) is a special kind of CNN. Compared with the general CNN structure, ResNet introduces a residual unit with an identity mapping. The identity mapping allows deep layers to directly learn the data received by shallow layers, which reduces the difficulty of network convergence to a certain extent. As a result, ResNet has better learning ability and has achieved good performance in various kinds of image recognition tasks. The essence of the residual network is to fuse two types of features from different receptive fields and to use the fused features, rather than the output features of the previous layer, as the learning object. However, the original ResNet implements feature fusion by adding the two features with equal weights, which ignores the fact that features from different levels may not contribute equally to the learning of the network. In this paper, we introduce a feature-inspired gating strategy into the residual unit of ResNet, which allows the network to assign different weights to different features, so that feature fusion is transformed from equal-weight addition into a weighted summation with different weights. Through experiments, we show that ResNet with the proposed gating strategy achieves higher recognition accuracy than the original ResNet.
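The abstract does not specify the exact form of the gate, so the following is only a minimal NumPy sketch of one plausible feature-dependent gating scheme: a sigmoid gate computed from the identity-path features replaces the equal-weight addition `y = x + F(x)` with a weighted summation `y = g * F(x) + (1 - g) * x`. The function and parameter names (`gated_residual_fusion`, `w_gate`, `b_gate`) and the sigmoid form are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_residual_fusion(x, residual, w_gate, b_gate):
    """Fuse identity and residual features with a feature-dependent gate.

    Original ResNet fusion:  y = x + F(x)                (equal weights)
    Gated fusion (sketch):   y = g * F(x) + (1 - g) * x  (weighted summation)
    where g = sigmoid(x @ w_gate + b_gate) is computed from the features,
    so the network can weight the two feature sources differently.
    """
    g = sigmoid(x @ w_gate + b_gate)   # per-feature gate values in (0, 1)
    return g * residual + (1.0 - g) * x

# Toy example: 2 samples, 4 features each (hypothetical shapes).
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))          # shallow-layer features (identity path)
residual = rng.standard_normal((2, 4))   # F(x): output of the residual branch
w_gate = rng.standard_normal((4, 4)) * 0.1
b_gate = np.zeros(4)

y = gated_residual_fusion(x, residual, w_gate, b_gate)
print(y.shape)  # (2, 4)
```

A sanity check on the design: with `w_gate` and `b_gate` set to zero the gate is 0.5 everywhere, so the fusion reduces to averaging the two feature sources, and the original equal-weight behavior is recovered up to a constant scale.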