Keywords
Computer science
Initialization
Residual neural network
Convolutional neural network
Artificial intelligence
Pattern recognition
Stability (learning theory)
Facial recognition system
Activation function
Training set
Scaling factor
Deep learning
Artificial neural network
Machine learning
Algorithm
Mathematics
Authors
Shuping Peng, Hongbo Huang, Weijun Chen, Liang Zhang, Weiwei Fang
Identifier
DOI: 10.1016/j.neucom.2020.05.022
Abstract
In recent years, applications of face recognition have increased significantly. Despite the successful application of deep convolutional neural networks (DCNNs), training such networks remains a challenging task that requires considerable experience and careful tuning. Based on the Inception-ResNet network, we propose a novel method that mitigates the difficulty of training such deep convolutional neural networks and improves their performance at the same time. The residual scaling factor used in the Inception-ResNet module is a manually set fixed value. We argue that making this value a trainable parameter and initializing it to a small value improves the stability of model training. We further adopt the small trick of replacing the ReLU activation function with Leaky ReLU and PReLU. The proposed model slightly increases the number of trainable parameters but improves training stability and performance significantly. Extensive experiments are conducted on the VGGFace2, MS1MV2, IJBB and LFW datasets. The results show that the proposed trainable residual scaling factor (TRSF) and PReLU notably improve accuracy while stabilizing the training process.
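The following is a minimal sketch, in PyTorch, of the two ideas the abstract describes: a trainable residual scaling factor (TRSF) initialized to a small value, and PReLU in place of ReLU. This is not the authors' code; the block structure, the module name `TRSFResidualBlock`, and the initial value 0.1 are illustrative assumptions (Inception-ResNet's actual residual branch is a multi-branch Inception block).

```python
# Illustrative sketch of TRSF + PReLU in a residual block; not the authors'
# implementation. The inner layers and init_scale=0.1 are assumptions.
import torch
import torch.nn as nn

class TRSFResidualBlock(nn.Module):
    def __init__(self, channels, init_scale=0.1):
        super().__init__()
        # Stand-in for an Inception-ResNet residual branch.
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.PReLU(channels),  # PReLU instead of ReLU, per the abstract
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # TRSF: a learnable scalar replacing Inception-ResNet's fixed,
        # hand-set scaling constant; initialized small for training stability.
        self.scale = nn.Parameter(torch.tensor(init_scale))
        self.act = nn.PReLU(channels)

    def forward(self, x):
        # The residual branch output is scaled by the trainable factor
        # before being added back to the identity path.
        return self.act(x + self.scale * self.branch(x))

# Usage example
block = TRSFResidualBlock(64)
y = block(torch.randn(2, 64, 32, 32))
print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Because `self.scale` is a `nn.Parameter`, the network can learn how strongly each residual branch contributes, rather than fixing that weight by hand as in the original Inception-ResNet.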