Dropout (neural networks)
Regularization
Pattern recognition (psychology)
Overfitting
Convolution (computer science)
Artificial neural network
Deep neural network
MNIST database
Discriminative model
Authors
Haibing Wu,Xiaodong Gu
Identifier
DOI:10.1007/978-3-319-26532-2_6
Abstract
Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking an activation based on a multinomial distribution at training time. In light of this insight, we advocate employing our proposed probabilistic weighted pooling, instead of commonly used max-pooling, to act as model averaging at test time. Empirical evidence validates the superiority of probabilistic weighted pooling. We also compare max-pooling dropout and stochastic pooling, both of which introduce stochasticity based on multinomial distributions at the pooling stage.
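The equivalence stated in the abstract can be sketched concretely. In a single pooling window whose units are each retained with probability p, taking the max over the retained units selects the i-th largest activation exactly when all larger ones are dropped and that one is kept, i.e. with multinomial probability p·(1−p)^(i−1); probabilistic weighted pooling then replaces sampling with the expectation at test time. The NumPy snippet below is a minimal illustration of that view, not the authors' implementation; the function names and the assumption of non-negative (e.g. post-ReLU) activations in a 1-D window are mine.

```python
import numpy as np

def max_pool_dropout_train(region, p=0.5, rng=None):
    """Training-time max-pooling dropout on one pooling window.

    region : 1-D array of non-negative activations (assumed post-ReLU).
    p      : retain probability for each unit.
    Returns the max over retained units, or 0.0 if all are dropped.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(region.shape) < p  # keep each unit with prob p
    kept = region[mask]
    return float(kept.max()) if kept.size else 0.0

def prob_weighted_pool_test(region, p=0.5):
    """Test-time probabilistic weighted pooling (model averaging).

    Sorting activations in descending order, the i-th largest is the
    sampled output with probability p*(1-p)**(i-1) (0-indexed below),
    and the all-dropped event, probability (1-p)**k, contributes 0.
    The pooled output is the expectation under this multinomial.
    """
    a = np.sort(region)[::-1]                # descending activations
    probs = p * (1 - p) ** np.arange(a.size)  # selection probabilities
    return float(np.dot(probs, a))
```

For example, with window activations [3, 1, 2] and p = 0.5, the sorted values [3, 2, 1] are weighted by [0.5, 0.25, 0.125], giving a pooled output of 2.125, while the training-time function returns one of {0, 1, 2, 3} at random.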