Overfitting
Dropout (neural networks)
Computer science
Regularization (mathematics)
Artificial intelligence
Artificial neural network
Segmentation
Machine learning
Pattern recognition
Process (computing)
Deep neural network
Operating system
Authors
Karshiev Sanjar,Abdul Rehman,Anand Paul,Jeonghong Kim
Identifier
DOI:10.1109/icot51877.2020.9468799
Abstract
This paper introduces an enhanced neural network regularization method, called weight dropout, to prevent deep neural networks from overfitting. In the suggested method, a fully connected layer used jointly with weight dropout is a layer in which the weights between nodes are dropped randomly during training. To realize the desired regularization, we propose building blocks that combine our weight dropout mask with a CNN. For evaluation, the performance of the proposed method is compared with previous methods on image classification and segmentation tasks. The results show that the proposed method achieves strong accuracy on several datasets.
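The abstract describes dropping individual weights between nodes (rather than whole units, as in standard dropout) during training. The paper's exact implementation is not given here; the following is a minimal NumPy sketch of that general idea for a single fully connected layer, where the function name, arguments, and inverted rescaling by 1/(1 - drop_prob) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def weight_dropout_forward(x, W, b, drop_prob=0.5, training=True, rng=None):
    """Fully connected layer with weight dropout: during training, a random
    Bernoulli mask zeroes individual entries of the weight matrix W, so
    single connections (not entire units) are dropped.

    Note: this is an illustrative sketch of the weight-dropout idea, not the
    implementation from the paper.
    """
    if training:
        rng = rng or np.random.default_rng()
        # Keep each weight with probability (1 - drop_prob); rescale the
        # survivors so the expected pre-activation matches evaluation mode.
        mask = rng.random(W.shape) >= drop_prob
        W_eff = W * mask / (1.0 - drop_prob)
    else:
        W_eff = W  # at evaluation time the full weight matrix is used
    return x @ W_eff + b
```

With `drop_prob=0.0` the training-mode output coincides with the evaluation-mode output, which is a quick sanity check that the mask and rescaling are consistent.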