Karshiev Sanjar, Abdul Rehman, Anand Paul, Jeonghong Kim
Identifier
DOI:10.1109/icot51877.2020.9468799
Abstract
This paper briefly introduces an enhanced neural network regularization method, so-called weight dropout, to prevent deep neural networks from overfitting. In the suggested method, a fully connected layer used jointly with weight dropout is a layer in which the weights between nodes are dropped randomly during training. To realize the desired regularization, we propose building blocks that combine our weight dropout mask with a CNN. For evaluation, the performance of the proposed method has been compared with previous methods on image classification and segmentation tasks. The results show that the proposed method achieves strong accuracy on several datasets.
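As a rough illustration of the mechanism the abstract describes, the sketch below masks individual weights of a fully connected layer at random during training. The module name, drop probability, and the inverted-dropout rescaling are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropoutLinear(nn.Module):
    # Minimal sketch of a weight-dropout (DropConnect-style) fully
    # connected layer: a fresh Bernoulli mask over the weight matrix
    # is sampled on every training forward pass. Hypothetical names
    # and defaults; the paper's actual building block may differ.
    def __init__(self, in_features, out_features, drop_prob=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.drop_prob = drop_prob

    def forward(self, x):
        if self.training:
            # Zero each weight independently with probability drop_prob.
            keep = (torch.rand_like(self.linear.weight) >= self.drop_prob).float()
            # Rescale kept weights (inverted-dropout convention; an
            # assumption, the paper may handle scaling differently).
            weight = self.linear.weight * keep / (1.0 - self.drop_prob)
            return F.linear(x, weight, self.linear.bias)
        # At evaluation time, all weights are kept and no mask is applied.
        return self.linear(x)

In use, such a layer would simply replace nn.Linear in the classifier head of a CNN; switching the model between model.train() and model.eval() toggles the random masking on and off.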