Function (biology)
Sigmoid function
Activation function
Computer science
Parametric statistics
Artificial intelligence
Object (grammar)
Artificial neural network
Machine learning
Mathematics
Evolutionary biology
Biology
Statistics
Authors
Marina Adriana Mercioni,Ştefan Holban
Source
Journal: International Symposium on Electronics and Telecommunications
Date: 2020-11-05
Citations: 13
Identifier
DOI:10.1109/isetc50328.2020.9301059
Abstract
In order to improve the performance of a deep neural network, the activation function is an important aspect that must be researched continuously, which is why we have expanded our research in this direction. We introduce a novel P-Swish (Parametric Swish) activation function, which brings performance improvements on object classification tasks using datasets such as CIFAR-10 and CIFAR-100, as well as on datasets for Natural Language Processing (NLP). To test it, we used several architectures, including LeNet-5, Network in Network (NiN), and ResNet34, and compared our proposals against popular activation functions such as sigmoid, ReLU, and Swish. In particular, the P-Swish function facilitates fast network training, which makes it suitable for the Transfer Learning technique.
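The abstract describes a parametric variant of the Swish activation. As a rough illustration only, the sketch below implements the commonly cited Swish family, x · sigmoid(p · x), with p as a tunable parameter; the paper's exact P-Swish formulation is not given here and may differ.

```python
import numpy as np

def sigmoid(x):
    # Numerically plain logistic function.
    return 1.0 / (1.0 + np.exp(-x))

def p_swish(x, p=1.0):
    # Swish-style activation: x * sigmoid(p * x).
    # p = 1 recovers standard Swish (SiLU); in a parametric
    # variant p would be learned during training.
    # NOTE: this is an assumed form, not the paper's exact definition.
    return x * sigmoid(p * x)

x = np.array([-2.0, 0.0, 2.0])
print(p_swish(x, p=1.0))  # smooth, non-monotonic near zero, ~x for large x
```

For large positive inputs the function approaches the identity, while negative inputs are softly suppressed, which is the behavior usually credited for Swish-like activations training faster than sigmoid in deep networks.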