Computer science
Sigmoid function
Activation function
Convergence (economics)
Artificial neural network
Pascal (unit)
Generalization
Nonlinear system
Artificial intelligence
Machine learning
Algorithm
Mathematics
Physics
Mathematical analysis
Quantum mechanics
Economics
Economic growth
Programming language
Authors
Haigen Hu,Aizhu Liu,Guan Qin,Hanwang Qian,Xiaoxin Li,Shengyong Chen,Qianwei Zhou
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-09-01
Volume/issue: 34 (9): 6096-6107
Citations: 6
Identifier
DOI:10.1109/tnnls.2021.3133263
Abstract
Activation functions play a crucial role in enhancing the nonlinearity of neural networks and increasing their ability to map inputs to response variables, allowing them to model more complex relationships and patterns in the data. In this work, a novel methodology is proposed to adaptively customize activation functions by adding only a few parameters to traditional activation functions such as Sigmoid, Tanh, and the rectified linear unit (ReLU). To verify the effectiveness of the proposed methodology, theoretical and experimental analyses of its ability to accelerate convergence and improve performance are presented, and a series of experiments are conducted on various network models (such as AlexNet, VggNet, GoogLeNet, ResNet, and DenseNet) and various datasets (such as CIFAR10, CIFAR100, miniImageNet, PASCAL VOC, and COCO). To further verify its validity and suitability across optimization strategies and usage scenarios, comparison experiments are also carried out among different optimization strategies (such as SGD, Momentum, AdaGrad, AdaDelta, and ADAM) and different recognition tasks such as classification and detection. The results show that the proposed methodology is very simple yet delivers significant gains in convergence speed, precision, and generalization, and it surpasses popular methods such as ReLU and adaptive functions such as Swish in almost all experiments in terms of overall performance.
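The abstract describes adding a small number of trainable parameters to standard activations (Sigmoid, Tanh, ReLU) so that their shape can be adapted during training alongside the network weights. Below is a minimal sketch of that general idea in PyTorch; the parameter names (alpha, beta) and the specific parameterization beta * sigmoid(alpha * x) are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch of an adaptively parameterized activation.
# The form f(x) = beta * sigmoid(alpha * x) is assumed for illustration;
# it is NOT the exact parameterization proposed in the paper.
import torch
import torch.nn as nn

class AdaptiveSigmoid(nn.Module):
    """Sigmoid with two trainable shape parameters (slope and output scale)."""

    def __init__(self, init_alpha: float = 1.0, init_beta: float = 1.0):
        super().__init__()
        # Only two extra parameters per activation layer, learned jointly
        # with the rest of the network by the usual optimizer (SGD, Adam, ...).
        self.alpha = nn.Parameter(torch.tensor(init_alpha))  # controls slope
        self.beta = nn.Parameter(torch.tensor(init_beta))    # controls output scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.beta * torch.sigmoid(self.alpha * x)

# Usage: drop the module in wherever a fixed activation would otherwise go.
layer = nn.Sequential(nn.Linear(128, 64), AdaptiveSigmoid())
out = layer(torch.randn(8, 128))
```

Because the extra parameters are ordinary module parameters, any of the optimization strategies mentioned in the abstract (SGD, Momentum, AdaGrad, AdaDelta, ADAM) can update them with no special handling.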