Activation Functions
Authors
Garrett Bingham, Risto Miikkulainen
Identifier
DOI: 10.1016/j.neunet.2022.01.001
Abstract
Recent studies have shown that the choice of activation function can significantly affect the performance of deep learning networks. However, the benefits of novel activation functions have been inconsistent and task dependent, and therefore the rectified linear unit (ReLU) is still the most commonly used. This paper proposes a technique for customizing activation functions automatically, resulting in reliable improvements in performance. Evolutionary search is used to discover the general form of the function, and gradient descent to optimize its parameters for different parts of the network and over the learning process. Experiments with four different neural network architectures on the CIFAR-10 and CIFAR-100 image classification datasets show that this approach is effective. It discovers both general activation functions and specialized functions for different architectures, consistently improving accuracy over ReLU and other activation functions by significant margins. The approach can therefore be used as an automated optimization step in applying deep learning to new tasks.
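The abstract describes a two-level scheme: evolutionary search picks the functional form of the activation, and gradient descent then tunes its free parameters during training. The following is a minimal sketch of the second step only, under assumed simplifications: the form is fixed to a hypothetical parametric leaky-ReLU `f(x; a, b) = a*max(x, 0) + b*min(x, 0)` (not a function from the paper), and its two parameters are fit by plain gradient descent on a toy regression target rather than inside a deep network.

```python
import numpy as np

# Hypothetical parametric activation (illustration only, not the
# paper's discovered function): f(x; a, b) = a*max(x,0) + b*min(x,0).
# The paper evolves the functional form; here the form is fixed and
# only the parameters a, b are tuned by gradient descent.

def act(x, a, b):
    return a * np.maximum(x, 0.0) + b * np.minimum(x, 0.0)

def grad_params(x, upstream):
    # Gradients of f w.r.t. a and b, weighted by the upstream
    # gradient (chain rule): df/da = max(x,0), df/db = min(x,0).
    da = np.sum(upstream * np.maximum(x, 0.0))
    db = np.sum(upstream * np.minimum(x, 0.0))
    return da, db

# Toy objective: match a leaky-ReLU(slope 0.1) target on a grid,
# so the optimum is a = 1.0, b = 0.1.
x = np.linspace(-2.0, 2.0, 64)
target = np.where(x > 0, x, 0.1 * x)

a, b = 0.5, 0.5          # initial parameter values
lr = 0.01
for _ in range(500):
    y = act(x, a, b)
    upstream = 2.0 * (y - target) / x.size   # d(MSE)/dy
    da, db = grad_params(x, upstream)
    a -= lr * da
    b -= lr * db

print(round(a, 2), round(b, 2))  # converges toward a = 1.0, b = 0.1
```

In the paper's actual setup these parameters live inside a network and are updated by the same backpropagation pass as the weights, and the search additionally specializes them per layer and over the course of training; this sketch only shows why such parameters are cheap to optimize once the functional form is chosen.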