Keywords: Activation function, Artificial neural network, Convolutional neural network, Recurrent neural network, Gradient descent, Deep learning, Artificial intelligence, Computer science
Authors
Ameya D. Jagtap,Yeonjong Shin,Kenji Kawaguchi,George Em Karniadakis
Identifier
DOI:10.1016/j.neucom.2021.10.036
Abstract
We propose a new type of neural network, the Kronecker neural network (KNN), which forms a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which provides an efficient way of constructing a very wide network while keeping the number of parameters low. Our theoretical analysis reveals that, under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks; this is also verified empirically through a set of computational examples. Furthermore, under certain technical assumptions, we establish global convergence of gradient descent for KNNs. As a specific case, we propose the Rowdy activation function, designed to remove any saturation region by injecting sinusoidal fluctuations with trainable parameters. The proposed Rowdy activation function can be employed in any neural network architecture, such as feed-forward, recurrent, or convolutional neural networks. The effectiveness of KNNs with Rowdy activations is demonstrated through various computational experiments, including function approximation using feed-forward neural networks, solution inference of partial differential equations using physics-informed neural networks, and standard deep learning benchmark problems using convolutional and fully-connected neural networks.
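The idea behind the Rowdy activation described above can be illustrated with a minimal sketch: a standard base activation (tanh here) augmented by trainable sinusoidal terms whose amplitudes keep the derivative nonzero even where the base function saturates. The function name, the amplitude list `a`, and the frequency factor `omega` are hypothetical illustration choices, not the paper's exact parameterization.

```python
import numpy as np

def rowdy_tanh(x, a, omega=1.0):
    """Rowdy-style activation sketch (assumed form, not the paper's exact one):
    tanh(x) plus a sum of trainable sinusoidal perturbations a_k * sin(k*omega*x).
    The sine terms break up the flat saturation regions of tanh, so gradients
    do not vanish there."""
    out = np.tanh(x)
    for k, a_k in enumerate(a, start=1):  # a_k would be trainable in practice
        out = out + a_k * np.sin(k * omega * x)
    return out

# For large |x|, tanh is nearly flat, but the sinusoidal terms still vary,
# which is the saturation-removal effect the abstract refers to.
x = np.linspace(-5.0, 5.0, 11)
y = rowdy_tanh(x, a=[0.1, 0.05])
```

With all amplitudes set to zero the function reduces exactly to the base tanh, which is why such a parameterization can be dropped into an existing architecture without changing its initial behavior.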