Hyperparameter
Computer science
Stochastic gradient descent
Noise (video)
Artificial neural network
Gradient descent
Stochastic resonance
Line (geometry)
Function (biology)
Artificial intelligence
Algorithm
Mathematical optimization
Image (mathematics)
Mathematics
Biology
Evolutionary biology
Geometry
Authors
Weijin Li, Yuanqiang Ren, Fabing Duan
Source
Journal: Chinese Physics B
[IOP Publishing]
Date: 2022-07-01
Volume/Issue: 31 (8): 080503-080503
Citations: 3
Identifier
DOI: 10.1088/1674-1056/ac5886
Abstract
To train feed-forward threshold neural networks built from non-differentiable activation functions, the noise-injection approach forms a stochastic-resonance-based threshold network that can be optimized by various gradient-based optimizers. Injecting noise extends the parameter space of the designed threshold network to include the noise level, but it also makes the optimization landscape of the loss function highly non-convex, so the on-line learning procedure with respect to network weights and noise levels becomes challenging. It is shown that the Adam optimizer, an adaptive variant of stochastic gradient descent, is particularly effective at training the stochastic-resonance-based threshold network. Experimental results demonstrate significant performance improvements of the designed threshold network trained with Adam on function approximation and image classification tasks.
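As a rough illustration of the idea (not the authors' implementation), the PyTorch sketch below assumes the common form of this technique: the hard threshold sign(x), averaged over injected zero-mean Gaussian noise of level σ, becomes the smooth function erf(x/(√2·σ)), which is differentiable in both the input and σ, so Adam can update the network weights and the noise level jointly. The module name NoisyThreshold and all settings (layer sizes, learning rate, the sin(x) toy target) are hypothetical.

```python
import torch
import torch.nn as nn

class NoisyThreshold(nn.Module):
    """Stochastic-resonance style activation (assumed form): the hard
    threshold sign(x) is replaced by its expectation under injected
    Gaussian noise, E[sign(x + n)] = erf(x / (sqrt(2) * sigma)),
    making the layer differentiable in x and in the noise level sigma."""
    def __init__(self, init_log_sigma=0.0):
        super().__init__()
        # Learn log(sigma) so the noise level stays positive.
        self.log_sigma = nn.Parameter(torch.tensor(init_log_sigma))

    def forward(self, x):
        sigma = torch.exp(self.log_sigma)
        return torch.erf(x / (sigma * 2.0 ** 0.5))

# Hypothetical two-layer threshold network for 1-D function approximation.
model = nn.Sequential(
    nn.Linear(1, 32), NoisyThreshold(),
    nn.Linear(32, 1),
)
# Adam updates the weights and the noise level in one parameter set.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy regression target: approximate sin(x) on [-pi, pi].
x = torch.linspace(-3.14, 3.14, 256).unsqueeze(1)
y = torch.sin(x)
for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```

Because the noise level enters the loss only through the smooth expectation, the non-convexity the abstract mentions shows up here as a loss surface jointly shaped by the weights and log_sigma, which is the setting where an adaptive optimizer like Adam is reported to help.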