Optimizing Recurrent Neural Networks: A Study on Gradient Normalization of Weights for Enhanced Training Efficiency

Keywords: Normalization, Gradient descent, Hyperparameter, Computer science, Recurrent neural network, Perplexity, Artificial intelligence, Artificial neural network, Gradient method, Stochastic gradient descent, Machine learning, Algorithm, Language model
Authors
Xinyi Wu, Bingjie Xiang, Huaizheng Lu, Chaopeng Li, Xingwang Huang, Weifang Huang
Source
Journal: Applied Sciences (Multidisciplinary Digital Publishing Institute)
Volume/Issue: 14(15): 6578. Cited by: 2
Identifier
DOI: 10.3390/app14156578
Abstract

Recurrent Neural Networks (RNNs) are classical models for processing sequential data, demonstrating excellent performance in tasks such as natural language processing and time series prediction. However, training RNNs often suffers from vanishing and exploding gradients, which significantly impair the model's performance and training efficiency. In this paper, we investigate why RNNs are more prone to gradient problems than other common sequential networks. To address this issue and enhance network performance, we propose a method for gradient normalization of network weights. This method suppresses gradient problems by altering the statistical properties of RNN weights, thereby improving training effectiveness. Additionally, we analyze the impact of weight gradient normalization on the probability-distribution characteristics of the model weights and validate the method's sensitivity to hyperparameters such as the learning rate. The experimental results demonstrate that gradient normalization enhances the stability of model training and reduces the frequency of gradient issues. On the Penn Treebank dataset, the method achieves a perplexity of 110.89, an 11.48% improvement over conventional gradient descent. For prediction lengths of 24 and 96 on the ETTm1 dataset, it attains Mean Absolute Error (MAE) values of 0.778 and 0.592, respectively, improvements of 3.00% and 6.77% over conventional gradient descent. Moreover, on selected subsets of the UCR dataset, accuracy increases by 0.4% to 6.0%. Gradient normalization enhances the ability of RNNs to learn from sequential and causal data, and thus holds significant implications for optimizing the training of RNN-based models.
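The abstract does not spell out how the weight-gradient normalization is computed, so the sketch below shows one plausible reading rather than the authors' exact method: after backpropagation through a small PyTorch RNN, each parameter's gradient is rescaled to unit L2 norm before the SGD update, preserving the update direction while preventing its magnitude from exploding or vanishing. The model, the readout layer, the toy data, and the learning rate are all illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy setup (assumed): an RNN with a linear readout trained on random data.
model = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
readout = nn.Linear(16, 1)
params = list(model.parameters()) + list(readout.parameters())
optimizer = torch.optim.SGD(params, lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(4, 20, 8)   # (batch, seq_len, features)
y = torch.randn(4, 1)

for step in range(100):
    optimizer.zero_grad()
    out, _ = model(x)                      # out: (batch, seq_len, hidden)
    loss = loss_fn(readout(out[:, -1]), y) # predict from the last time step
    loss.backward()

    # Hypothetical stand-in for the paper's weight-gradient normalization:
    # rescale each parameter's gradient to unit L2 norm; eps guards against
    # division by zero when a gradient has effectively vanished.
    eps = 1e-8
    for p in params:
        if p.grad is not None:
            p.grad.div_(p.grad.norm() + eps)

    optimizer.step()

Whether the normalization is applied per weight matrix, per layer, or globally, and how it interacts with the learning rate (the hyperparameter sensitivity the paper analyzes), would follow from the full text rather than this sketch.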
