Computer science
Artificial intelligence
Machine learning
Generalization
Artificial neural network
Time series
Component (thermodynamics)
Recurrent neural network
Short-term memory
Deep learning
Series (stratigraphy)
Sequence (biology)
Mathematical analysis
Paleontology
Physics
Mathematics
Biology
Thermodynamics
Genetics
Authors
Hemant Yadav, Amit Thakkar
Identifier
DOI:10.1016/j.eswa.2023.122333
Abstract
The application of machine learning and deep learning techniques to time series forecasting has gained significant attention in recent years. Numerous efforts have been devoted to automating forecasting with state-of-the-art neural networks, and the Long Short-Term Memory (LSTM) recurrent neural network has emerged as a central component of most of this research. Although the LSTM was originally introduced in 1997 for sequence modeling, subsequent updates have primarily targeted language learning tasks. These updates introduced several computational mechanisms within the LSTM cell, including the forget gate, input gate, and output gate. In this study, we investigate each computational component in isolation to analyze its effect on time series forecasting tasks. Our experiments use the Jena weather dataset and the Appliance Energy Usage time series for evaluation. The experimental results reveal that variations of the LSTM model outperform the most popular LSTM cell format in terms of error rate and training time. Specifically, the variations identified in this paper demonstrate superior generalization capabilities and yield lower forecasting errors.
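For context, the forget, input, and output gates named in the abstract are usually defined by the standard LSTM cell equations sketched below. This is the common textbook formulation, not notation taken from the paper itself; the variants studied in the paper presumably remove or modify individual terms of this cell.

% Standard LSTM cell (textbook formulation; notation is an assumption, not the paper's)
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)} \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell state)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell state update)} \\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state)}
\end{aligned}

Here $\sigma$ is the logistic sigmoid and $\odot$ denotes element-wise multiplication; an ablation of one gate corresponds to dropping or fixing the associated equation.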