Authors
Yuxiu Hua, Zhifeng Zhao, Rongpeng Li, Xianfu Chen, Zhiming Liu, Honggang Zhang
Source
Journal: IEEE Communications Magazine [Institute of Electrical and Electronics Engineers]
Date: 2019-03-08
Volume/Issue: 57 (6): 114-119
Citations: 442
Identifier
DOI: 10.1109/mcom.2019.1800155
Abstract
Time series prediction can be generalized as a process that extracts useful information from historical records and then determines future values. Learning the long-range dependencies embedded in time series is an obstacle for most algorithms, whereas LSTM solutions, a specific kind of scheme in deep learning, promise to effectively overcome the problem. In this article, we first give a brief introduction to the structure and forward-propagation mechanism of LSTM. Then, aiming to reduce the considerable computing cost of LSTM, we put forward an RCLSTM model by introducing stochastic connectivity into conventional LSTM neurons. The resulting RCLSTM exhibits a certain level of sparsity and thus a lower computational complexity. In the field of telecommunication networks, the prediction of traffic and user mobility could directly benefit from this improvement: using a realistic dataset, we show that RCLSTM achieves prediction performance comparable to LSTM while requiring considerably less computing time. We argue that RCLSTM is better suited than LSTM to latency-stringent or power-constrained application scenarios.
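The core idea described in the abstract — applying stochastic (randomly sparse) connectivity to the gates of a conventional LSTM cell — can be sketched as follows. This is a minimal illustrative NumPy implementation, not the authors' code: each gate's weight matrix is element-wise masked by a fixed random binary matrix so that only a fraction `density` of the connections remains active; the class and parameter names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RCLSTMCell:
    """Sketch of a randomly connected LSTM cell (assumption: sparsity is
    realized by fixed random binary masks on the gate weight matrices)."""

    def __init__(self, input_size, hidden_size, density=0.5, seed=0):
        rng = np.random.default_rng(seed)
        n, m = hidden_size, input_size + hidden_size
        # One weight matrix, mask, and bias per gate:
        # i = input, f = forget, o = output, c = candidate cell update.
        self.W = {g: rng.standard_normal((n, m)) * 0.1 for g in "ifoc"}
        self.mask = {g: (rng.random((n, m)) < density).astype(float) for g in "ifoc"}
        self.b = {g: np.zeros(n) for g in "ifoc"}

    def step(self, x, h, c):
        """One forward-propagation step on input x with state (h, c)."""
        z = np.concatenate([x, h])
        # Masking zeroes out most connections, so each gate computes a
        # sparse affine map followed by its usual nonlinearity.
        i = sigmoid((self.W["i"] * self.mask["i"]) @ z + self.b["i"])
        f = sigmoid((self.W["f"] * self.mask["f"]) @ z + self.b["f"])
        o = sigmoid((self.W["o"] * self.mask["o"]) @ z + self.b["o"])
        g = np.tanh((self.W["c"] * self.mask["c"]) @ z + self.b["c"])
        c_new = f * c + i * g          # standard LSTM cell-state update
        h_new = o * np.tanh(c_new)     # standard LSTM hidden-state update
        return h_new, c_new
```

In a sparse-matrix implementation, the dropped connections need not be multiplied at all, which is the source of the computing-time savings the abstract reports; in this dense sketch the mask only demonstrates the connectivity pattern.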