Authors
Guannan Li, Xiaowei Zhao, Cheng Fan, Xi Fang, Fan Li, Yubei Wu
Abstract
Given the need for timely and reliable management of power distribution systems and smart grids, it is of great significance to develop quick and accurate short-term building energy prediction models. Currently, a deep learning method, the long short-term memory network (LSTM), is widely used for short-term building energy prediction. To further enhance prediction accuracy and reduce computational cost, previous studies have investigated improved LSTM models with modified structures, such as LSTM-Attention and LSTM-CNN. However, there is a lack of systematic assessment of these LSTM-based building energy forecast models that considers influencing factors such as model parameter tuning, modelling data volume, building type, and climate features. Furthermore, there is a lack of research on combining LSTM with both Attention and convolutional neural network (CNN) modifications. To address these research gaps, comparative evaluations of a pure LSTM and five improved LSTM models (i.e., LSTM-CNN, CNN-LSTM, LSTM-Attention, CNN-Attention-LSTM, and LSTM-Attention-CNN) were performed in this study. These models were validated using the open-source datasets from the Building Data Genome Project 2. Comparative studies were conducted on 60 randomly selected buildings from four different climate zones covering six different building types; evaluations were performed using either one-year or two-year energy consumption data. Further, the prediction performance of these models after parameter tuning was assessed in terms of prediction accuracy and computational time. The results demonstrated that, after parameter optimisation, the LSTM models exhibited root mean square errors (RMSEs) reduced by 6.2%–29.2%. When only one-year data were used for modelling, CNN-LSTM decreased the average RMSE of LSTM by as much as 2.9%. When two-year data were used for modelling, LSTM-Attention exhibited more stable prediction performance than the other models and decreased the average RMSE of LSTM by up to 5.6%.
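To make the compared architectures concrete, the following is a minimal sketch (not the authors' exact configurations) of three of the variants discussed in the abstract, written with the TensorFlow/Keras API. The input window length, feature count, layer sizes, and the use of dot-product self-attention over LSTM hidden states are illustrative assumptions, not values taken from the study.

```python
# Hedged sketch: pure LSTM, CNN-LSTM, and LSTM-Attention variants for
# short-term building energy prediction. Assumed shapes/hyperparameters only.
import tensorflow as tf
from tensorflow.keras import layers, models

TIMESTEPS, FEATURES = 24, 1  # assumption: 24-step input window, single energy channel


def build_lstm():
    # Pure LSTM baseline: sequence in, one-step-ahead energy value out
    return models.Sequential([
        layers.Input(shape=(TIMESTEPS, FEATURES)),
        layers.LSTM(64),
        layers.Dense(1),
    ])


def build_cnn_lstm():
    # CNN-LSTM: a 1-D convolution extracts local temporal patterns before the LSTM
    return models.Sequential([
        layers.Input(shape=(TIMESTEPS, FEATURES)),
        layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(64),
        layers.Dense(1),
    ])


def build_lstm_attention():
    # LSTM-Attention: dot-product self-attention re-weights the LSTM hidden states
    inp = layers.Input(shape=(TIMESTEPS, FEATURES))
    h = layers.LSTM(64, return_sequences=True)(inp)
    a = layers.Attention()([h, h])             # attention over all time steps
    z = layers.GlobalAveragePooling1D()(a)     # pool attended states to one vector
    out = layers.Dense(1)(z)
    return models.Model(inp, out)


model = build_lstm_attention()
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
model.summary()
```

Under this kind of setup, comparing the variants amounts to training each builder's model on the same windowed data and reporting RMSE and training time, which mirrors the accuracy/computational-cost comparison described in the abstract.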