Authors
Guannan Li,Fan Li,Tanveer Ahmad,Jiangyan Liu,Li Tao,Xi Fang,Yubei Wu
Source
Journal: Energy
[Elsevier]
Date: 2022-11-01
Volume 259, Article No. 124915
Citations: 2
Identifier
DOI:10.1016/j.energy.2022.124915
Abstract
Traditional building energy prediction (BEP) methods usually solve time-series prediction problems with either a recursive or a direct strategy, which may ignore the time dependence between consecutive observations in building energy data. To overcome this issue, a sequence-to-sequence (Seq2seq) model combined with an attention mechanism (Seq2seq-Att) is developed for multi-step-ahead BEP. Compared with the original Seq2seq model, both parameter tuning and the attention mechanism have a large impact on BEP performance. To quantify the improvement contributed by each of these two aspects, this study comprehensively evaluates four Seq2seq variants (before and after parameter tuning, with and without attention). The sliding-window length is 24 h, and the prediction horizon ranges from 1 h to 12 h ahead; 36 buildings are selected from the open-source Building Data Genome Project 2. Results indicate that adding attention to Seq2seq together with parameter tuning increases multi-step-ahead prediction performance by 8% on average (around 6% from parameter tuning and about 2% from attention). For prediction horizons under 3 h, parameter tuning alone is a convenient way to improve the Seq2seq-based multi-step-ahead BEP model; for horizons over 3 h, adding attention to the tuned Seq2seq model is recommended. A minimal model sketch follows the highlights below.

Highlights:
• Evaluate Seq2seq and attention on 36 buildings for multi-step short-term energy prediction.
• Enhance Seq2seq multi-step prediction R² by 8% on average (attention 2%, parameter tuning 6%).
• Parameter tuning is enough to enhance Seq2seq multi-step prediction for horizons under 3 h ahead.
• Adding attention to Seq2seq after parameter tuning is recommended for horizons over 3 h ahead.
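The paper does not include code, so the following is only a minimal PyTorch sketch of the setup the abstract describes: a 24-h sliding input window, a 12-step decoder, and dot-product (Luong-style) attention over the encoder states. All layer sizes, the attention variant, and the names `make_windows` and `Seq2seqAtt` are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: Seq2seq with dot-product attention for multi-step-ahead
# building energy prediction. Assumes a univariate hourly load series,
# a 24-h input window, and a 12-h prediction horizon, as in the abstract.
import torch
import torch.nn as nn


def make_windows(series: torch.Tensor, window: int = 24, horizon: int = 12):
    """Slice a 1-D load series into (24-h input, 12-h target) pairs."""
    xs, ys = [], []
    for i in range(len(series) - window - horizon + 1):
        xs.append(series[i : i + window])
        ys.append(series[i + window : i + window + horizon])
    return torch.stack(xs).unsqueeze(-1), torch.stack(ys).unsqueeze(-1)


class Seq2seqAtt(nn.Module):
    def __init__(self, hidden: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.GRU(1, hidden, batch_first=True)
        self.decoder = nn.GRUCell(1, hidden)
        self.out = nn.Linear(2 * hidden, 1)  # decoder state + context vector

    def forward(self, x):                     # x: (batch, 24, 1)
        enc_out, h = self.encoder(x)          # enc_out: (batch, 24, hidden)
        h = h.squeeze(0)                      # (batch, hidden)
        y_prev = x[:, -1, :]                  # seed decoder with last observation
        preds = []
        for _ in range(self.horizon):
            h = self.decoder(y_prev, h)
            # Dot-product attention: score each of the 24 encoder states
            # against the current decoder state, then form a context vector.
            scores = torch.bmm(enc_out, h.unsqueeze(-1)).squeeze(-1)   # (batch, 24)
            weights = torch.softmax(scores, dim=1)
            context = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
            y_prev = self.out(torch.cat([h, context], dim=1))          # (batch, 1)
            preds.append(y_prev)
        return torch.stack(preds, dim=1)      # (batch, 12, 1)


if __name__ == "__main__":
    series = torch.randn(500)                 # stand-in for one building's hourly load
    x, y = make_windows(series)
    model = Seq2seqAtt()
    print(model(x).shape, y.shape)            # both (465, 12, 1)
```

Decoding step by step while attending over all 24 encoder states is what lets this kind of model exploit dependence across the whole input window, in contrast to the recursive and direct strategies the abstract contrasts it with.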