Residual
Computer science
Convolution (computer science)
Artificial intelligence
Deep learning
Artificial neural network
Gradient descent
Recurrent neural network
Algorithm
Block (permutation group theory)
Pattern recognition (psychology)
Mathematics
Geometry
Authors
Ziyue Jia,Linfeng Yang,Zhenrong Zhang,Hui Liu,Fannie Kong
Identifiers
DOI: 10.1016/j.ijepes.2021.106837
Abstract
Non-Intrusive Load Monitoring (NILM), or energy disaggregation, seeks to save energy by decomposing the power readings of individual appliances from the aggregate power reading of a whole house. It is regarded as a single-channel blind source separation problem: extracting sources from a mixed signal. Recent studies have shown that deep learning is widely applied to the NILM problem. Theoretically, the ability of any neural network to extract load features is closely related to its depth. However, a deep neural network is difficult to train because of exploding gradients, vanishing gradients, and network degradation. Therefore, a Bi-TCN residual block, inspired by the temporal convolutional network (TCN), is applied to solve these problems. Causal dilated convolution is replaced by bidirectional (non-causal) dilated convolution to enlarge the receptive field of the network and improve model performance. Two forms of residual connections are introduced into the deep model: one is designed to ease the training of deep models, and the other pursues a performance boost by combining load features extracted at different hierarchical levels into the final prediction. We propose a sequence-to-point learning method based on bidirectional dilated convolution for NILM on low-frequency data, called BitcnNILM. We compare our method with existing algorithms on low-frequency data from the REDD and UK-DALE datasets. Experiments show the superiority of our BitcnNILM in both load disaggregation and load on/off identification.
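The two ideas the abstract highlights, replacing causal dilated convolution with a bidirectional (non-causal) one and wrapping it in a residual connection, can be illustrated with a minimal NumPy sketch. This is a hypothetical single-channel toy, not the authors' BitcnNILM implementation: the kernel taps are spaced `dilation` apart and centered on each output step, so context is drawn from both past and future samples, and the residual skip adds the input back to the activated convolution output.

```python
import numpy as np

def bidirectional_dilated_conv1d(x, w, dilation):
    """Non-causal dilated 1-D convolution with 'same' output length.

    x : (T,) input sequence, w : (K,) kernel.
    Symmetric zero-padding centers the kernel on each time step, so the
    output at t depends on both earlier and later samples (unlike the
    causal convolution in a standard TCN, which only sees the past).
    """
    K = len(w)
    pad = dilation * (K - 1) // 2      # symmetric padding -> non-causal
    xp = np.pad(x, pad)
    return np.array([
        sum(w[k] * xp[t + k * dilation] for k in range(K))
        for t in range(len(x))
    ])

def residual_block(x, w, dilation):
    """Residual connection: ReLU(conv(x)) + x, easing training of deep stacks."""
    return np.maximum(bidirectional_dilated_conv1d(x, w, dilation), 0.0) + x
```

Stacking such blocks with dilations 1, 2, 4, … grows the receptive field exponentially with depth while the skip connections keep gradients flowing, which is the motivation the abstract gives for the Bi-TCN residual block.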