Transformer
Electricity
Reliability engineering
Electrical engineering
Engineering
Automotive engineering
Computer science
Environmental science
Voltage
Authors
Stavros Sykiotis, Maria Kaselimi, Anastasios Doulamis, Nikolaos Doulamis
Source
Journal: Sensors
[MDPI AG]
Date: 2022-04-11
Volume/Issue: 22 (8): 2926
Citations: 38
Abstract
Non-Intrusive Load Monitoring (NILM) describes the process of inferring the consumption patterns of individual appliances from the aggregated household signal alone. Sequence-to-sequence deep learning models are firmly established as state-of-the-art approaches for NILM, identifying the power consumption signature of each appliance within the aggregated power signal. To overcome the limitations of the recurrent models widely used in sequential modeling, this paper proposes a transformer-based architecture for NILM. Our approach, called ELECTRIcity, uses transformer layers to accurately estimate the power signal of domestic appliances, relying entirely on attention mechanisms to extract global dependencies between the aggregate signal and the individual appliance signals. A further advantage of the proposed model is that ELECTRIcity works with minimal dataset pre-processing and without requiring data balancing. Furthermore, ELECTRIcity introduces a more efficient training routine than traditional transformer-based architectures: model training is split into unsupervised pre-training and downstream-task fine-tuning, which improves predictive accuracy while reducing training time. Experimental results demonstrate ELECTRIcity's superiority over several state-of-the-art methods.
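The core mechanism the abstract attributes to ELECTRIcity is attention that lets every time step of the aggregate signal attend to every other step. A minimal NumPy sketch of scaled dot-product self-attention over a power-signal window is shown below; the window length `T`, embedding size `d`, and random projection matrices are illustrative assumptions, not the paper's actual architecture or parameters.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a power-signal window.

    x: (T, d) embedded aggregate samples; Wq/Wk/Wv: (d, d) projections.
    Every output step attends to every input step, which is how a
    transformer captures global dependencies in the aggregate signal.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(x.shape[1])          # (T, T) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over time steps
    return weights @ v                              # (T, d) context-mixed features

rng = np.random.default_rng(0)
T, d = 64, 16                      # hypothetical window length / embedding size
x = rng.standard_normal((T, d))    # stand-in for embedded aggregate power samples
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)                   # (64, 16): one feature vector per time step
```

In the full model, stacks of such layers (with learned projections and multiple heads) would map the aggregate window to per-appliance power estimates; this sketch only illustrates the attention computation itself.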