A Novel Hybrid Deep Learning Approach for Non-Intrusive Load Monitoring
of Residential Appliance Based on Long Short Term Memory and Convolutional
Neural Networks
- URL: http://arxiv.org/abs/2104.07809v1
- Date: Thu, 15 Apr 2021 22:34:20 GMT
- Title: A Novel Hybrid Deep Learning Approach for Non-Intrusive Load Monitoring
of Residential Appliance Based on Long Short Term Memory and Convolutional
Neural Networks
- Authors: Sobhan Naderian
- Abstract summary: Energy disaggregation, or non-intrusive load monitoring (NILM), is a single-input blind source discrimination problem.
This article presents a new approach to power disaggregation that uses a deep recurrent long short-term memory (LSTM) network combined with convolutional neural networks (CNN).
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Energy disaggregation, or non-intrusive load monitoring (NILM), is a
single-input blind source discrimination problem that aims to decompose a user's
mains electricity consumption into appliance-level measurements. This article
presents a new approach to power disaggregation that uses a deep recurrent long
short-term memory (LSTM) network combined with convolutional neural networks
(CNN). Deep neural networks have been shown to be well suited to these types of
problems because of their capacity and large number of trainable parameters. The
hybrid method proposed in this article can significantly increase the overall
accuracy of NILM because it benefits from the advantages of both networks. The
proposed method uses sequence-to-sequence learning, where the input is a window
of the mains signal and the output is a window of the target appliance's
consumption. The proposed deep neural network approach has been applied to the
real-world household energy dataset REFIT. The REFIT electrical load
measurements dataset described in this paper includes whole-house aggregate
loads and nine individual appliance measurements at 8-second intervals per
house, collected continuously over a period of two years from 20 houses around
the UK. The proposed method achieves significant performance, improving the
accuracy and F1-score measures by 95.93% and 80.93%, respectively, which
demonstrates the effectiveness and superiority of the proposed approach for home
energy monitoring. A comparison between the proposed method and other recently
published methods is presented and discussed in terms of accuracy, the number of
appliances considered, and the number of trainable parameters of the deep neural
network. The proposed method shows remarkable performance compared to previous
methods.
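No implementation details beyond "CNN combined with LSTM, trained sequence-to-sequence on mains windows" are given in this summary, so the following is only a minimal sketch of that idea. The window length, layer sizes, and use of the Keras API are illustrative assumptions, not the paper's actual configuration.

# Minimal sketch of a hybrid CNN + LSTM sequence-to-sequence NILM model.
# The window length, filter counts and unit sizes are illustrative
# assumptions, not the configuration used in the paper.
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 599  # assumed length of the mains input/output window (samples)

def build_cnn_lstm_seq2seq(window=WINDOW):
    """Mains window in -> appliance window of the same length out."""
    inp = layers.Input(shape=(window, 1))             # aggregate mains power
    x = layers.Conv1D(30, 10, padding="same", activation="relu")(inp)
    x = layers.Conv1D(30, 8, padding="same", activation="relu")(x)
    x = layers.LSTM(64, return_sequences=True)(x)     # temporal context
    x = layers.LSTM(128, return_sequences=True)(x)
    out = layers.TimeDistributed(layers.Dense(1))(x)  # per-sample appliance power
    return models.Model(inp, out)

def sliding_windows(series, window=WINDOW, stride=10):
    """Cut a power series (mains or appliance) into overlapping windows."""
    idx = np.arange(0, len(series) - window + 1, stride)
    return np.stack([series[i:i + window] for i in idx])[..., None]

model = build_cnn_lstm_seq2seq()
model.compile(optimizer="adam", loss="mse")
mains = np.random.rand(20_000).astype("float32")      # placeholder aggregate signal
appliance = np.random.rand(20_000).astype("float32")  # placeholder target appliance
model.fit(sliding_windows(mains), sliding_windows(appliance), epochs=1, batch_size=64)

At inference time, predictions from overlapping windows are typically averaged to obtain a single disaggregated trace for the target appliance.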
Related papers
- Low-Frequency Load Identification using CNN-BiLSTM Attention Mechanism [0.0]
Non-intrusive Load Monitoring (NILM) is an established technique for effective and cost-efficient electricity consumption management.
This paper presents a hybrid learning approach consisting of a convolutional neural network (CNN) and a bidirectional long short-term memory (BiLSTM) network.
The CNN-BiLSTM model is adept at extracting both temporal (time-related) and spatial (location-related) features, allowing it to precisely identify energy consumption patterns at the appliance level (a minimal sketch of such a stack is given after this entry).
arXiv Detail & Related papers (2023-11-14T21:02:27Z)
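The entry above names the building blocks but not their arrangement; the following is a minimal sketch of one plausible CNN-BiLSTM stack with a self-attention layer, assuming Keras and arbitrary layer sizes rather than the authors' actual architecture.

# Illustrative CNN + BiLSTM + attention stack for low-frequency NILM.
# The window length, filter counts, unit sizes and single-head attention
# are assumptions for illustration only.
from tensorflow.keras import layers, models

def build_cnn_bilstm_attention(window=480):
    inp = layers.Input(shape=(window, 1))                                 # aggregate power window
    x = layers.Conv1D(32, 5, padding="same", activation="relu")(inp)      # local ("spatial") features
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)   # temporal context, both directions
    x = layers.MultiHeadAttention(num_heads=1, key_dim=64)(x, x)          # self-attention over time steps
    x = layers.GlobalAveragePooling1D()(x)
    return models.Model(inp, layers.Dense(1)(x))                          # e.g. appliance power estimate

model = build_cnn_bilstm_attention()
model.compile(optimizer="adam", loss="mse")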
- Properties and Potential Applications of Random Functional-Linked Types of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structures.
This paper gives some insights into the properties of RFLNNs from the viewpoint of the frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z)
- Evolutionary Deep Nets for Non-Intrusive Load Monitoring [5.415995239349699]
Non-Intrusive Load Monitoring (NILM) is an energy-efficiency technique to track the electricity consumption of individual appliances in a household from a single aggregated signal.
Deep learning approaches are implemented to perform the disaggregation.
arXiv Detail & Related papers (2023-03-06T22:47:40Z)
- Deep Learning-Based Synchronization for Uplink NB-IoT [72.86843435313048]
We propose a neural network (NN)-based algorithm for device detection and time-of-arrival (ToA) estimation for the narrowband physical random-access channel (NPRACH) of narrowband internet of things (NB-IoT).
The introduced NN architecture leverages residual convolutional networks as well as knowledge of the preamble structure of the 5G New Radio (5G NR) specifications (a generic residual block is sketched after this entry).
arXiv Detail & Related papers (2022-05-22T12:16:43Z)
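The entry mentions residual convolutional networks applied to the received NPRACH signal but gives no architecture details; below is a generic residual 1-D convolution block, with filter counts, kernel sizes and the I/Q input shape as assumptions rather than the paper's actual network.

# Generic residual 1-D convolution block of the kind commonly used for
# sequence feature extraction (e.g. over a received-signal window).
# Filter counts and kernel sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters=64, kernel_size=3):
    """Returns relu(x + F(x)); projects the shortcut if channel counts differ."""
    shortcut = x
    y = layers.Conv1D(filters, kernel_size, padding="same", activation="relu")(x)
    y = layers.Conv1D(filters, kernel_size, padding="same")(y)
    if shortcut.shape[-1] != filters:
        shortcut = layers.Conv1D(filters, 1, padding="same")(shortcut)
    return layers.Activation("relu")(layers.Add()([y, shortcut]))

# Example: stack a few blocks on a 256-sample, 2-channel (I/Q) input.
inp = layers.Input(shape=(256, 2))
h = residual_block(residual_block(inp))
out = layers.Dense(1)(layers.GlobalAveragePooling1D()(h))  # e.g. a ToA regression head
model = tf.keras.Model(inp, out)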
- Large-Scale Sequential Learning for Recommender and Engineering Systems [91.3755431537592]
In this thesis, we focus on the design of automatic algorithms that provide personalized ranking by adapting to the current conditions.
For the former, we propose a novel algorithm called SAROS that takes into account both kinds of feedback for learning over the sequence of interactions.
The proposed idea of taking neighbouring lines into account shows statistically significant results in comparison with the initial approach for fault detection in the power grid.
arXiv Detail & Related papers (2022-05-13T21:09:41Z)
- Appliance Level Short-term Load Forecasting via Recurrent Neural Network [6.351541960369854]
We present an STLF algorithm for efficiently predicting the power consumption of individual electrical appliances.
The proposed method builds upon a powerful recurrent neural network (RNN) architecture in deep learning.
arXiv Detail & Related papers (2021-11-23T16:56:37Z)
- Smart non-intrusive appliance identification using a novel local power histogramming descriptor with an improved k-nearest neighbors classifier [2.389598109913753]
This paper proposes a smart NILM system based on a novel local power histogramming (LPH) descriptor.
Specifically, short local histograms are drawn to represent individual appliance consumption signatures.
An improved k-nearest neighbors (IKNN) algorithm is presented to reduce the learning time and improve the classification performance (a plain histogram-plus-kNN baseline is sketched after this entry).
arXiv Detail & Related papers (2021-02-09T13:12:20Z)
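Only the general idea is described above (short local power histograms as appliance signatures, classified with an improved kNN variant). The following baseline uses a plain histogram descriptor with a standard scikit-learn kNN; the bin count, power range, segment lengths and k are assumptions, and the paper's improved kNN is not reproduced.

# Baseline sketch of histogram-descriptor appliance identification:
# a short histogram of a power-draw segment serves as the feature vector
# and is classified with a standard k-nearest-neighbours model.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def local_power_histogram(segment, bins=20, p_max=3000.0):
    """Descriptor: normalised histogram of instantaneous power (watts) in the segment."""
    hist, _ = np.histogram(segment, bins=bins, range=(0.0, p_max))
    return hist / max(hist.sum(), 1)

# Toy signatures: two appliances with different power levels.
rng = np.random.default_rng(0)
kettle = [rng.normal(2000, 50, 60) for _ in range(20)]   # ~2 kW bursts
fridge = [rng.normal(120, 10, 60) for _ in range(20)]    # ~120 W cycles
X = np.array([local_power_histogram(s) for s in kettle + fridge])
y = np.array([0] * 20 + [1] * 20)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([local_power_histogram(rng.normal(2000, 50, 60))]))  # -> [0] (kettle)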
- Sequence-to-Sequence Load Disaggregation Using Multi-Scale Residual Neural Network [4.094944573107066]
Non-Intrusive Load Monitoring (NILM) has received increasing attention as a cost-effective way to monitor electricity usage.
Deep neural networks have shown great potential in the field of load disaggregation.
arXiv Detail & Related papers (2020-09-25T17:41:28Z)
- Continual Learning in Recurrent Neural Networks [67.05499844830231]
We evaluate the effectiveness of continual learning methods for processing sequential data with recurrent neural networks (RNNs).
We shed light on the particularities that arise when applying weight-importance methods, such as elastic weight consolidation, to RNNs.
We show that the performance of weight-importance methods is not directly affected by the length of the processed sequences, but rather by high working-memory requirements (a minimal sketch of the elastic weight consolidation penalty follows this entry).
arXiv Detail & Related papers (2020-06-22T10:05:12Z)
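Elastic weight consolidation (EWC) is the weight-importance method named above; the following is a minimal sketch of its quadratic penalty in TensorFlow, with the regularization strength and the Fisher-information estimate treated as given (an illustration, not the authors' code).

# Sketch of the elastic weight consolidation (EWC) penalty: after training
# on task A, store a copy of the weights and a per-weight importance estimate
# (the diagonal Fisher information), then penalise drift from them while
# training on task B. lambda_ewc is an assumed hyperparameter.
import tensorflow as tf

def ewc_penalty(model, old_weights, fisher, lambda_ewc=100.0):
    """0.5 * lambda * sum_i F_i * (theta_i - theta*_i)^2 over trainable weights."""
    penalty = 0.0
    for w, w_star, f in zip(model.trainable_variables, old_weights, fisher):
        penalty += tf.reduce_sum(f * tf.square(w - w_star))
    return 0.5 * lambda_ewc * penalty

# During task-B training the loss becomes:
#   total_loss = task_loss + ewc_penalty(model, old_weights, fisher)
# where `fisher` holds the squared gradients of the task-A log-likelihood,
# averaged over task-A data, and `old_weights` are the task-A weights.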
- Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks [50.42141893913188]
We study a distributed algorithm for large-scale AUC maximization with a deep neural network.
Our method requires far fewer communication rounds in theory.
Our experiments on several datasets show the effectiveness of our method and also confirm our theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z)
- MSE-Optimal Neural Network Initialization via Layer Fusion [68.72356718879428]
Deep neural networks achieve state-of-the-art performance for a range of classification and inference tasks.
The use of gradient-based optimization combined with nonconvexity renders learning susceptible to initialization-dependent problems.
We propose fusing neighboring layers of deeper networks that are initialized at random.
arXiv Detail & Related papers (2020-01-28T18:25:15Z)