MSDC: Exploiting Multi-State Power Consumption in Non-intrusive Load
Monitoring based on A Dual-CNN Model
- URL: http://arxiv.org/abs/2302.05565v1
- Date: Sat, 11 Feb 2023 01:56:54 GMT
- Title: MSDC: Exploiting Multi-State Power Consumption in Non-intrusive Load
Monitoring based on A Dual-CNN Model
- Authors: Jialing He, Jiamou Liu, Zijian Zhang, Yang Chen, Yiwei Liu, Bakh
Khoussainov, and Liehuang Zhu
- Abstract summary: Non-intrusive load monitoring (NILM) aims to decompose an aggregated electrical usage signal into appliance-specific power consumption.
We design a new neural NILM model, Multi-State Dual CNN (MSDC).
MSDC explicitly extracts information about the appliance's multiple states and state transitions, which in turn regulates the prediction of signals for appliances.
- Score: 18.86649389838833
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Non-intrusive load monitoring (NILM) aims to decompose an aggregated
electrical usage signal into appliance-specific power consumption; it is a
classical example of a blind source separation task. Leveraging recent progress
in deep learning techniques, we design a new neural NILM model, Multi-State
Dual CNN (MSDC). Unlike previous models, MSDC explicitly extracts information
about an appliance's multiple states and state transitions, which in turn
regulates the prediction of the appliance's power signal. More specifically, we
employ a dual-CNN architecture: one CNN outputs the state distributions and the
other predicts the power of each state. A new technique that utilizes
conditional random fields (CRF) to capture state transitions is introduced.
Experiments on two real-world datasets, REDD and UK-DALE, demonstrate that our
model significantly outperforms state-of-the-art models while having good
generalization capacity, achieving 6%-10% MAE gains and 33%-51% SAE gains on
unseen appliances.
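To make the dual-CNN idea above concrete, here is a minimal PyTorch sketch, assuming a fixed-length window of aggregate power and a small number of discrete appliance states; the layer sizes, kernel widths, and the learnable transition-score matrix standing in for the paper's CRF component are illustrative assumptions, not the authors' exact architecture.
```python
import torch
import torch.nn as nn


class DualCNN(nn.Module):
    """Illustrative dual-CNN for NILM: one branch for state distributions,
    one branch for per-state power, plus a simplified transition score."""

    def __init__(self, num_states=3, hidden=64):
        super().__init__()
        # Shared 1-D convolutional feature extractor over the aggregate signal.
        self.backbone = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=9, padding=4), nn.ReLU(),
        )
        # Branch 1: per-time-step state logits (state CNN).
        self.state_head = nn.Conv1d(hidden, num_states, kernel_size=1)
        # Branch 2: per-time-step power prediction for each state (power CNN).
        self.power_head = nn.Conv1d(hidden, num_states, kernel_size=1)
        # Learnable state-transition scores, a stand-in for the CRF component.
        self.transitions = nn.Parameter(torch.zeros(num_states, num_states))

    def forward(self, x):
        # x: (batch, 1, T) aggregate power window
        h = self.backbone(x)
        state_logits = self.state_head(h)             # (batch, K, T)
        state_probs = state_logits.softmax(dim=1)
        per_state_power = self.power_head(h)          # (batch, K, T)
        # Appliance power = expected per-state power under the state distribution.
        power = (state_probs * per_state_power).sum(dim=1)   # (batch, T)
        return state_logits, power

    def transition_score(self, state_logits):
        # Expected transition score between consecutive time steps (simplified CRF term).
        p = state_logits.softmax(dim=1)               # (batch, K, T)
        return torch.einsum("bit,bjt,ij->b", p[:, :, :-1], p[:, :, 1:], self.transitions)
```
In this reading, the predicted power is the expectation of the per-state power under the state distribution, and a training loss could combine a regression term on the power output with a classification term on the state logits plus the transition score; the paper's actual loss may differ.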
Related papers
- Multistep Inverse Is Not All You Need [87.62730694973696]
In real-world control settings, the observation space is often unnecessarily high-dimensional and subject to time-correlated noise.
It is therefore desirable to learn an encoder to map the observation space to a simpler space of control-relevant variables.
We propose a new algorithm, ACDF, which combines multistep-inverse prediction with a latent forward model.
arXiv Detail & Related papers (2024-03-18T16:36:01Z)
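The ACDF summary above pairs a multistep-inverse objective with a latent forward model; the following PyTorch sketch shows one generic way those two losses can share an encoder. The observation/action dimensions, network sizes, and loss forms are assumptions for illustration, not ACDF's actual design.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class InverseForwardEncoder(nn.Module):
    """Shared encoder trained with a multistep-inverse head and a latent forward model."""

    def __init__(self, obs_dim, num_actions, latent=32):
        super().__init__()
        self.num_actions = num_actions
        self.encoder = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, latent))
        self.inverse_head = nn.Linear(2 * latent, num_actions)        # predicts a_t from (z_t, z_{t+k})
        self.forward_model = nn.Linear(latent + num_actions, latent)  # predicts z_{t+1} from (z_t, a_t)

    def losses(self, o_t, a_t, o_next, o_future):
        z_t, z_next, z_future = self.encoder(o_t), self.encoder(o_next), self.encoder(o_future)
        # Multistep-inverse loss: recover the first action from the two endpoint encodings.
        inv_loss = F.cross_entropy(self.inverse_head(torch.cat([z_t, z_future], dim=-1)), a_t)
        # Latent forward loss: predict the next latent state from (z_t, a_t).
        a_onehot = F.one_hot(a_t, self.num_actions).float()
        z_pred = self.forward_model(torch.cat([z_t, a_onehot], dim=-1))
        fwd_loss = F.mse_loss(z_pred, z_next.detach())
        return inv_loss, fwd_loss
```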
- Deep Convolutional Neural Networks for Short-Term Multi-Energy Demand Prediction of Integrated Energy Systems [0.0]
This paper develops six novel prediction models based on Convolutional Neural Networks (CNNs) for forecasting multi-energy power consumption.
The models are applied in a comprehensive manner to a novel integrated electrical, heat and gas network system.
arXiv Detail & Related papers (2023-12-24T14:56:23Z)
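The entry above develops six CNN forecasters for electricity, heat, and gas demand; as a neutral illustration of the general idea (not one of the paper's six architectures), a single 1-D CNN can map a history window of the three co-measured loads to their next-step values, as in this hypothetical sketch.
```python
import torch
import torch.nn as nn


class MultiEnergyCNN(nn.Module):
    """Illustrative 1-D CNN that forecasts next-step electricity, heat, and gas demand."""

    def __init__(self, in_channels=3, hidden=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(hidden, in_channels)   # one forecast per energy carrier

    def forward(self, x):
        # x: (batch, 3, history) -> (batch, 3) next-step demand
        return self.head(self.features(x).squeeze(-1))
```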
- When Parameter-efficient Tuning Meets General-purpose Vision-language Models [65.19127815275307]
PETAL revolutionizes the training process by requiring only 0.5% of the total parameters, achieved through a unique mode approximation technique.
Our experiments reveal that PETAL not only outperforms current state-of-the-art methods in most scenarios but also surpasses full fine-tuning models in effectiveness.
arXiv Detail & Related papers (2023-12-16T17:13:08Z)
- MATNilm: Multi-appliance-task Non-intrusive Load Monitoring with Limited Labeled Data [4.460954839118025]
Existing approaches mainly focus on developing an individual model for each appliance.
In this paper, we propose a multi-appliance-task framework with a training-efficient sample augmentation scheme.
The relative errors can be reduced by more than 50% on average.
arXiv Detail & Related papers (2023-07-27T11:14:11Z)
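MATNilm, summarized above, moves from one model per appliance to a multi-appliance-task setup. A common way to realize that structure is a shared encoder with one output head per appliance trained jointly, sketched below under assumed appliance names and layer sizes; the paper's sample-augmentation scheme is not shown.
```python
import torch
import torch.nn as nn


class MultiApplianceNILM(nn.Module):
    """Shared encoder with one regression head per appliance, trained jointly."""

    def __init__(self, appliances=("fridge", "kettle", "washing_machine"), hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3), nn.ReLU(),
        )
        self.heads = nn.ModuleDict({a: nn.Conv1d(hidden, 1, kernel_size=1) for a in appliances})

    def forward(self, x):
        # x: (batch, 1, T) aggregate signal -> {appliance: (batch, T) power estimate}
        h = self.encoder(x)
        return {name: head(h).squeeze(1) for name, head in self.heads.items()}
```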
- A Generative Approach for Production-Aware Industrial Network Traffic Modeling [70.46446906513677]
We investigate the network traffic data generated from a laser cutting machine deployed in a Trumpf factory in Germany.
We analyze the traffic statistics, capture the dependencies between the internal states of the machine, and model the network traffic as a production state dependent process.
We compare the performance of various generative models, including variational autoencoder (VAE), conditional variational autoencoder (CVAE), and generative adversarial network (GAN).
arXiv Detail & Related papers (2022-11-11T09:46:58Z)
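Among the generative models compared above, the conditional VAE most directly encodes the "production state dependent" view of the traffic. The sketch below is a minimal CVAE conditioned on a one-hot machine state; the feature and latent dimensions are arbitrary assumptions, not the paper's configuration.
```python
import torch
import torch.nn as nn


class ConditionalVAE(nn.Module):
    """Minimal CVAE: encode/decode a traffic-feature vector conditioned on a machine state."""

    def __init__(self, x_dim=16, state_dim=4, z_dim=8, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim + state_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim + state_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, x, state_onehot):
        h = self.enc(torch.cat([x, state_onehot], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
        x_hat = self.dec(torch.cat([z, state_onehot], dim=-1))
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return x_hat, kl
```
Training would minimize a reconstruction loss on x_hat plus the KL term; sampling traffic for a given production state then amounts to decoding z drawn from a standard normal, concatenated with that state.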
- Conv-NILM-Net, a causal and multi-appliance model for energy source separation [1.1355370218310157]
Non-Intrusive Load Monitoring seeks to save energy by estimating individual appliance power usage from a single aggregate measurement.
Deep neural networks have become increasingly popular in attempting to solve NILM problems.
We propose Conv-NILM-net, a fully convolutional framework for end-to-end NILM.
arXiv Detail & Related papers (2022-08-03T15:59:32Z)
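Conv-NILM-net, summarized above, is described as causal and fully convolutional; the defining building block of such models is a convolution that only looks at past samples. Below is a generic causal 1-D convolution layer, an illustration of the concept rather than the paper's actual architecture.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConv1d(nn.Module):
    """1-D convolution padded only on the left, so output at time t sees no future inputs."""

    def __init__(self, in_channels, out_channels, kernel_size=3, dilation=1):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size, dilation=dilation)

    def forward(self, x):
        # x: (batch, channels, T) -> (batch, out_channels, T)
        return self.conv(F.pad(x, (self.left_pad, 0)))
```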
- Value-Consistent Representation Learning for Data-Efficient Reinforcement Learning [105.70602423944148]
We propose a novel method, called value-consistent representation learning (VCR), to learn representations that are directly related to decision-making.
Instead of aligning a model-imagined state with the real state returned by the environment, VCR applies a $Q$-value head on both states and obtains two distributions of action values.
It has been demonstrated that our methods achieve new state-of-the-art performance for search-free RL algorithms.
arXiv Detail & Related papers (2022-06-25T03:02:25Z)
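The VCR summary above says the same Q-value head is applied to an imagined state and the real state, and the two resulting action-value distributions are aligned. One simple way to write such a consistency loss, treating softmaxed Q-values as distributions (an assumption about the exact form), is:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def value_consistency_loss(q_head: nn.Module, z_imagined: torch.Tensor, z_real: torch.Tensor):
    """KL between action-value distributions from an imagined and a real latent state."""
    log_p_imagined = F.log_softmax(q_head(z_imagined), dim=-1)
    p_real = F.softmax(q_head(z_real), dim=-1).detach()   # real branch used as the target
    return F.kl_div(log_p_imagined, p_real, reduction="batchmean")
```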
- Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks [50.68446003616802]
Federated learning (FedL) has emerged as a popular technique for distributing model training over a set of wireless devices.
We develop parallel successive learning (PSL), which expands the FedL architecture along three dimensions.
Our analysis sheds light on the notion of cold vs. warmed up models, and model inertia in distributed machine learning.
arXiv Detail & Related papers (2022-02-07T05:11:01Z)
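PSL, summarized above, expands the basic federated learning loop along several dimensions; for readers unfamiliar with that baseline, the snippet below is plain federated averaging of client parameters (standard FedAvg, not PSL itself; the data-size weighting is an assumption).
```python
import torch


def federated_average(client_state_dicts, client_weights):
    """Weighted average of client model parameters, e.g. weights proportional to local data size."""
    total = float(sum(client_weights))
    averaged = {}
    for name in client_state_dicts[0]:
        averaged[name] = sum(
            w * sd[name].float() for sd, w in zip(client_state_dicts, client_weights)
        ) / total
    return averaged
```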
- More Behind Your Electricity Bill: a Dual-DNN Approach to Non-Intrusive Load Monitoring [17.516784821462522]
Non-intrusive load monitoring (NILM) aims to decompose the household energy consumption into itemised energy usage of individual appliances.
Recent investigations have shown that deep neural networks (DNNs) based approaches are promising for the NILM task.
arXiv Detail & Related papers (2021-06-01T08:06:33Z)
- Diversity inducing Information Bottleneck in Model Ensembles [73.80615604822435]
In this paper, we target the problem of generating effective ensembles of neural networks by encouraging diversity in prediction.
We explicitly optimize a diversity inducing adversarial loss for learning latent variables and thereby obtain diversity in the output predictions necessary for modeling multi-modal data.
Compared to the most competitive baselines, we show significant improvements in classification accuracy, under a shift in the data distribution.
arXiv Detail & Related papers (2020-03-10T03:10:41Z)