MATNilm: Multi-appliance-task Non-intrusive Load Monitoring with Limited
Labeled Data
- URL: http://arxiv.org/abs/2307.14778v2
- Date: Sat, 29 Jul 2023 04:10:22 GMT
- Title: MATNilm: Multi-appliance-task Non-intrusive Load Monitoring with Limited
Labeled Data
- Authors: Jing Xiong, Tianqi Hong, Dongbo Zhao, and Yu Zhang
- Abstract summary: Existing approaches mainly focus on developing an individual model for each appliance.
In this paper, we propose a multi-appliance-task framework with a training-efficient sample augmentation scheme.
The relative errors can be reduced by more than 50% on average.
- Score: 4.460954839118025
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Non-intrusive load monitoring (NILM) identifies the status and power
consumption of various household appliances by disaggregating the total power
usage signal of an entire house. Efficient and accurate load monitoring
facilitates user profile establishment, intelligent household energy
management, and peak load shifting. This is beneficial for both the end-users
and utilities by improving the overall efficiency of a power distribution
network. Existing approaches mainly focus on developing an individual model for
each appliance. Those approaches typically rely on a large amount of
household-labeled data which is hard to collect. In this paper, we propose a
multi-appliance-task framework with a training-efficient sample augmentation
(SA) scheme that boosts the disaggregation performance with limited labeled
data. For each appliance, we develop a shared-hierarchical split structure for
its regression and classification tasks. In addition, we also propose a
two-dimensional attention mechanism in order to capture spatio-temporal
correlations among all appliances. With only one-day training data and limited
appliance operation profiles, the proposed SA algorithm can achieve comparable
test performance to the case of training with the full dataset. Finally,
simulation results show that our proposed approach features a significantly
improved performance over many baseline models. The relative errors can be
reduced by more than 50% on average. The codes of this work are available at
https://github.com/jxiong22/MATNilm
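The abstract's two main ingredients — a two-dimensional attention over time and appliances, and a shared representation split into per-appliance regression (power) and classification (on/off) heads — can be sketched roughly as below. The shapes, the toy single-head attention, and the linear heads are illustrative assumptions, not the architecture from the paper or its repository:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    """Scaled dot-product self-attention along the first axis of k/v."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

T, A, d = 8, 3, 4                      # time steps, appliances, feature dim
feats = rng.normal(size=(T, A, d))     # stand-in for encoded aggregate signal

# "Two-dimensional" attention, toy version:
# 1) temporal attention: for each appliance, mix information across time;
temporal = np.stack([attend(feats[:, a], feats[:, a], feats[:, a])
                     for a in range(A)], axis=1)
# 2) appliance attention: for each time step, mix across appliances.
mixed = np.stack([attend(temporal[t], temporal[t], temporal[t])
                  for t in range(T)], axis=0)

# Shared-hierarchical split: one shared representation feeds a regression
# head (power) and a classification head (on/off) per appliance.
W_reg = rng.normal(size=(A, d))        # hypothetical per-appliance weights
W_cls = rng.normal(size=(A, d))
power = np.einsum('tad,ad->ta', mixed, W_reg)                      # power estimate
on_off = 1.0 / (1.0 + np.exp(-np.einsum('tad,ad->ta', mixed, W_cls)))  # on-probability

print(power.shape, on_off.shape)       # (8, 3) (8, 3)
```

In a trained model the classification output would typically gate the regression output, so power is only attributed to appliances predicted to be on.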
Related papers
- POMONAG: Pareto-Optimal Many-Objective Neural Architecture Generator [4.09225917049674]
Transferable NAS has emerged, generalizing the search process from dataset-dependent to task-dependent.
This paper introduces POMONAG, extending DiffusionNAG via a many-objective diffusion process.
Results were validated on two search spaces -- NAS201 and MobileNetV3 -- and evaluated across 15 image classification datasets.
arXiv Detail & Related papers (2024-09-30T16:05:29Z) - Interpretable Target-Feature Aggregation for Multi-Task Learning based on Bias-Variance Analysis [53.38518232934096]
Multi-task learning (MTL) is a powerful machine learning paradigm designed to leverage shared knowledge across tasks to improve generalization and performance.
We propose an MTL approach at the intersection between task clustering and feature transformation based on a two-phase iterative aggregation of targets and features.
In both phases, a key aspect is to preserve the interpretability of the reduced targets and features through the aggregation with the mean, which is motivated by applications to Earth science.
arXiv Detail & Related papers (2024-06-12T08:30:16Z) - Self-Supervised Neuron Segmentation with Multi-Agent Reinforcement
Learning [53.00683059396803]
Masked image modeling (MIM) has been widely used due to its simplicity and effectiveness in recovering original information from masked images.
We propose a decision-based MIM that utilizes reinforcement learning (RL) to automatically search for optimal image masking ratio and masking strategy.
Our approach has a significant advantage over alternative self-supervised methods on the task of neuron segmentation.
arXiv Detail & Related papers (2023-10-06T10:40:46Z) - Energy-efficient Task Adaptation for NLP Edge Inference Leveraging
Heterogeneous Memory Architectures [68.91874045918112]
adapter-ALBERT is an efficient model optimization that maximizes data reuse across different tasks.
We demonstrate the advantage of mapping the model to a heterogeneous on-chip memory architecture by performing simulations on a validated NLP edge accelerator.
arXiv Detail & Related papers (2023-03-25T14:40:59Z) - Learning Task-Aware Energy Disaggregation: a Federated Approach [1.52292571922932]
Non-intrusive load monitoring (NILM) aims to find individual devices' power consumption profiles based on aggregated meter measurements.
Yet collecting such residential load datasets requires both substantial effort and customers' approval to share metering data.
We propose a decentralized and task-adaptive learning scheme for NILM tasks, where nested meta learning and federated learning steps are designed for learning task-specific models collectively.
arXiv Detail & Related papers (2022-04-14T05:53:41Z) - DANCE: DAta-Network Co-optimization for Efficient Segmentation Model
Training and Inference [85.02494022662505]
DANCE is an automated simultaneous data-network co-optimization for efficient segmentation model training and inference.
It integrates automated data slimming which adaptively downsamples/drops input images and controls their corresponding contribution to the training loss guided by the images' spatial complexity.
Experiments and ablation studies demonstrate that DANCE can achieve "all-win" towards efficient segmentation.
arXiv Detail & Related papers (2021-07-16T04:58:58Z) - Energy-Efficient and Federated Meta-Learning via Projected Stochastic
Gradient Ascent [79.58680275615752]
We propose an energy-efficient federated meta-learning framework.
We assume each task is owned by a separate agent, so a limited number of tasks is used to train a meta-model.
arXiv Detail & Related papers (2021-05-31T08:15:44Z) - Energy Disaggregation using Variational Autoencoders [11.940343835617046]
Non-intrusive load monitoring (NILM) is a technique that uses a single sensor to measure the total power consumption of a building.
Recent disaggregation algorithms have significantly improved the performance of NILM systems.
We propose an energy disaggregation approach based on the variational autoencoders (VAE) framework.
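A VAE-based disaggregator encodes a window of the aggregate signal into a Gaussian latent, samples via the reparameterization trick, and decodes an appliance-level reconstruction. The forward pass below is a minimal sketch with random placeholder weights (a real model would be trained end-to-end); the dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def vae_forward(window, d_latent=2):
    """Forward pass of a toy linear VAE over a power window: encode to a
    Gaussian latent (mu, logvar), sample with the reparameterization
    trick, decode. Weights are random placeholders, not trained values."""
    d_in = window.shape[0]
    W_mu, W_lv = rng.normal(size=(2, d_latent, d_in)) * 0.1  # encoder heads
    W_dec = rng.normal(size=(d_in, d_latent)) * 0.1          # decoder
    mu, logvar = W_mu @ window, W_lv @ window
    z = mu + np.exp(0.5 * logvar) * rng.normal(size=d_latent)  # reparam. trick
    return W_dec @ z, mu, logvar

recon, mu, logvar = vae_forward(rng.normal(size=32))
print(recon.shape)   # (32,)
```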
arXiv Detail & Related papers (2021-03-22T20:53:36Z) - Smart non-intrusive appliance identification using a novel local power
histogramming descriptor with an improved k-nearest neighbors classifier [2.389598109913753]
This paper proposes a smart NILM system based on a novel local power histogramming (LPH) descriptor.
Specifically, short local histograms are drawn to represent individual appliance consumption signatures.
An improved k-nearest neighbors (IKNN) algorithm is presented to reduce the learning time and improve the classification performance.
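The LPH idea — sliding local histograms over a power trace as an appliance signature, classified with k-nearest neighbors — can be sketched as follows. The window size, bin count, power range, and plain majority-vote kNN (the paper's IKNN adds efficiency improvements) are all illustrative assumptions:

```python
import numpy as np

def lph_descriptor(power, win=20, bins=8, pmax=2000.0):
    """Toy local power histogramming: slide a non-overlapping window over
    a power trace and concatenate per-window histograms as a signature."""
    hists = []
    for start in range(0, len(power) - win + 1, win):
        h, _ = np.histogram(power[start:start + win],
                            bins=bins, range=(0.0, pmax))
        hists.append(h / win)          # normalize each local histogram
    return np.concatenate(hists)

def knn_predict(x, X_train, y_train, k=3):
    """Plain k-nearest-neighbors majority vote on LPH descriptors."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Synthetic traces: a low-power appliance (label 0) vs a high-power one (1).
rng = np.random.default_rng(1)
X_train = np.stack(
    [lph_descriptor(rng.normal(100.0, 5.0, 100)) for _ in range(5)] +
    [lph_descriptor(rng.normal(1500.0, 50.0, 100)) for _ in range(5)])
y_train = np.array([0] * 5 + [1] * 5)
query = lph_descriptor(rng.normal(1500.0, 50.0, 100))
print(knn_predict(query, X_train, y_train))   # → 1
```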
arXiv Detail & Related papers (2021-02-09T13:12:20Z) - Incorporating Coincidental Water Data into Non-intrusive Load Monitoring [0.0]
We propose an event-based classification process to extract power signals of appliances with exclusive non-overlapping power values.
Two deep learning models, which consider the water consumption of some appliances as a novel signature in the network, are utilized to distinguish between appliances with overlapping power values.
In addition to power disaggregation, the proposed process as well extracts the water consumption profiles of specific appliances.
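The first stage described above — event-based classification of appliances with exclusive, non-overlapping power values — can be sketched as step detection on the aggregate signal followed by a range lookup. The appliance catalog, power ranges, and threshold below are hypothetical examples:

```python
# Hypothetical appliance catalog: (name, min_W, max_W) with exclusive,
# non-overlapping power ranges; the values are illustrative only.
CATALOG = [("tv", 60, 75), ("fridge", 80, 150), ("kettle", 1800, 2200)]

def detect_events(agg, threshold=30.0):
    """Edge detection on the aggregate signal: step changes whose
    magnitude exceeds the threshold are candidate on/off events."""
    events = []
    for i in range(1, len(agg)):
        delta = agg[i] - agg[i - 1]
        if abs(delta) > threshold:
            events.append((i, delta))
    return events

def label_event(delta, catalog=CATALOG):
    """Assign an event to the appliance whose exclusive power range
    contains the step magnitude; None if no range matches."""
    for name, lo, hi in catalog:
        if lo <= abs(delta) <= hi:
            return name
    return None

agg = [100.0, 100.0, 2100.0, 2100.0, 100.0, 100.0]
labeled = [(i, label_event(d)) for i, d in detect_events(agg)]
print(labeled)   # [(2, 'kettle'), (4, 'kettle')]
```

Appliances whose power ranges overlap cannot be separated this way, which is where the paper's deep models with water-consumption signatures come in.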
arXiv Detail & Related papers (2021-01-18T17:49:39Z) - Fitting the Search Space of Weight-sharing NAS with Graph Convolutional
Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.