Fed-NILM: A Federated Learning-based Non-Intrusive Load Monitoring
Method for Privacy-Protection
- URL: http://arxiv.org/abs/2105.11085v1
- Date: Mon, 24 May 2021 04:12:10 GMT
- Title: Fed-NILM: A Federated Learning-based Non-Intrusive Load Monitoring
Method for Privacy-Protection
- Authors: Haijin Wang, Caomingzhe Si, Junhua Zhao
- Abstract summary: Non-intrusive load monitoring (NILM) decomposes the total load reading into appliance-level load signals.
Deep learning-based methods have been developed to accomplish NILM, and the training of deep neural networks (DNN) requires massive load data containing different types of appliances.
For local data owners who have inadequate load data but expect promising model performance, effective NILM co-modelling is increasingly important.
To eliminate the potential privacy risks, a novel NILM method named Fed-NILM, which applies Federated Learning (FL), is proposed in this paper.
- Score: 0.1657441317977376
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-intrusive load monitoring (NILM) decomposes the total load reading into
appliance-level load signals. Many deep learning-based methods have been
developed to accomplish NILM, and the training of deep neural networks (DNN)
requires massive load data covering different types of appliances. For local
data owners who have inadequate load data but expect promising model
performance, effective NILM co-modelling is increasingly important. However,
during the cooperation of local data owners, data exchange and centralized
data storage may increase the risk of power consumer privacy breaches. To
eliminate these risks, a novel NILM method named Fed-NILM, which applies
Federated Learning (FL), is proposed in this paper. In Fed-NILM, local model
parameters instead of load data are shared among the data owners, and the
global model is obtained by taking a weighted average of those parameters. In
the experiments, Fed-NILM is validated on two real-world datasets and
compared with locally-trained NILM models and a centrally-trained one in both
residential and industrial scenarios. The experimental results show that
Fed-NILM outperforms the locally-trained models and approximates the
centrally-trained NILM, which is trained on the entire load dataset without
privacy preservation.
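The parameter-sharing step described in the abstract can be sketched as a standard weighted average of client parameters. This is a minimal illustration only: the parameter vectors, client count, and weighting by local dataset size are assumptions for the sketch, not Fed-NILM's actual model or aggregation details.

```python
# Minimal sketch of weighted parameter averaging across local data owners.
# Tensors and weights are hypothetical; Fed-NILM's architecture is not shown.
import numpy as np

def federated_average(local_params, sample_counts):
    """Weighted-average client parameter vectors (weights sum to 1)."""
    weights = np.asarray(sample_counts, dtype=float)
    weights /= weights.sum()
    stacked = np.stack(local_params)          # shape: (clients, n_params)
    return (weights[:, None] * stacked).sum(axis=0)

# Three local data owners share parameters, never raw load data.
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
counts = [100, 200, 100]  # local dataset sizes (assumed weighting scheme)
global_params = federated_average(params, counts)
# global_params == [3.0, 4.0]
```

In a full FL loop this aggregation would run once per communication round, with the global parameters broadcast back to clients for further local training.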
Related papers
- Entropy Law: The Story Behind Data Compression and LLM Performance [115.70395740286422]
We find that model performance is negatively correlated to the compression ratio of training data, which usually yields a lower training loss.
Based on the findings of the entropy law, we propose a quite efficient and universal data selection method.
We also present an interesting application of entropy law that can detect potential performance risks at the beginning of model training.
arXiv Detail & Related papers (2024-07-09T08:14:29Z) - Self-Play Fine-Tuning Converts Weak Language Models to Strong Language Models [52.98743860365194]
We propose a new fine-tuning method called Self-Play fIne-tuNing (SPIN).
At the heart of SPIN lies a self-play mechanism, where the LLM refines its capability by playing against instances of itself.
This sheds light on the promise of self-play, enabling the achievement of human-level performance in LLMs without the need for expert opponents.
arXiv Detail & Related papers (2024-01-02T18:53:13Z) - Federated Nearest Neighbor Machine Translation [66.8765098651988]
In this paper, we propose a novel federated nearest neighbor (FedNN) machine translation framework.
FedNN leverages one-round memorization-based interaction to share knowledge across different clients.
Experiments show that FedNN significantly reduces computational and communication costs compared with FedAvg.
arXiv Detail & Related papers (2023-02-23T18:04:07Z) - DP$^2$-NILM: A Distributed and Privacy-preserving Framework for
Non-intrusive Load Monitoring [7.934421564157628]
Non-intrusive load monitoring (NILM) can help analyze electricity consumption behaviours of users.
Recent studies have proposed many novel NILM frameworks based on federated deep learning (FL).
arXiv Detail & Related papers (2022-06-30T18:18:25Z) - FedCL: Federated Contrastive Learning for Privacy-Preserving
Recommendation [98.5705258907774]
FedCL can exploit high-quality negative samples for effective model training with privacy well protected.
We first infer user embeddings from local user data through the local model on each client, and then perturb them with local differential privacy (LDP).
Since individual user embedding contains heavy noise due to LDP, we propose to cluster user embeddings on the server to mitigate the influence of noise.
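The LDP step summarized above can be sketched as clipping each client's embedding and adding calibrated noise before upload. This is a hypothetical illustration, not FedCL's implementation: the clip bound, epsilon, and Laplace mechanism are assumptions chosen for the sketch.

```python
# Hypothetical sketch of client-side LDP perturbation of a user embedding.
# The server would then cluster the noisy embeddings to mitigate the noise.
import numpy as np

def ldp_perturb(embedding, clip=1.0, epsilon=1.0, rng=None):
    """Clip the embedding to L2 norm `clip`, then add Laplace noise."""
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(embedding)
    clipped = embedding * min(1.0, clip / max(norm, 1e-12))
    scale = 2.0 * clip / epsilon   # noise scale from the clipping sensitivity
    return clipped + rng.laplace(0.0, scale, size=clipped.shape)

emb = np.array([3.0, 4.0])         # raw user embedding (L2 norm 5)
noisy = ldp_perturb(emb)           # what the client actually uploads
```

Only `noisy` ever leaves the client; the heavy per-user noise is the reason the paper clusters embeddings server-side.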
arXiv Detail & Related papers (2022-04-21T02:37:10Z) - Learning Task-Aware Energy Disaggregation: a Federated Approach [1.52292571922932]
Non-intrusive load monitoring (NILM) aims to find individual devices' power consumption profiles based on aggregated meter measurements.
Yet collecting such residential load datasets requires both huge effort and customers' approval to share metering data.
We propose a decentralized and task-adaptive learning scheme for NILM tasks, where nested meta learning and federated learning steps are designed for learning task-specific models collectively.
arXiv Detail & Related papers (2022-04-14T05:53:41Z) - Acceleration of Federated Learning with Alleviated Forgetting in Local
Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z) - FederatedNILM: A Distributed and Privacy-preserving Framework for
Non-intrusive Load Monitoring based on Federated Deep Learning [8.230120882304723]
This paper develops a distributed and privacy-preserving federated deep learning framework for NILM (FederatedNILM).
FederatedNILM combines federated learning with a state-of-the-art deep learning architecture to conduct NILM for the classification of typical states of household appliances.
arXiv Detail & Related papers (2021-08-08T08:56:40Z) - FedNILM: Applying Federated Learning to NILM Applications at the Edge [17.322648858451995]
We present FedNILM, a practical FL paradigm for NILM applications at the edge client.
Specifically, FedNILM is designed to deliver privacy-preserving and personalized NILM services to large-scale edge clients.
Our experiments on real-world energy data show that FedNILM is able to achieve personalized energy disaggregation with state-of-the-art accuracy.
arXiv Detail & Related papers (2021-06-07T04:05:19Z) - A Federated Learning Framework for Non-Intrusive Load Monitoring [0.1657441317977376]
Non-intrusive load monitoring (NILM) aims at decomposing the total reading of the household power consumption into appliance-wise ones.
Data cooperation among utilities and distribution network operators (DNOs), who own the NILM data, has become increasingly significant.
A framework to improve the performance of NILM with federated learning (FL) has been set up.
arXiv Detail & Related papers (2021-04-04T14:24:50Z) - WAFFLe: Weight Anonymized Factorization for Federated Learning [88.44939168851721]
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices.
We propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks.
arXiv Detail & Related papers (2020-08-13T04:26:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.