DP$^2$-NILM: A Distributed and Privacy-preserving Framework for
Non-intrusive Load Monitoring
- URL: http://arxiv.org/abs/2207.00041v1
- Date: Thu, 30 Jun 2022 18:18:25 GMT
- Authors: Shuang Dai, Fanlin Meng, Qian Wang, Xizhong Chen
- Abstract summary: Non-intrusive load monitoring (NILM) can help analyze electricity consumption behaviours of users.
Recent studies have proposed many novel NILM frameworks based on federated deep learning (FL)
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-intrusive load monitoring (NILM), which usually utilizes machine
learning methods and is effective in disaggregating household-level smart meter
readings into appliance-level consumption, can help analyze the electricity
consumption behaviours of users and enable practical smart energy and smart
grid applications. Recent studies have proposed many novel NILM frameworks
based on federated deep learning (FL). However, comprehensive research
exploring utility optimization schemes and privacy-preserving schemes across
different FL-based NILM application scenarios is still lacking. In this paper,
we make the first attempt at FL-based NILM that addresses both utility
optimization and privacy preservation, by developing a distributed and
privacy-preserving NILM (DP2-NILM) framework and carrying out comparative
experiments on practical NILM scenarios based on real-world smart meter
datasets. Specifically, two alternative federated learning strategies are
examined in the utility optimization schemes, i.e., FedAvg and FedProx.
Moreover, different levels of privacy guarantee, i.e., local differential
privacy FL and global differential privacy FL, are provided in DP2-NILM.
Extensive comparison experiments are conducted on three real-world datasets
to evaluate the proposed framework.
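
To make the two ingredients named in the abstract concrete, the following is a minimal sketch, not the authors' released code, of a FedProx-style local update and of Gaussian noise injected either client-side (local DP) or server-side (global DP). All function names, hyperparameters, and the MSE task loss are illustrative assumptions; a real DP deployment would additionally clip updates and track the privacy budget.

```python
import torch

def fedprox_local_update(model, global_model, loader, mu=0.01, lr=1e-3, epochs=1):
    """One client's local training with the FedProx proximal term
    (mu / 2) * ||w - w_global||^2 added to the task loss.
    Setting mu = 0 recovers plain FedAvg local training."""
    global_params = [p.detach().clone() for p in global_model.parameters()]
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()  # assumed NILM regression on appliance power
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            prox = sum(((p - g) ** 2).sum()
                       for p, g in zip(model.parameters(), global_params))
            (loss + 0.5 * mu * prox).backward()
            opt.step()
    return model

def _add_gaussian_noise(tensors, sigma):
    """Gaussian mechanism (illustrative: no clipping or accounting here)."""
    if sigma > 0:
        for t in tensors:
            t.add_(torch.randn_like(t) * sigma)

def aggregate(client_models, global_model, sigma_local=0.0, sigma_global=0.0):
    """FedAvg aggregation. sigma_local > 0 mimics local DP (each client
    noises its own update before upload); sigma_global > 0 mimics global
    DP (the trusted server noises the aggregate once)."""
    updates = [[p.detach().clone() for p in m.parameters()] for m in client_models]
    for u in updates:
        _add_gaussian_noise(u, sigma_local)   # local DP: per-client noise
    with torch.no_grad():
        for i, gp in enumerate(global_model.parameters()):
            gp.copy_(torch.stack([u[i] for u in updates]).mean(dim=0))
        _add_gaussian_noise(list(global_model.parameters()), sigma_global)  # global DP
    return global_model
```

In this single code path, the FedAvg/FedProx choice reduces to the value of mu, and the local-versus-global privacy guarantee reduces to where the noise is injected, which mirrors the scenarios the framework compares.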
Related papers
- Privacy-preserving Federated Primal-dual Learning for Non-convex and Non-smooth Problems with Model Sparsification [51.04894019092156]
Federated learning (FL) has been recognized as a rapidly growing research area, where the model is trained over distributed clients under the orchestration of a parameter server (PS).
In this paper, we propose a novel privacy-preserving federated primal-dual algorithm with model sparsification for non-convex and non-smooth FL problems.
Its unique properties and the corresponding analyses are also presented.
arXiv Detail & Related papers (2023-10-30T14:15:47Z) - Federated Fine-Tuning of LLMs on the Very Edge: The Good, the Bad, the Ugly [62.473245910234304]
This paper takes a hardware-centric approach to explore how Large Language Models can be brought to modern edge computing systems.
We provide a micro-level hardware benchmark, compare model FLOP utilization to that of a state-of-the-art data center GPU, and study network utilization under realistic conditions.
arXiv Detail & Related papers (2023-10-04T20:27:20Z) - Semi-Federated Learning: Convergence Analysis and Optimization of A
Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve the resulting intractable optimization problem, in which we provide closed-form solutions for the beamformers.
arXiv Detail & Related papers (2023-10-04T03:32:39Z) - FederatedScope-LLM: A Comprehensive Package for Fine-tuning Large
Language Models in Federated Learning [70.38817963253034]
This paper first discusses the challenges of federated fine-tuning of LLMs, and introduces our package FS-LLM as the main contribution.
We provide comprehensive federated parameter-efficient fine-tuning algorithm implementations and versatile programming interfaces for future extension in FL scenarios.
We conduct extensive experiments to validate the effectiveness of FS-LLM and benchmark advanced LLMs with state-of-the-art parameter-efficient fine-tuning algorithms in FL settings.
arXiv Detail & Related papers (2023-09-01T09:40:36Z) - Towards Building the Federated GPT: Federated Instruction Tuning [66.7900343035733]
This paper introduces Federated Instruction Tuning (FedIT) as the learning framework for the instruction tuning of large language models (LLMs).
We demonstrate that, by exploiting the heterogeneous and diverse sets of instructions on the clients' end, FedIT improves the performance of LLMs compared to centralized training with only limited local instructions.
arXiv Detail & Related papers (2023-05-09T17:42:34Z) - Optimizing Privacy, Utility and Efficiency in Constrained
Multi-Objective Federated Learning [20.627157142499378]
We develop two improved constrained multi-objective federated learning (CMOFL) algorithms based on NSGA-II and PSL.
We design specific measurements of privacy leakage, utility loss, and training cost for three privacy protection mechanisms.
Empirical experiments conducted under each of the three protection mechanisms demonstrate the effectiveness of our proposed algorithms.
arXiv Detail & Related papers (2023-04-29T17:55:38Z) - FederatedNILM: A Distributed and Privacy-preserving Framework for
Non-intrusive Load Monitoring based on Federated Deep Learning [8.230120882304723]
This paper develops a distributed and privacy-preserving federated deep learning framework for NILM (FederatedNILM).
FederatedNILM combines federated learning with a state-of-the-art deep learning architecture to conduct NILM for the classification of typical states of household appliances.
arXiv Detail & Related papers (2021-08-08T08:56:40Z) - FedNILM: Applying Federated Learning to NILM Applications at the Edge [17.322648858451995]
We present FedNILM, a practical FL paradigm for NILM applications at the edge.
Specifically, FedNILM is designed to deliver privacy-preserving and personalized NILM services to large-scale edge clients.
Our experiments on real-world energy data show that FedNILM achieves personalized energy disaggregation with state-of-the-art accuracy.
arXiv Detail & Related papers (2021-06-07T04:05:19Z) - Fed-NILM: A Federated Learning-based Non-Intrusive Load Monitoring
Method for Privacy-Protection [0.1657441317977376]
Non-intrusive load monitoring (NILM) decomposes the total load reading into appliance-level load signals.
Deep learning-based methods have been developed to accomplish NILM, but training the underlying deep neural networks (DNNs) requires massive load data covering different types of appliances.
For local data owners with inadequate load data who nevertheless expect promising model performance, effective NILM co-modelling is increasingly important.
To eliminate the potential privacy risks of such cooperation, a novel NILM method named Fed-NILM, applying federated learning (FL), is proposed in this paper.
arXiv Detail & Related papers (2021-05-24T04:12:10Z) - A Federated Learning Framework for Non-Intrusive Load Monitoring [0.1657441317977376]
Non-intrusive load monitoring (NILM) aims at decomposing the total reading of the household power consumption into appliance-wise ones.
Data cooperation among utilities and distribution network operators (DNOs), who own the NILM data, has become increasingly significant.
A framework to improve the performance of NILM with federated learning (FL) has been set up.
arXiv Detail & Related papers (2021-04-04T14:24:50Z) - Voting-based Approaches For Differentially Private Federated Learning [87.2255217230752]
This work is inspired by the knowledge-transfer, non-federated privacy learning of Papernot et al.
We design two new DPFL schemes that vote among the data labels returned from each local model, instead of averaging the gradients (a minimal sketch of this label-voting idea follows this list).
Our approaches significantly improve the privacy-utility trade-off over the state of the art in DPFL.
arXiv Detail & Related papers (2020-10-09T23:55:19Z)
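
For a concrete picture of the label-voting idea above, here is a hypothetical sketch in the spirit of PATE-style noisy aggregation; the function name, the Laplace noise scale, and the single-sample workflow are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def noisy_label_vote(local_predictions, num_classes, epsilon=1.0, rng=None):
    """Differentially private plurality vote over one public sample:
    count each local model's predicted label, add Laplace noise to the
    counts, and release the noisy winner instead of averaged gradients."""
    rng = rng or np.random.default_rng()
    counts = np.bincount(local_predictions, minlength=num_classes).astype(float)
    # Illustrative noise scale; a real scheme needs a sensitivity analysis
    # and a privacy accountant across all voted samples.
    counts += rng.laplace(scale=1.0 / epsilon, size=num_classes)
    return int(np.argmax(counts))

# Example: ten clients vote on one public sample; the noisy winner is the
# label used to train or distill the global model.
preds = np.array([2, 2, 1, 2, 0, 2, 2, 1, 2, 2])
print(noisy_label_vote(preds, num_classes=3))
```

Because only a noisy label leaves the aggregation step, the privacy cost is paid per released label rather than per shared gradient, which is the intuition behind the improved privacy-utility trade-off the authors report.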