FedNILM: Applying Federated Learning to NILM Applications at the Edge
- URL: http://arxiv.org/abs/2106.07751v1
- Date: Mon, 7 Jun 2021 04:05:19 GMT
- Title: FedNILM: Applying Federated Learning to NILM Applications at the Edge
- Authors: Yu Zhang, Guoming Tang, Qianyi Huang, Yi Wang, Xudong Wang, Jiadong
Lou
- Abstract summary: We present FedNILM, a practical FL paradigm for NILM applications at the edge.
Specifically, FedNILM is designed to deliver privacy-preserving and personalized NILM services to large-scale edge clients.
Our experiments on real-world energy data show that FedNILM achieves personalized energy disaggregation with state-of-the-art accuracy.
- Score: 17.322648858451995
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Non-intrusive load monitoring (NILM) disaggregates a household's main
electricity consumption into the energy usage of individual appliances, thus greatly
cutting the cost of fine-grained household load monitoring. To address the
privacy concerns that arise in NILM applications, federated learning (FL) can be
leveraged for NILM model training and sharing. When applying the FL paradigm in
real-world NILM applications, however, we face the challenges of edge
resource restriction, edge model personalization, and edge training data
scarcity.
In this paper we present FedNILM, a practical FL paradigm for NILM
applications at the edge. Specifically, FedNILM is designed to deliver
privacy-preserving and personalized NILM services to large-scale edge clients
by leveraging i) secure data aggregation through federated learning, ii)
efficient cloud model compression via filter pruning and multi-task learning,
and iii) personalized edge model building with unsupervised transfer learning.
Our experiments on real-world energy data show that FedNILM achieves
personalized energy disaggregation with state-of-the-art accuracy, while
ensuring privacy preservation at the edge clients.
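At its core, component i) amounts to the server averaging locally trained model parameters, typically weighted by each client's local sample count. The abstract does not specify the exact aggregation rule, so the following is a minimal FedAvg-style sketch; `fedavg`, the toy one-layer client models, and the sample counts are all illustrative:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model parameters (FedAvg-style sketch).

    client_weights: one list of np.ndarray per client (one array per layer).
    client_sizes: number of local training samples per client.
    This is a generic sketch, not FedNILM's exact aggregation rule.
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    aggregated = []
    for layer in range(num_layers):
        # Each client contributes proportionally to its local data volume.
        layer_avg = sum(
            w[layer] * (n / total)
            for w, n in zip(client_weights, client_sizes)
        )
        aggregated.append(layer_avg)
    return aggregated

# Two hypothetical edge clients with one-layer "models".
clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
sizes = [1, 3]
global_w = fedavg(clients, sizes)  # client 2 contributes 3/4 of the average
```

In a real deployment the server would broadcast `global_w` back to the edge clients for the next training round, so raw load data never leaves the household.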
Related papers
- Federated Sequence-to-Sequence Learning for Load Disaggregation from Unbalanced Low-Resolution Smart Meter Data [5.460776507522276]
Non-Intrusive Load Monitoring (NILM) can enhance energy awareness and provide valuable insights for energy program design.
Existing NILM methods often rely on specialized devices to retrieve high-sampling complex signal data.
We propose a new approach using easily accessible weather data to achieve load disaggregation for a total of 12 appliances.
arXiv Detail & Related papers (2024-08-15T13:04:49Z)
- Self-Play Fine-Tuning Converts Weak Language Models to Strong Language Models [52.98743860365194]
We propose a new fine-tuning method called Self-Play fIne-tuNing (SPIN).
At the heart of SPIN lies a self-play mechanism, where the LLM refines its capability by playing against instances of itself.
This sheds light on the promise of self-play, enabling the achievement of human-level performance in LLMs without the need for expert opponents.
arXiv Detail & Related papers (2024-01-02T18:53:13Z)
- ZooPFL: Exploring Black-box Foundation Models for Personalized Federated Learning [95.64041188351393]
This paper endeavors to solve both the challenges of limited resources and personalization.
We propose a method named ZOOPFL that uses Zeroth-Order Optimization for Personalized Federated Learning.
To reduce the computation costs and enhance personalization, we propose input surgery to incorporate an auto-encoder with low-dimensional and client-specific embeddings.
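Because the foundation model is a black box, such client-side optimization cannot use backpropagation and instead relies on gradient estimates built from function evaluations alone. A generic two-point zeroth-order estimator, a common building block for this setting though not necessarily ZOOPFL's exact update rule, can be sketched as follows; `loss_fn`, the quadratic stand-in loss, and the embedding `z` are all illustrative:

```python
import numpy as np

def zo_gradient(loss_fn, z, mu=1e-3, num_samples=20, rng=None):
    """Two-point zeroth-order estimate of the gradient of loss_fn at z.

    Only function evaluations are needed, so the model inside loss_fn can
    stay a black box. Illustrative sketch, not ZOOPFL's exact algorithm.
    """
    rng = np.random.default_rng(rng)
    grad = np.zeros_like(z)
    for _ in range(num_samples):
        u = rng.standard_normal(z.shape)
        # Finite-difference directional derivative along u, projected onto u.
        grad += (loss_fn(z + mu * u) - loss_fn(z - mu * u)) / (2 * mu) * u
    return grad / num_samples

# Stand-in for a client-specific embedding tuned against a black-box loss.
loss = lambda z: float(np.sum((z - 1.0) ** 2))
z = np.zeros(4)
for _ in range(100):
    z = z - 0.1 * zo_gradient(loss, z, rng=0)
```

Averaging over more random directions reduces the variance of the estimate at the cost of extra forward passes through the black-box model.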
arXiv Detail & Related papers (2023-10-08T12:26:13Z)
- Federated Fine-Tuning of LLMs on the Very Edge: The Good, the Bad, the Ugly [62.473245910234304]
This paper takes a hardware-centric approach to explore how Large Language Models can be brought to modern edge computing systems.
We provide a micro-level hardware benchmark, compare the model FLOP utilization to a state-of-the-art data center GPU, and study the network utilization in realistic conditions.
arXiv Detail & Related papers (2023-10-04T20:27:20Z)
- Towards Building the Federated GPT: Federated Instruction Tuning [66.7900343035733]
This paper introduces Federated Instruction Tuning (FedIT) as the learning framework for the instruction tuning of large language models (LLMs).
We demonstrate that by exploiting the heterogeneous and diverse sets of instructions on the client's end with FedIT, we improved the performance of LLMs compared to centralized training with only limited local instructions.
arXiv Detail & Related papers (2023-05-09T17:42:34Z)
- DP$^2$-NILM: A Distributed and Privacy-preserving Framework for Non-intrusive Load Monitoring [7.934421564157628]
Non-intrusive load monitoring (NILM) can help analyze electricity consumption behaviours of users.
Recent studies have proposed many novel NILM frameworks based on federated deep learning (FL).
arXiv Detail & Related papers (2022-06-30T18:18:25Z)
- FederatedNILM: A Distributed and Privacy-preserving Framework for Non-intrusive Load Monitoring based on Federated Deep Learning [8.230120882304723]
This paper develops a distributed and privacy-preserving federated deep learning framework for NILM (FederatedNILM).
FederatedNILM combines federated learning with a state-of-the-art deep learning architecture to conduct NILM for the classification of typical states of household appliances.
arXiv Detail & Related papers (2021-08-08T08:56:40Z)
- Adversarial Energy Disaggregation for Non-intrusive Load Monitoring [78.47901044638525]
Energy disaggregation, also known as non-intrusive load monitoring (NILM), addresses the problem of separating whole-home electricity usage into appliance-specific individual consumptions.
Recent advances reveal that deep neural networks (DNNs) can achieve favorable performance for NILM.
We introduce the idea of adversarial learning into NILM, which is new for the energy disaggregation task.
arXiv Detail & Related papers (2021-08-02T03:56:35Z)
- Fed-NILM: A Federated Learning-based Non-Intrusive Load Monitoring Method for Privacy-Protection [0.1657441317977376]
Non-intrusive load monitoring (NILM) decomposes the total load reading into appliance-level load signals.
Deep learning-based methods have been developed to accomplish NILM, and the training of deep neural networks (DNNs) requires massive load data covering different types of appliances.
For local data owners who have inadequate load data but expect promising model performance, effective NILM co-modelling is increasingly significant.
To eliminate the potential risks, a novel NILM method named Fed-NILM, applying Federated Learning (FL), is proposed in this paper.
arXiv Detail & Related papers (2021-05-24T04:12:10Z)
- A Federated Learning Framework for Non-Intrusive Load Monitoring [0.1657441317977376]
Non-intrusive load monitoring (NILM) aims at decomposing the total reading of the household power consumption into appliance-wise ones.
Data cooperation among utilities and DNOs who own the NILM data has become increasingly significant.
A framework to improve the performance of NILM with federated learning (FL) has been set up.
arXiv Detail & Related papers (2021-04-04T14:24:50Z)
- Toward Smart Security Enhancement of Federated Learning Networks [109.20054130698797]
In this paper, we review the vulnerabilities of federated learning networks (FLNs) and give an overview of poisoning attacks.
We present a smart security enhancement framework for FLNs.
Deep reinforcement learning is applied to learn the behaving patterns of the edge devices (EDs) that can provide benign training results.
arXiv Detail & Related papers (2020-08-19T08:46:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.