Adversarial Predictions of Data Distributions Across Federated
Internet-of-Things Devices
- URL: http://arxiv.org/abs/2308.14658v1
- Date: Mon, 28 Aug 2023 15:40:50 GMT
- Title: Adversarial Predictions of Data Distributions Across Federated
Internet-of-Things Devices
- Authors: Samir Rajani, Dario Dematties, Nathaniel Hudson, Kyle Chard, Nicola
Ferrier, Rajesh Sankaran, Peter Beckman
- Abstract summary: Federated learning (FL) is becoming the default approach for training machine learning models across decentralized Internet-of-Things (IoT) devices.
In this work, we demonstrate that the model weights shared in FL can expose revealing information about the local data distributions of IoT devices.
This leakage could expose sensitive information to malicious actors in a distributed system.
- Score: 3.119217042097909
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning (FL) is increasingly becoming the default approach for
training machine learning models across decentralized Internet-of-Things (IoT)
devices. A key advantage of FL is that no raw data are communicated across the
network, providing an immediate layer of privacy. Despite this, recent works
have demonstrated that data can be reconstructed from the locally trained
model updates that are communicated across the network. However, many of these
works have limitations with regard to how the gradients are computed in
backpropagation. In this work, we demonstrate that the model weights shared in
FL can expose revealing information about the local data distributions of IoT
devices. This leakage could expose sensitive information to malicious actors in
a distributed system. We further discuss results showing that injecting
noise into model weights is ineffective at preventing data leakage without
seriously harming the global model accuracy.
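
The attack surface can be illustrated concretely. The paper trains classifiers on the shared weights themselves; the sketch below shows a simpler, well-known signal of the same kind (an illustration, not the authors' exact method): for a softmax classifier trained with cross-entropy, a client's final-layer bias update directly encodes its local label frequencies. All sizes and distributions here are made-up assumptions.

```python
# Minimal sketch: estimating a client's label distribution from its shared
# model update. Hypothetical setup; the paper's attack is more general.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_features, n_samples = 5, 20, 1000

# Skewed local label distribution on one IoT device (illustrative values).
true_dist = np.array([0.55, 0.25, 0.10, 0.07, 0.03])
y = rng.choice(n_classes, size=n_samples, p=true_dist)
X = rng.normal(size=(n_samples, n_features))

# Client computes one gradient step of softmax regression from a fresh init.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
logits = X @ W + b                        # all-zero init -> uniform predictions
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
onehot = np.eye(n_classes)[y]
grad_b = (probs - onehot).mean(axis=0)    # part of the update sent to the server

# Attacker: with near-uniform predictions, grad_b is about 1/C minus the
# label frequency, so the frequency can be read off directly.
estimated_dist = 1.0 / n_classes - grad_b
print("true     :", np.round(true_dist, 3))
print("estimated:", np.round(estimated_dist, 3))
```

This also makes the abstract's defense result plausible: Gaussian noise added to the update hides this signal only once its scale is comparable to the class frequencies themselves, at which point it also perturbs the weights enough to hurt the global model.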
Related papers
- Effective Intrusion Detection in Heterogeneous Internet-of-Things Networks via Ensemble Knowledge Distillation-based Federated Learning [52.6706505729803]
We introduce Federated Learning (FL) to collaboratively train a decentralized shared model for Intrusion Detection Systems (IDS).
FLEKD enables a more flexible aggregation method than conventional model fusion techniques.
Experiment results show that the proposed approach outperforms local training and traditional FL in terms of both speed and performance.
arXiv Detail & Related papers (2024-01-22T14:16:37Z)
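
FLEKD's exact procedure is in the paper; the following is a hedged sketch of ensemble knowledge distillation as an aggregation method in general: rather than averaging weights, the server averages client predictions on a shared proxy set and fits the global model to those soft labels. The shapes and linear models are assumptions.

```python
# Sketch of ensemble-distillation aggregation (illustrative, not FLEKD itself).
import numpy as np

rng = np.random.default_rng(1)
n_proxy, n_features, n_classes = 200, 10, 3
X_proxy = rng.normal(size=(n_proxy, n_features))   # shared unlabeled proxy set

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Heterogeneous client models (here: random linear classifiers).
client_models = [rng.normal(size=(n_features, n_classes)) for _ in range(4)]
teacher = np.mean([softmax(X_proxy @ W) for W in client_models], axis=0)

# Distill: fit the global model to the ensemble's averaged soft labels.
W_global = np.zeros((n_features, n_classes))
for _ in range(300):
    probs = softmax(X_proxy @ W_global)
    grad = X_proxy.T @ (probs - teacher) / n_proxy   # soft-label cross-entropy
    W_global -= 0.5 * grad

kl = np.mean(np.sum(teacher * (np.log(teacher + 1e-12)
                               - np.log(softmax(X_proxy @ W_global) + 1e-12)),
                    axis=1))
print("distillation KL:", kl)
```

Unlike weight averaging, this step never requires the client architectures to match, which is what makes the aggregation more flexible than conventional model fusion.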
- Adaptive Model Pruning and Personalization for Federated Learning over Wireless Networks [72.59891661768177]
Federated learning (FL) enables distributed learning across edge devices while protecting data privacy.
We consider an FL framework with partial model pruning and personalization to overcome these challenges.
This framework splits the learning model into a global part with model pruning shared with all devices to learn data representations and a personalized part to be fine-tuned for a specific device.
arXiv Detail & Related papers (2023-09-04T21:10:45Z)
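
The split can be pictured with a small sketch (the layer names and the magnitude-pruning rule are assumptions, not the paper's design): the device uploads only a pruned "global" block and keeps a personalized head local.

```python
# Illustrative global/personal split with magnitude pruning (hypothetical names).
import numpy as np

rng = np.random.default_rng(2)
model = {
    "encoder.weight": rng.normal(size=(32, 16)),  # global: shared representations
    "head.weight":    rng.normal(size=(16, 4)),   # personal: stays on the device
}
GLOBAL_KEYS = {"encoder.weight"}

def prune_by_magnitude(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of the weights."""
    thresh = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) >= thresh, w, 0.0)

upload = {k: prune_by_magnitude(model[k]) for k in GLOBAL_KEYS}  # sent to server
local_only = {k: v for k, v in model.items() if k not in GLOBAL_KEYS}
print({k: float((v == 0.0).mean()) for k, v in upload.items()})  # ~0.5 sparsity
```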
- Unsupervised anomalies detection in IIoT edge devices networks using federated learning [0.0]
Federated learning (FL) is a distributed machine learning approach that trains a model on the device that gathered the data itself.
In this paper, we leverage the benefits of FL and implement the FedAvg algorithm on a recent dataset representative of modern IoT/IIoT device networks.
We also evaluate shortcomings of FedAvg, such as the unfairness that arises when struggling devices cannot participate in every stage of training.
arXiv Detail & Related papers (2023-08-23T14:53:38Z)
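
For reference, the FedAvg step evaluated above is a data-size-weighted average of client weights; a minimal sketch (layer names are placeholders):

```python
# FedAvg aggregation: weight each client's model by its share of the data.
import numpy as np

def fedavg(client_weights, client_sizes):
    """client_weights: list of {layer_name: ndarray}; client_sizes: samples per client."""
    total = sum(client_sizes)
    return {
        name: sum(w[name] * (n / total)
                  for w, n in zip(client_weights, client_sizes))
        for name in client_weights[0]
    }

rng = np.random.default_rng(3)
clients = [{"w": rng.normal(size=(4, 2))} for _ in range(3)]
print(fedavg(clients, client_sizes=[100, 300, 50])["w"])
```

The unfairness noted above follows directly from this formula: a device that misses a round contributes zero weight to that round's average, so its data are underrepresented in the global model.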
- Federated Deep Learning for Intrusion Detection in IoT Networks [1.3097853961043058]
A common approach to implementing AI-based Intrusion Detection Systems (IDSs) in distributed IoT systems is a centralised one.
This approach may violate data privacy and hinder IDS scalability.
We design an experiment representative of the real world and evaluate the performance of an FL-based IDS.
arXiv Detail & Related papers (2023-06-05T09:08:24Z)
- FLARE: Detection and Mitigation of Concept Drift for Federated Learning based IoT Deployments [2.7776688429637466]
FLARE is a lightweight dual-scheduler FL framework that conditionally transfers training data and deploys models between edge and sensor endpoints.
We show that FLARE can significantly reduce the amount of data exchanged between edge and sensor nodes compared to fixed-interval scheduling methods.
It can successfully detect concept drift reactively with at least a 16x reduction in latency.
arXiv Detail & Related papers (2023-05-15T10:09:07Z)
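
FLARE's actual schedulers are more sophisticated, but the conditional-transfer idea can be conveyed with a crude sketch (the drift score and threshold below are assumptions): a sensor node ships data to the edge only when recent readings drift from what the deployed model has seen.

```python
# Toy conditional scheduler: transfer only when a drift statistic fires.
import numpy as np

def drift_score(reference, window):
    """Mean shift of the window, normalized by the reference std (crude statistic)."""
    return abs(window.mean() - reference.mean()) / (reference.std() + 1e-9)

rng = np.random.default_rng(4)
reference = rng.normal(0.0, 1.0, size=500)    # data the current model was trained on
THRESHOLD = 0.5

for step in range(5):
    shift = 0.3 * step                         # concept drift grows over time
    window = rng.normal(shift, 1.0, size=100)  # latest sensor readings
    if drift_score(reference, window) > THRESHOLD:
        print(f"step {step}: drift detected -> transfer data / redeploy model")
    else:
        print(f"step {step}: no transfer (bandwidth saved)")
```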
- Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL remains unexplored.
In this work, we take the first step to consider the online data selection for FL with limited on-device storage.
arXiv Detail & Related papers (2022-09-01T03:27:33Z)
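
As a baseline for selection under a fixed storage budget (a standard technique, not the paper's selection rule), reservoir sampling keeps a uniform random subset of an unbounded stream in constant memory:

```python
# Reservoir sampling: uniform subset of a stream with fixed on-device storage.
import random

def reservoir(stream, capacity, seed=0):
    rng = random.Random(seed)
    buffer = []
    for i, item in enumerate(stream):
        if len(buffer) < capacity:
            buffer.append(item)          # fill the buffer first
        else:
            j = rng.randint(0, i)        # keep item with probability capacity/(i+1)
            if j < capacity:
                buffer[j] = item
    return buffer

print(reservoir(range(10_000), capacity=8))
```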
- Fed-FSNet: Mitigating Non-I.I.D. Federated Learning via Fuzzy Synthesizing Network [19.23943687834319]
Federated learning (FL) has emerged as a promising privacy-preserving distributed machine learning framework.
We propose a novel FL training framework, dubbed Fed-FSNet, using a properly designed Fuzzy Synthesizing Network (FSNet) to mitigate the non-i.i.d. issue at the source.
arXiv Detail & Related papers (2022-08-21T18:40:51Z)
- Over-the-Air Federated Learning from Heterogeneous Data [107.05618009955094]
Federated learning (FL) is a framework for distributed learning of centralized models.
We develop a Convergent OTA FL (COTAF) algorithm which enhances the common local stochastic gradient descent (SGD) FL algorithm.
We numerically show that the precoding induced by COTAF notably improves the convergence rate and the accuracy of models trained via OTA FL.
arXiv Detail & Related papers (2020-09-27T08:28:25Z)
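
The over-the-air setting is easy to simulate (the scalar precoding below is a simplified stand-in for COTAF's time-varying factors): clients scale their updates to a power budget, the channel sums the transmissions with noise, and the server undoes the scaling.

```python
# Toy over-the-air aggregation with power-constrained precoding.
import numpy as np

rng = np.random.default_rng(5)
n_clients, dim, power, noise_std = 10, 100, 1.0, 0.1
updates = [rng.normal(size=dim) for _ in range(n_clients)]

# Precoding: a common scale so every transmitted signal meets the power budget.
max_norm2 = max(np.sum(u ** 2) for u in updates)
alpha = np.sqrt(power * dim / max_norm2)

# The multiple-access channel sums the precoded signals and adds noise.
received = sum(alpha * u for u in updates) + rng.normal(0, noise_std, size=dim)

est_avg = received / (alpha * n_clients)   # server rescales the superposition
true_avg = np.mean(updates, axis=0)
print("aggregation error:", np.linalg.norm(est_avg - true_avg))
```

Because the effective noise is divided by alpha * n_clients, the channel distortion shrinks as more devices transmit, which is the intuition behind precoding for OTA FL.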
- WAFFLe: Weight Anonymized Factorization for Federated Learning [88.44939168851721]
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices.
We propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks.
arXiv Detail & Related papers (2020-08-13T04:26:31Z)
- Federated Learning With Quantized Global Model Updates [84.55126371346452]
We study federated learning, which enables mobile devices to utilize their local datasets to train a global model.
We introduce a lossy FL (LFL) algorithm, in which both the global model and the local model updates are quantized before being transmitted.
arXiv Detail & Related papers (2020-06-18T16:55:20Z)
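
A small sketch of the lossy exchange (the paper's quantizer may differ; stochastic uniform quantization is used here because it is unbiased in expectation):

```python
# Stochastic uniform quantization of a model update to a few bits.
import numpy as np

def quantize(v, bits=4, seed=6):
    rng = np.random.default_rng(seed)
    levels = 2 ** bits - 1
    lo, hi = v.min(), v.max()
    scaled = (v - lo) / (hi - lo) * levels               # map into [0, levels]
    floor = np.floor(scaled)
    q = floor + (rng.random(v.shape) < scaled - floor)   # stochastic rounding
    return q.astype(np.uint8), lo, hi, levels

def dequantize(q, lo, hi, levels):
    return lo + q.astype(float) / levels * (hi - lo)

update = np.random.default_rng(7).normal(size=1000)
q, lo, hi, levels = quantize(update, bits=4)
print("4-bit quantization MSE:",
      np.mean((update - dequantize(q, lo, hi, levels)) ** 2))
```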
- Federated Learning with Cooperating Devices: A Consensus Approach for Massive IoT Networks [8.456633924613456]
Federated learning (FL) is emerging as a new paradigm to train machine learning models in distributed systems.
The paper proposes a fully distributed (or server-less) learning approach: the proposed FL algorithms leverage the cooperation of devices that perform data operations inside the network.
The approach lays the groundwork for integration of FL within 5G and beyond networks characterized by decentralized connectivity and computing.
arXiv Detail & Related papers (2019-12-27T15:16:04Z)
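
A generic consensus-averaging step conveys the server-less idea (the ring topology and mixing weights below are illustrative, not the paper's algorithm): each device repeatedly mixes its model with its neighbors' and all devices converge to the network-wide average.

```python
# Decentralized (server-less) averaging by repeated neighbor mixing.
import numpy as np

n = 5                                    # ring of 5 devices
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3  # doubly stochastic

models = np.random.default_rng(8).normal(size=(n, 4))  # one model row per device
target = models.mean(axis=0)                           # what a server would compute

for _ in range(50):
    models = W @ models                  # one round of neighbor-to-neighbor exchange

print("max deviation from global average:", np.abs(models - target).max())
```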
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.