UFed-GAN: A Secure Federated Learning Framework with Constrained
Computation and Unlabeled Data
- URL: http://arxiv.org/abs/2308.05870v1
- Date: Thu, 10 Aug 2023 22:52:13 GMT
- Title: UFed-GAN: A Secure Federated Learning Framework with Constrained
Computation and Unlabeled Data
- Authors: Achintha Wijesinghe, Songyang Zhang, Siyu Qi, Zhi Ding
- Abstract summary: We propose a novel framework of UFed-GAN: Unsupervised Federated Generative Adversarial Network, which can capture user-side data distribution without local classification training.
Our experimental results demonstrate the strong potential of UFed-GAN in addressing limited computational resources and unlabeled data while preserving privacy.
- Score: 50.13595312140533
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: To satisfy the broad range of applications and the growing demand
for deploying low-latency multimedia data classification with data privacy in a
cloud-based setting, federated learning (FL) has emerged as an important
learning paradigm. For the practical cases involving limited computational
power and only unlabeled data in many wireless communication applications, this
work investigates the FL paradigm in a resource-constrained and label-missing
environment. Specifically, we propose a novel framework of UFed-GAN:
Unsupervised Federated Generative Adversarial Network, which can capture
user-side data distribution without local classification training. We also
analyze the convergence and privacy of the proposed UFed-GAN. Our experimental
results demonstrate the strong potential of UFed-GAN in addressing limited
computational resources and unlabeled data while preserving privacy.
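As a rough, hedged illustration of the setting described in the abstract (not the exact UFed-GAN protocol, whose details are in the paper), the sketch below assumes each client fits a small GAN to its unlabeled local data and uploads only the generator parameters, which the server averages FedAvg-style; the network sizes, optimizer settings, and averaging rule are illustrative assumptions.

# Hedged sketch of an unsupervised federated GAN round (illustration only,
# not the exact UFed-GAN algorithm): clients train GANs on unlabeled data
# and upload only generator weights; the server averages them.
import copy
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 16, 32  # toy dimensions (illustrative assumption)


def make_generator():
    return nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, DATA_DIM))


def make_discriminator():
    return nn.Sequential(nn.Linear(DATA_DIM, 64), nn.ReLU(), nn.Linear(64, 1))


def local_gan_update(gen, disc, data, steps=5, lr=1e-3):
    """Unsupervised GAN updates on one client's unlabeled batch."""
    bce = nn.BCEWithLogitsLoss()
    g_opt = torch.optim.Adam(gen.parameters(), lr=lr)
    d_opt = torch.optim.Adam(disc.parameters(), lr=lr)
    ones = torch.ones(data.size(0), 1)
    zeros = torch.zeros(data.size(0), 1)
    for _ in range(steps):
        z = torch.randn(data.size(0), LATENT_DIM)
        fake = gen(z)
        # Discriminator step: separate real samples from generated ones.
        d_loss = bce(disc(data), ones) + bce(disc(fake.detach()), zeros)
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()
        # Generator step: make generated samples look real.
        g_loss = bce(disc(gen(z)), ones)
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()
    return gen.state_dict()  # only generator parameters leave the client


def federated_round(global_gen, client_datasets):
    """Server-side round: broadcast generator, collect and average client generators."""
    client_states = []
    for data in client_datasets:
        gen = copy.deepcopy(global_gen)   # client starts from the global generator
        disc = make_discriminator()       # discriminator stays local and private
        client_states.append(local_gan_update(gen, disc, data))
    averaged = {k: torch.stack([s[k] for s in client_states]).mean(dim=0)
                for k in client_states[0]}
    global_gen.load_state_dict(averaged)
    return global_gen


if __name__ == "__main__":
    unlabeled_clients = [torch.randn(64, DATA_DIM) for _ in range(3)]  # stand-in data
    updated = federated_round(make_generator(), unlabeled_clients)
    print("generator parameters after one round:", sum(p.numel() for p in updated.parameters()))

No labels or classifiers are used anywhere in the sketch, which mirrors the abstract's claim of capturing the user-side data distribution without local classification training.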
Related papers
- Privacy-preserving design of graph neural networks with applications to
vertical federated learning [56.74455367682945]
We present an end-to-end graph representation learning framework called VESPER.
VESPER is capable of training high-performance GNN models over both sparse and dense graphs under reasonable privacy budgets.
arXiv Detail & Related papers (2023-10-31T15:34:59Z) - Privacy-preserving Federated Primal-dual Learning for Non-convex and Non-smooth Problems with Model Sparsification [51.04894019092156]
Federated learning (FL) has been recognized as a rapidly growing research area, where the model is trained over distributed clients under the orchestration of a parameter server (PS).
In this paper, we propose a novel federated primal-dual algorithm with model sparsification for non-convex and non-smooth FL problems.
Its unique properties and the corresponding analyses are also presented.
arXiv Detail & Related papers (2023-10-30T14:15:47Z) - Federated Learning Under Restricted User Availability [3.0846824529023387]
Non-uniform availability or participation of users is unavoidable due to an adverse or stochastic environment.
We propose a new formulation of the FL problem which effectively captures and mitigates the limited participation of data originating from infrequent or restricted users.
Our experiments on synthetic and benchmark datasets show that the proposed approach significantly improves performance compared with standard FL.
arXiv Detail & Related papers (2023-09-25T14:40:27Z) - Analysis and Optimization of Wireless Federated Learning with Data
Heterogeneity [72.85248553787538]
This paper focuses on performance analysis and optimization for wireless FL, considering data heterogeneity, combined with wireless resource allocation.
We formulate the loss function minimization problem under constraints on long-term energy consumption and latency, and jointly optimize client scheduling, resource allocation, and the number of local training epochs (CRE).
Experiments on real-world datasets demonstrate that the proposed algorithm outperforms other benchmarks in terms of learning accuracy and energy consumption.
arXiv Detail & Related papers (2023-08-04T04:18:01Z) - PS-FedGAN: An Efficient Federated Learning Framework Based on Partially
Shared Generative Adversarial Networks For Data Privacy [56.347786940414935]
Federated Learning (FL) has emerged as an effective learning paradigm for distributed computation.
This work proposes a novel FL framework that requires only partial GAN model sharing.
Named PS-FedGAN, this new framework enhances the GAN releasing and training mechanism to address heterogeneous data distributions.
arXiv Detail & Related papers (2023-05-19T05:39:40Z) - Benchmarking FedAvg and FedCurv for Image Classification Tasks [1.376408511310322]
This paper focuses on the problem of statistical heterogeneity of the data in the same federated network.
Several Federated Learning algorithms, such as FedAvg, FedProx and Federated Curvature (FedCurv), have already been proposed.
As a side product of this work, we release the non-IID versions of the datasets we used in order to facilitate further comparisons within the FL community.
arXiv Detail & Related papers (2023-03-31T10:13:01Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated
Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Weight Divergence Driven Divide-and-Conquer Approach for Optimal
Federated Learning from non-IID Data [0.0]
Federated Learning allows training on data stored in distributed devices without the need to centralize the training data.
We propose a novel Divide-and-Conquer training methodology that enables the use of the popular FedAvg aggregation algorithm.
arXiv Detail & Related papers (2021-06-28T09:34:20Z) - Communication-Computation Efficient Secure Aggregation for Federated
Learning [23.924656276456503]
Federated learning is a way to train neural networks using data distributed over multiple nodes without the need for the nodes to share data.
A recent solution based on the secure aggregation primitive enabled privacy-preserving federated learning, but at the expense of significant extra communication/computational resources.
We propose communication-computation efficient secure aggregation which substantially reduces the amount of communication/computational resources.
arXiv Detail & Related papers (2020-12-10T03:17:50Z)
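Several entries above rely on two recurring primitives: FedAvg-style averaging of client updates and masking-based secure aggregation. The sketch below is a generic illustration under assumed pairwise additive masks (not the specific scheme proposed in any paper listed): each client perturbs its update with masks shared pairwise with the other clients, and the masks cancel when the server sums the uploads, so only the averaged update is revealed.

# Generic sketch of masking-based secure aggregation feeding a FedAvg average
# (illustrative assumptions only; not the scheme of any specific paper above).
import numpy as np

rng = np.random.default_rng(0)


def pairwise_masks(num_clients, dim, seed=42):
    """Each unordered pair (i, j) shares a pseudorandom mask; i adds it, j subtracts it."""
    pair_rng = np.random.default_rng(seed)
    masks = {}
    for i in range(num_clients):
        for j in range(i + 1, num_clients):
            masks[(i, j)] = pair_rng.normal(size=dim)
    return masks


def mask_update(client_id, update, masks, num_clients):
    """Client hides its model update behind pairwise masks before uploading."""
    masked = update.copy()
    for j in range(num_clients):
        if j == client_id:
            continue
        m = masks[(min(client_id, j), max(client_id, j))]
        masked += m if client_id < j else -m   # opposite signs cancel in the server sum
    return masked


def secure_fedavg(client_updates, masks):
    """Server sums masked uploads; the masks cancel, leaving only the average update."""
    n = len(client_updates)
    masked = [mask_update(i, u, masks, n) for i, u in enumerate(client_updates)]
    return sum(masked) / n


if __name__ == "__main__":
    dim, n = 8, 4
    updates = [rng.normal(size=dim) for _ in range(n)]       # stand-in local updates
    masks = pairwise_masks(n, dim)
    recovered = secure_fedavg(updates, masks)
    assert np.allclose(recovered, np.mean(updates, axis=0))  # masks cancel exactly
    print(recovered)

In deployed protocols the pairwise masks are derived from key agreement and protected against client dropouts; the fixed seed and shared mask dictionary here are purely for keeping the toy example self-contained.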