FedSKETCH: Communication-Efficient and Private Federated Learning via
Sketching
- URL: http://arxiv.org/abs/2008.04975v1
- Date: Tue, 11 Aug 2020 19:22:48 GMT
- Title: FedSKETCH: Communication-Efficient and Private Federated Learning via
Sketching
- Authors: Farzin Haddadpour, Belhal Karimi, Ping Li, Xiaoyun Li
- Abstract summary: Communication complexity and privacy are the two key challenges in Federated Learning.
We introduce the FedSKETCH and FedSKETCHGATE algorithms to address both challenges in Federated Learning jointly.
- Score: 33.54413645276686
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Communication complexity and privacy are the two key challenges in
Federated Learning, where the goal is to perform distributed learning across a
large number of devices. In this work, we introduce the FedSKETCH and
FedSKETCHGATE algorithms to address both challenges jointly; the two
algorithms are intended for the homogeneous and heterogeneous data
distribution settings, respectively. The key idea is to compress the
accumulation of local gradients using a count sketch, so the server never has
access to the gradients themselves, which provides privacy. Furthermore,
because the sketch is much lower-dimensional, our method is
communication-efficient as well. We provide sharp convergence guarantees for
the aforementioned schemes.
Finally, we back up our theory with a varied set of experiments.
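To make the compression step concrete, here is a minimal numpy sketch of a count-sketch compress/decompress pair. The hash construction and the (rows, width) parameters are illustrative assumptions, not FedSKETCH's exact configuration; the point is that only the small sketch leaves the client.

```python
import numpy as np

def make_hashes(dim, rows, width, seed=0):
    rng = np.random.default_rng(seed)
    buckets = rng.integers(0, width, size=(rows, dim))  # bucket index h_r(i)
    signs = rng.choice([-1.0, 1.0], size=(rows, dim))   # random sign s_r(i)
    return buckets, signs

def sketch(vec, buckets, signs, width):
    rows = buckets.shape[0]
    S = np.zeros((rows, width))
    for r in range(rows):
        np.add.at(S[r], buckets[r], signs[r] * vec)     # S[r, h_r(i)] += s_r(i) * v_i
    return S

def unsketch(S, buckets, signs):
    rows = buckets.shape[0]
    est = np.stack([signs[r] * S[r, buckets[r]] for r in range(rows)])
    return np.median(est, axis=0)  # median across rows suppresses collision noise

# Client side: sketch the accumulated local gradient (10000 -> 5*500 floats).
dim, rows, width = 10_000, 5, 500
buckets, signs = make_hashes(dim, rows, width)
grad = np.random.default_rng(1).normal(size=dim)
S = sketch(grad, buckets, signs, width)
# Server side: recover an approximate gradient from the sketch alone.
approx = unsketch(S, buckets, signs)
```

Collisions within a bucket perturb individual coordinates, and taking the median across independent rows keeps the estimate robust; this is what makes the lower-dimensional representation usable for gradient updates.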
Related papers
- Accelerated Stochastic ExtraGradient: Mixing Hessian and Gradient Similarity to Reduce Communication in Distributed and Federated Learning [50.382793324572845]
Distributed computing involves communication between devices, which requires solving two key problems: efficiency and privacy.
In this paper, we analyze a new method that incorporates the ideas of data similarity and client sampling.
To address privacy concerns, we apply the technique of additional noise and analyze its impact on the convergence of the proposed method.
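For reference, the classical (deterministic) extragradient template that the paper's accelerated, similarity-based variant builds on is a hedged sketch like the following; the actual method adds acceleration, stochasticity, client sampling, and noise on top of this two-step update.

```python
import numpy as np

def extragradient_step(x, grad_fn, gamma=0.1):
    x_half = x - gamma * grad_fn(x)      # extrapolation (trial) step
    return x - gamma * grad_fn(x_half)   # update with the look-ahead gradient

grad_fn = lambda x: 2 * x                # gradient of f(x) = ||x||^2
x = np.ones(3)
for _ in range(50):
    x = extragradient_step(x, grad_fn)
print(x)                                 # approaches the minimizer at the origin
```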
arXiv Detail & Related papers (2024-09-22T00:49:10Z) - An Empirical Study of Efficiency and Privacy of Federated Learning
Algorithms [2.994794762377111]
In today's world, the rapid expansion of IoT networks and the proliferation of smart devices have resulted in the generation of substantial amounts of heterogeneous data.
To handle this data effectively, advanced data processing technologies are necessary to guarantee the preservation of both privacy and efficiency.
Federated learning emerged as a distributed learning method that trains models locally and aggregates them on a server to preserve data privacy.
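The train-locally/aggregate-on-a-server pattern described above is easiest to see in a FedAvg-style toy example. Weighted averaging by sample count is the standard choice; this is a generic sketch, not the specific algorithms benchmarked in the study.

```python
import numpy as np

def local_train(weights, data, lr=0.1, steps=10):
    w, (X, y) = weights.copy(), data
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y) / len(y)   # least-squares gradient step
    return w

def server_aggregate(models, sizes):
    total = sum(sizes)
    return sum(n / total * w for w, n in zip(models, sizes))

rng = np.random.default_rng(0)
global_w = np.zeros(3)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
for _ in range(5):  # communication rounds; raw data never leaves a client
    models = [local_train(global_w, d) for d in clients]
    global_w = server_aggregate(models, [len(y) for _, y in clients])
```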
arXiv Detail & Related papers (2023-12-24T00:13:41Z) - Federated Compositional Deep AUC Maximization [58.25078060952361]
We develop a novel federated learning method for imbalanced data by directly optimizing the area under the curve (AUC) score.
To the best of our knowledge, this is the first work to achieve such favorable theoretical results.
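As a rough illustration of what directly optimizing AUC means, a simple pairwise squared-hinge surrogate penalizes every positive/negative pair whose scores are not separated by a margin. This is background only; the paper optimizes a more involved compositional min-max reformulation of the AUC objective, not this naive loss.

```python
import numpy as np

def auc_surrogate_loss(scores, labels, margin=1.0):
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diffs = pos[:, None] - neg[None, :]          # all positive-negative pairs
    return np.mean(np.maximum(0.0, margin - diffs) ** 2)

scores = np.array([2.0, 0.3, 1.5, -0.2])
labels = np.array([1, 0, 1, 0])
print(auc_surrogate_loss(scores, labels))        # 0 once every pair is separated
```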
arXiv Detail & Related papers (2023-04-20T05:49:41Z) - SoteriaFL: A Unified Framework for Private Federated Learning with
Communication Compression [40.646108010388986]
We propose a unified framework that enhances the communication efficiency of private federated learning with communication compression.
We provide a comprehensive characterization of its performance trade-offs in terms of privacy, utility, and communication complexity.
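A generic way such frameworks combine privacy with compression is a clip-perturb-compress pipeline applied to each update before upload. The sketch below uses hypothetical parameter choices and top-k sparsification; SoteriaFL's actual shifted compression and noise calibration differ, this only shows the pattern.

```python
import numpy as np

def private_compressed_update(grad, clip=1.0, sigma=0.5, k=10, seed=0):
    rng = np.random.default_rng(seed)
    g = grad * min(1.0, clip / (np.linalg.norm(grad) + 1e-12))  # clip L2 norm
    g = g + rng.normal(scale=sigma * clip, size=g.shape)        # privacy noise
    idx = np.argsort(np.abs(g))[-k:]                            # top-k sparsify
    return idx, g[idx]            # only k (index, value) pairs are communicated

grad = np.random.default_rng(1).normal(size=1000)
idx, vals = private_compressed_update(grad)      # 1000 floats -> 10 pairs
```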
arXiv Detail & Related papers (2022-06-20T16:47:58Z) - FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for
Federated Learning on Non-IID Data [69.0785021613868]
Federated learning is a distributed machine learning approach in which a shared server model learns by aggregating parameter updates computed locally on the training data of spatially distributed client silos.
We propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistencies.
This is relevant to various fields such as medical healthcare, computer vision, and the Internet of Things (IoT).
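As a toy illustration of geometric-mean aggregation in the invariant-learning-consistency spirit: keep a coordinate only when all silos agree on its sign, averaging magnitudes geometrically. This is an assumption-laden sketch; FedILC's full method also uses gradient covariance and Hessian information, which this omits.

```python
import numpy as np

def geometric_mean_agg(grads):
    G = np.stack(grads)                          # (num_silos, dim)
    signs = np.sign(G)
    agree = np.all(signs == signs[0], axis=0)    # per-coordinate sign consensus
    geo = np.exp(np.mean(np.log(np.abs(G) + 1e-12), axis=0))
    return np.where(agree, signs[0] * geo, 0.0)

grads = [np.array([0.5, -0.2, 0.1]), np.array([0.8, 0.3, 0.2])]
print(geometric_mean_agg(grads))                 # disagreeing coordinate -> 0
```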
arXiv Detail & Related papers (2022-05-19T03:32:03Z) - Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated
Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Differentially Private Federated Learning on Heterogeneous Data [10.431137628048356]
Federated Learning (FL) is a paradigm for large-scale distributed learning.
It faces two key challenges: (i) efficient training from highly heterogeneous user data, and (ii) protecting the privacy of participating users.
We propose a novel FL approach to tackle these two challenges together by incorporating Differential Privacy (DP) constraints.
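A minimal sketch of one DP-FL round under the standard Gaussian mechanism: each client clips its update to norm C and adds noise with sigma = C * sqrt(2 ln(1.25/delta)) / eps for a single release. The paper's exact mechanism and its privacy accounting across rounds may differ.

```python
import numpy as np

def dp_update(update, C=1.0, eps=1.0, delta=1e-5, rng=None):
    rng = rng or np.random.default_rng()
    clipped = update * min(1.0, C / (np.linalg.norm(update) + 1e-12))
    sigma = C * np.sqrt(2 * np.log(1.25 / delta)) / eps
    return clipped + rng.normal(scale=sigma, size=update.shape)

rng = np.random.default_rng(0)
client_updates = [rng.normal(size=50) for _ in range(8)]  # heterogeneous clients
noisy = [dp_update(u, rng=rng) for u in client_updates]
server_avg = np.mean(noisy, axis=0)   # the server never sees a raw update
```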
arXiv Detail & Related papers (2021-11-17T18:23:49Z) - Communication-Computation Efficient Secure Aggregation for Federated
Learning [23.924656276456503]
Federated learning is a way to train neural networks using data distributed over multiple nodes without the need for the nodes to share data.
A recent solution based on the secure aggregation primitive enabled privacy-preserving federated learning, but at the expense of significant extra communication/computational resources.
We propose communication-computation efficient secure aggregation which substantially reduces the amount of communication/computational resources.
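The core trick behind the secure aggregation primitive is pairwise masking: each pair of clients shares a random mask that one adds and the other subtracts, so all masks cancel in the server's sum. The toy version below only demonstrates the cancellation; real protocols derive the shared masks via key agreement and tolerate client dropouts, which is where the extra communication/computation cost comes from.

```python
import numpy as np

def masked_updates(updates, seed=0):
    n = len(updates)
    rng = np.random.default_rng(seed)
    masked = [u.copy() for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.normal(size=updates[0].shape)  # shared secret of pair (i, j)
            masked[i] += mask
            masked[j] -= mask
    return masked

rng = np.random.default_rng(1)
updates = [rng.normal(size=5) for _ in range(4)]
masked = masked_updates(updates)
# Individual masked vectors reveal little, but the server's sum is exact:
assert np.allclose(sum(masked), sum(updates))
```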
arXiv Detail & Related papers (2020-12-10T03:17:50Z) - Second-Order Guarantees in Federated Learning [49.17137296715029]
Federated learning is a useful framework for centralized learning from distributed data.
This paper focuses on algorithms with second-order optimality guarantees in centralized and decentralized settings.
arXiv Detail & Related papers (2020-12-02T19:30:08Z) - A Theoretical Perspective on Differentially Private Federated Multi-task
Learning [12.935153199667987]
Collaborative learning models need to be developed with respect to both privacy and utility concerns.
We propose a new federated multi-task learning method for effective parameter transfer, with differential privacy to provide protection at the client level.
We are the first to provide both privacy and utility guarantees for such a proposed algorithm.
arXiv Detail & Related papers (2020-11-14T00:53:16Z)