Byzantine-Robust Clustered Federated Learning
- URL: http://arxiv.org/abs/2306.00638v1
- Date: Thu, 1 Jun 2023 13:01:13 GMT
- Title: Byzantine-Robust Clustered Federated Learning
- Authors: Zhixu Tao, Kun Yang, Sanjeev R. Kulkarni
- Abstract summary: This paper focuses on the problem of adversarial attacks from Byzantine machines in a Federated Learning setting.
In this setting, non-Byzantine machines in the same cluster have the same underlying data distribution, and different clusters of non-Byzantine machines have different learning tasks.
The goal of our work is to identify cluster membership of non-Byzantine machines and optimize the models learned by each cluster.
- Score: 10.503285978504548
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper focuses on the problem of adversarial attacks from Byzantine
machines in a Federated Learning setting where non-Byzantine machines can be
partitioned into disjoint clusters. In this setting, non-Byzantine machines in
the same cluster have the same underlying data distribution, and different
clusters of non-Byzantine machines have different learning tasks. Byzantine
machines can adversarially attack any cluster and disturb the training process
on clusters they attack. In the presence of Byzantine machines, the goal of our
work is to identify cluster membership of non-Byzantine machines and optimize
the models learned by each cluster. We adopt the Iterative Federated Clustering
Algorithm (IFCA) framework of Ghosh et al. (2020) to alternately estimate
cluster membership and optimize models. In order to make this framework robust
against adversarial attacks from Byzantine machines, we use coordinate-wise
trimmed mean and coordinate-wise median aggregation methods used by Yin et al.
(2018). Specifically, we propose a new Byzantine-Robust Iterative Federated
Clustering Algorithm to improve on the results in Ghosh et al. (2019). We prove
a convergence rate for this algorithm for strongly convex loss functions. We
compare our convergence rate with the convergence rate of an existing
algorithm, and we demonstrate the performance of our algorithm on simulated
data.
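The two robust aggregation rules the abstract refers to (from Yin et al., 2018) can be sketched as follows. This is a minimal illustrative implementation, not code from the paper: the function names and the flat-vector representation of model updates are assumptions for the example.

```python
import numpy as np

def coordinate_trimmed_mean(updates, beta):
    """Coordinate-wise beta-trimmed mean of a list of model updates.

    For each coordinate, the smallest and largest beta-fraction of the
    reported values are discarded, and the remaining values are averaged.
    This limits the influence of Byzantine machines that report extreme
    coordinates.
    """
    stacked = np.stack(updates)        # shape: (n_machines, dim)
    n = stacked.shape[0]
    k = int(beta * n)                  # values trimmed from each tail
    sorted_vals = np.sort(stacked, axis=0)
    if k > 0:
        sorted_vals = sorted_vals[k:n - k]
    return sorted_vals.mean(axis=0)

def coordinate_median(updates):
    """Coordinate-wise median of a list of model updates."""
    return np.median(np.stack(updates), axis=0)
```

With one Byzantine machine reporting a wildly scaled update among honest updates near (1, 1), both rules recover an aggregate close to the honest value, whereas a plain average would be dragged far off.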
Related papers
- Fuzzy K-Means Clustering without Cluster Centroids [79.19713746387337]
Fuzzy K-Means clustering is a critical computation technique in unsupervised data analysis.
This paper proposes a novel Fuzzy K-Means clustering algorithm that entirely eliminates the reliance on cluster centroids.
arXiv Detail & Related papers (2024-04-07T12:25:03Z) - Near-Optimal Resilient Aggregation Rules for Distributed Learning Using 1-Center and 1-Mean Clustering with Outliers [24.88026399458157]
Byzantine machine learning has garnered considerable attention in light of the unpredictable faults that can occur.
The key to securing machines in distributed learning is a resilient aggregation mechanism.
arXiv Detail & Related papers (2023-12-20T08:36:55Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
In order to conduct feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within clusters and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - Robust Federated Learning via Over-The-Air Computation [48.47690125123958]
Simple averaging of model updates via over-the-air computation makes the learning task vulnerable to random or intended modifications of the local model updates of some malicious clients.
We propose a robust transmission and aggregation framework against such attacks while preserving the benefits of over-the-air computation for federated learning.
arXiv Detail & Related papers (2021-11-01T19:21:21Z) - Escaping Saddle Points in Distributed Newton's Method with Communication Efficiency and Byzantine Resilience [49.379254729853756]
We consider the problem of optimizing a non-regularized loss function (with saddle points) in a distributed framework in the presence of Byzantine machines.
We robustify the cubic-regularized Newton algorithm such that it efficiently avoids saddle points and fake local minima.
We obtain theoretical guarantees for our proposed scheme under several approximate settings, including sub-sampled gradients and Hessians.
arXiv Detail & Related papers (2021-03-17T03:53:58Z) - Byzantine-Resilient Non-Convex Stochastic Gradient Descent [61.6382287971982]
We study adversary-resilient distributed optimization, in which machines can independently compute gradients and cooperate.
Our algorithm is based on a new concentration technique, and we analyze its sample complexity.
It is very practical: it improves upon the performance of all prior methods when no Byzantine machines are present.
arXiv Detail & Related papers (2020-12-28T17:19:32Z) - Dynamic Clustering in Federated Learning [15.37652170495055]
We propose a three-phased data clustering algorithm, namely: generative adversarial network-based clustering, cluster calibration, and cluster division.
Our algorithm improves the performance of forecasting models, such as cellular network handover prediction, by 43%.
arXiv Detail & Related papers (2020-12-07T15:30:07Z) - Joint Optimization of an Autoencoder for Clustering and Embedding [22.16059261437617]
We present an alternative where the autoencoder and the clustering are learned simultaneously.
That simple neural network, referred to as the clustering module, can be integrated into a deep autoencoder resulting in a deep clustering model.
arXiv Detail & Related papers (2020-12-07T14:38:10Z) - A black-box adversarial attack for poisoning clustering [78.19784577498031]
We propose a black-box adversarial attack for crafting adversarial samples to test the robustness of clustering algorithms.
We show that our attacks are transferable even against supervised algorithms such as SVMs, random forests, and neural networks.
arXiv Detail & Related papers (2020-09-09T18:19:31Z) - An Efficient Framework for Clustered Federated Learning [26.24231986590374]
We address the problem of federated learning (FL) where users are distributed into clusters.
We propose the Iterative Federated Clustering Algorithm (IFCA)
We show that our algorithm is efficient in non-convex problems such as neural networks.
arXiv Detail & Related papers (2020-06-07T08:48:59Z)
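The alternating cluster-estimation step at the heart of IFCA (which the main paper above robustifies) can be sketched as follows: each machine evaluates every cluster's current model on its local data and joins the cluster with the lowest empirical loss. This is an illustrative sketch assuming a linear model with squared loss; all names are hypothetical.

```python
import numpy as np

def sq_loss(theta, X, y):
    # Mean squared error of a linear model on one machine's local data.
    return float(np.mean((X @ theta - y) ** 2))

def assign_clusters(X_list, y_list, models, loss_fn):
    """IFCA-style cluster estimation.

    Each machine i (with local data X_list[i], y_list[i]) is assigned to
    the cluster whose current model attains the smallest local loss.
    """
    assignments = []
    for X, y in zip(X_list, y_list):
        losses = [loss_fn(theta, X, y) for theta in models]
        assignments.append(int(np.argmin(losses)))
    return assignments
```

In a full round, these assignments would then drive per-cluster aggregation of the machines' updates; in the Byzantine-robust variant, that aggregation uses a trimmed mean or median rather than a plain average.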
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.