Towards Federated Clustering: A Federated Fuzzy $c$-Means Algorithm
(FFCM)
- URL: http://arxiv.org/abs/2201.07316v1
- Date: Tue, 18 Jan 2022 21:22:28 GMT
- Title: Towards Federated Clustering: A Federated Fuzzy $c$-Means Algorithm
(FFCM)
- Authors: Morris Stallmann and Anna Wilbik
- Abstract summary: Federated Learning (FL) is a setting where multiple parties with distributed data collaborate in training a joint Machine Learning (ML) model.
We describe how this area of research can be of interest in itself, or how it helps address issues such as non-independent and identically distributed (non-i.i.d.) data.
We propose two methods to calculate global cluster centers and evaluate their behaviour through challenging numerical experiments.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) is a setting where multiple parties with distributed
data collaborate in training a joint Machine Learning (ML) model while keeping
all data local at the parties. Federated clustering is an area of research
within FL that is concerned with grouping together data that is globally
similar while keeping all data local. We describe how this area of research can
be of interest in itself, or how it helps address issues such as
non-independent and identically distributed (non-i.i.d.) data in supervised FL
frameworks. The focus of this work, however, is an extension of the
fuzzy $c$-means algorithm to the FL setting (FFCM) as a contribution towards
federated clustering. We propose two methods to calculate global cluster
centers and evaluate their behaviour through challenging numerical experiments.
We observe that one of the methods is able to identify good global clusters
even in challenging scenarios, but also acknowledge that many challenges remain
open.
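The abstract mentions two methods for calculating global cluster centers but does not describe them here. Purely as a minimal sketch, assuming a sufficient-statistics style of aggregation (function names and the fuzzifier value are illustrative, not the authors' implementation), one federated fuzzy $c$-means round could look like this: each client reports membership-weighted sums and membership masses with respect to the current global centers, and the server combines them.

```python
import numpy as np

def local_fcm_stats(X, centers, m=2.0, eps=1e-9):
    """One client: fuzzy memberships w.r.t. the current global centers,
    reduced to sufficient statistics so no raw data leaves the client."""
    # Squared distances between local points and global centers, shape (n, c).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + eps
    # Standard fuzzy c-means membership update with fuzzifier m.
    u = d2 ** (-1.0 / (m - 1.0))
    u /= u.sum(axis=1, keepdims=True)
    um = u ** m
    weighted_sum = um.T @ X          # shape (c, d): sum_i u_ik^m * x_i
    weight = um.sum(axis=0)          # shape (c,):   sum_i u_ik^m
    return weighted_sum, weight

def aggregate_centers(client_stats):
    """Server: combine all clients' statistics into new global centers."""
    total_sum = sum(s for s, _ in client_stats)
    total_weight = sum(w for _, w in client_stats)
    return total_sum / total_weight[:, None]
```

Iterating the client step and the server step until the centers stop moving yields a federated analogue of the classic fuzzy $c$-means loop, with only aggregate statistics leaving each party.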
Related papers
- A Framework for testing Federated Learning algorithms using an edge-like environment [0.0]
Federated Learning (FL) is a machine learning paradigm in which many clients cooperatively train a single centralized model while keeping their data private and decentralized.
It is non-trivial to accurately evaluate the contributions of local models in global centralized model aggregation.
This is an example of a major challenge in FL, commonly known as data imbalance or class imbalance.
In this work, a framework is proposed and implemented to assess FL algorithms in an easier and more scalable way.
arXiv Detail & Related papers (2024-07-17T19:52:53Z)
- Federated Learning with Bilateral Curation for Partially Class-Disjoint Data [47.55180390473258]
Partially class-disjoint data (PCDD), a common yet under-explored data formation, severely challenges the performance of federated algorithms.
We propose a novel approach called FedGELA, where the classifier is globally fixed as a simplex equiangular tight frame (ETF) while being locally adapted to the personal distributions.
We conduct extensive experiments on a range of datasets to demonstrate that our FedGELA achieves promising performance.
arXiv Detail & Related papers (2024-05-29T10:34:44Z)
- Dynamically Weighted Federated k-Means [0.0]
Federated clustering enables multiple data sources to collaboratively cluster their data, maintaining decentralization and preserving privacy.
We introduce a novel federated clustering algorithm named Dynamically Weighted Federated k-means (DWF k-means) based on Lloyd's method for k-means clustering.
We conduct experiments on multiple datasets and data distribution settings to evaluate the performance of our algorithm in terms of clustering score, accuracy, and v-measure.
arXiv Detail & Related papers (2023-10-23T12:28:21Z)
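As a rough, hedged illustration of weighted aggregation in federated k-means (the precise dynamic weighting used by DWF k-means is not described in the summary above), the sketch below has each client report per-cluster sums and counts for the current global centers, and the server takes a count-weighted average. Names such as `local_kmeans_step` are hypothetical.

```python
import numpy as np

def local_kmeans_step(X, centers):
    """One client: assign points to the current global centers and
    return per-cluster sums and counts (no raw data is shared)."""
    labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
    k, d = centers.shape
    sums = np.zeros((k, d))
    counts = np.zeros(k)
    np.add.at(sums, labels, X)
    np.add.at(counts, labels, 1)
    return sums, counts

def weighted_aggregate(client_stats, old_centers):
    """Server: count-weighted average; empty clusters keep their old center."""
    sums = sum(s for s, _ in client_stats)
    counts = sum(c for _, c in client_stats)
    new_centers = old_centers.copy()
    nonempty = counts > 0
    new_centers[nonempty] = sums[nonempty] / counts[nonempty, None]
    return new_centers
```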
- Find Your Optimal Assignments On-the-fly: A Holistic Framework for Clustered Federated Learning [5.045379017315639]
Federated Learning (FL) is an emerging distributed machine learning approach that preserves client privacy by storing data on edge devices.
Recent studies have proposed clustering as a solution to tackle client heterogeneity in FL by grouping clients with distribution shifts into different clusters.
This paper presents a comprehensive investigation into current clustered FL methods and proposes a four-tier framework to encompass and extend existing approaches.
arXiv Detail & Related papers (2023-10-09T04:23:11Z)
- Federated K-means Clustering [0.0]
Federated learning is a technique that enables the use of distributed datasets for machine learning purposes without requiring data to be pooled.
This work introduces an algorithm which implements K-means clustering in a federated manner.
arXiv Detail & Related papers (2023-10-02T13:32:00Z)
- Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces [59.33965805898736]
Clustered federated learning has been shown to produce promising results by grouping clients into clusters.
Existing clustered FL algorithms essentially try to group together clients with similar data distributions.
Prior FL algorithms estimate these similarities indirectly during training.
arXiv Detail & Related papers (2022-09-21T17:37:54Z)
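The principal-angle idea in the entry above can be made concrete with a few lines of linear algebra: the cosines of the principal angles between two clients' data subspaces are the singular values of the product of their orthonormal bases. The snippet below is a generic illustration of that computation, not the paper's clustering procedure.

```python
import numpy as np

def data_subspace(X, p):
    """Orthonormal basis of the top-p right singular subspace of a
    client's centered data matrix X of shape (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt[:p].T                      # shape (n_features, p)

def principal_angles(U, V):
    """Principal angles (in radians) between the subspaces spanned by U and V."""
    cosines = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(cosines, -1.0, 1.0))

# Smaller angles indicate more similar client data distributions, so clients
# could be grouped by clustering or thresholding these angles.
```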
- FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for Federated Learning on Non-IID Data [69.0785021613868]
Federated learning is a distributed machine learning approach in which a shared server model learns by aggregating parameter updates computed locally on the training data of spatially-distributed client silos.
We propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistencies.
This is relevant to various fields such as healthcare, computer vision, and the Internet of Things (IoT).
arXiv Detail & Related papers (2022-05-19T03:32:03Z)
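Purely as background for the FedILC summary above, which mentions a weighted geometric mean: the sketch below computes an element-wise weighted geometric mean over per-silo arrays (for example, gradient magnitudes). It is a generic building block under that interpretation, not the FedILC update itself.

```python
import numpy as np

def weighted_geometric_mean(values, weights, eps=1e-12):
    """Element-wise weighted geometric mean of positive arrays.

    values:  list of arrays with identical shape (e.g. per-silo |gradients|)
    weights: non-negative weights, one per array (e.g. silo sizes)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    # Work in log space: exp(sum_i w_i * log v_i).
    logs = np.stack([np.log(np.maximum(v, eps)) for v in values])
    return np.exp(np.tensordot(w, logs, axes=1))
```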
- Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose FedGSP, an FL approach designed to address the challenge of data heterogeneity.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
- Multi-Center Federated Learning [62.32725938999433]
Federated learning (FL) can protect data privacy in distributed learning.
It merely collects local gradients from users without access to their data.
We propose a novel multi-center aggregation mechanism.
arXiv Detail & Related papers (2021-08-19T12:20:31Z)
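To make the multi-center aggregation idea above concrete: instead of averaging all client models into a single global model, the server can maintain several centers, assign each client's model to its nearest center, and average within each group. The sketch below is a hypothetical illustration of such a mechanism, not the paper's exact formulation.

```python
import numpy as np

def multi_center_aggregate(client_weights, centers, rounds=5):
    """Assign each client's flattened model to the nearest center and
    average within each group (a k-means step over model parameters).

    client_weights: array (n_clients, n_params)
    centers:        array (n_centers, n_params), e.g. initialised from
                    randomly chosen client models
    """
    W = np.asarray(client_weights, dtype=float)
    C = np.asarray(centers, dtype=float).copy()
    for _ in range(rounds):
        d2 = ((W[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        assign = d2.argmin(axis=1)
        for k in range(C.shape[0]):
            members = W[assign == k]
            if len(members):              # keep the old center if a group is empty
                C[k] = members.mean(axis=0)
    return C, assign
```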
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.