A Federated Generalized Expectation-Maximization Algorithm for Mixture Models with an Unknown Number of Components
- URL: http://arxiv.org/abs/2601.21160v1
- Date: Thu, 29 Jan 2026 01:51:14 GMT
- Title: A Federated Generalized Expectation-Maximization Algorithm for Mixture Models with an Unknown Number of Components
- Authors: Michael Ibrahim, Nagi Gebraeel, Weijun Xie
- Abstract summary: FedGEM is a generalized expectation-maximization algorithm for the training of mixture models with an unknown number of components. Our proposed algorithm relies on each of the clients performing EM steps locally, and constructing an uncertainty set around the maximizer associated with each local component. The central server utilizes the uncertainty sets to learn potential cluster overlaps between clients, and infer the global number of clusters.
- Score: 4.369474715585884
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the problem of federated clustering when the total number of clusters $K$ across clients is unknown, and the clients have heterogeneous but potentially overlapping cluster sets in their local data. To that end, we develop FedGEM: a federated generalized expectation-maximization algorithm for the training of mixture models with an unknown number of components. Our proposed algorithm relies on each of the clients performing EM steps locally, and constructing an uncertainty set around the maximizer associated with each local component. The central server utilizes the uncertainty sets to learn potential cluster overlaps between clients, and infer the global number of clusters via closed-form computations. We perform a thorough theoretical study of our algorithm, presenting probabilistic convergence guarantees under common assumptions. Subsequently, we study the specific setting of isotropic GMMs, providing tractable, low-complexity computations to be performed by each client during each iteration of the algorithm, as well as rigorously verifying assumptions required for algorithm convergence. We perform various numerical experiments, where we empirically demonstrate that our proposed method achieves comparable performance to centralized EM, and that it outperforms various existing federated clustering methods.
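The high-level recipe described in the abstract (each client runs EM locally, wraps each local component mean in an uncertainty ball, and the server merges overlapping balls to infer the global number of clusters) can be sketched as follows. This is a minimal illustration for the isotropic-GMM setting, assuming unit variance, a fixed hypothetical uncertainty radius per component, and helper names invented here; it is not the paper's actual closed-form computation.

```python
import numpy as np

def _farthest_point_init(X, k, rng):
    """Greedy farthest-point initialization for the local means."""
    mu = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = ((X[:, None, :] - np.array(mu)[None, :, :]) ** 2).sum(-1).min(1)
        mu.append(X[np.argmax(d2)])
    return np.array(mu)

def local_em_isotropic(X, k, iters=50, seed=0):
    """One client's local EM steps for a unit-variance isotropic GMM.

    Returns the k local component means (the local maximizers).
    """
    rng = np.random.default_rng(seed)
    mu = _farthest_point_init(X, k, rng)
    for _ in range(iters):
        # E-step: responsibilities, shifted per row for numerical stability
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        r = np.exp(-0.5 * (d2 - d2.min(1, keepdims=True)))
        r /= r.sum(1, keepdims=True)
        # M-step: responsibility-weighted mean update
        mu = (r.T @ X) / r.sum(0)[:, None]
    return mu

def merge_components(means, radii):
    """Server step: merge components whose uncertainty balls overlap
    (union-find over pairwise overlaps) and return the inferred global K."""
    n = len(means)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(means[i] - means[j]) <= radii[i] + radii[j]:
                parent[find(j)] = find(i)
    return len({find(i) for i in range(n)})
```

For example, if two clients share one cluster near the origin but each also has a private, well-separated cluster, the server would merge the two origin components and report a global count of three, mirroring the overlap-detection idea in the abstract.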
Related papers
- A Generic Framework for Fair Consensus Clustering in Streams [1.6398837165722515]
We introduce a new generic algorithmic framework that integrates closest fair clustering with cluster fitting. We extend our methods to the more general k-median consensus clustering problem.
arXiv Detail & Related papers (2026-02-12T02:52:07Z)
- One-Shot Hierarchical Federated Clustering [51.490181220883905]
This paper introduces an efficient one-shot hierarchical Federated Clustering framework. It performs client-end distribution exploration and server-end distribution aggregation. As a result, the complex cluster distributions across clients can be efficiently explored.
arXiv Detail & Related papers (2026-01-10T02:58:33Z)
- Federated Multi-Task Clustering [44.73672172790804]
This paper proposes a novel framework named Federated Multi-Task Clustering (FMTC). It is composed of two main components: a client-side personalized clustering module and a server-side tensorial correlation module. We derive an efficient, privacy-preserving distributed algorithm based on the Alternating Direction Method of Multipliers.
arXiv Detail & Related papers (2025-12-28T12:02:32Z)
- Parameter-Free Clustering via Self-Supervised Consensus Maximization (Extended Version) [50.41628860536753]
We propose a novel and fully parameter-free clustering framework via Self-supervised Consensus Maximization, named SCMax. Our framework performs hierarchical agglomerative clustering and cluster evaluation in a single, integrated process.
arXiv Detail & Related papers (2025-11-12T11:17:17Z)
- Differentially Private Federated Clustering with Random Rebalancing [9.331231828491461]
Federated clustering aims to group similar clients into clusters and produce one model for each cluster. We propose RR-Cluster, which can be viewed as a lightweight add-on to many federated clustering algorithms. We analyze the tradeoffs between decreased privacy noise variance and potentially increased bias from incorrect assignments.
arXiv Detail & Related papers (2025-08-08T09:56:47Z)
- Dynamically Weighted Federated k-Means [0.0]
Federated clustering enables multiple data sources to collaboratively cluster their data, maintaining decentralization and preserving privacy.
We introduce a novel federated clustering algorithm named Dynamically Weighted Federated k-means (DWF k-means) based on Lloyd's method for k-means clustering.
We conduct experiments on multiple datasets and data distribution settings to evaluate the performance of our algorithm in terms of clustering score, accuracy, and v-measure.
arXiv Detail & Related papers (2023-10-23T12:28:21Z)
- Federated cINN Clustering for Accurate Clustered Federated Learning [33.72494731516968]
Federated Learning (FL) presents an innovative approach to privacy-preserving distributed machine learning.
We propose the Federated cINN Clustering Algorithm (FCCA) to robustly cluster clients into different groups.
arXiv Detail & Related papers (2023-09-04T10:47:52Z)
- A One-shot Framework for Distributed Clustered Learning in Heterogeneous Environments [54.172993875654015]
The paper proposes a family of communication-efficient methods for distributed learning in heterogeneous environments.
A one-shot approach, based on local computations at the users and a clustering-based aggregation step at the server, is shown to provide strong learning guarantees.
For strongly convex problems it is shown that, as long as the number of data points per user is above a threshold, the proposed approach achieves order-optimal mean-squared error rates in terms of the sample size.
arXiv Detail & Related papers (2022-09-22T09:04:10Z)
- On the Convergence of Clustered Federated Learning [57.934295064030636]
In a federated learning system, the clients, e.g. mobile devices and organization participants, usually have different personal preferences or behavior patterns.
This paper proposes a novel weighted client-based clustered FL algorithm that leverages both the clients' group structure and each individual client within a unified optimization framework.
arXiv Detail & Related papers (2022-02-13T02:39:19Z)
- Personalized Federated Learning via Convex Clustering [72.15857783681658]
We propose a family of algorithms for personalized federated learning with locally convex user costs.
The proposed framework is based on a generalization of convex clustering in which the differences between different users' models are penalized.
arXiv Detail & Related papers (2022-02-01T19:25:31Z)
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.