Weighted K-Harmonic Means Clustering: Convergence Analysis and Applications to Wireless Communications
- URL: http://arxiv.org/abs/2512.16185v1
- Date: Thu, 18 Dec 2025 05:09:56 GMT
- Title: Weighted K-Harmonic Means Clustering: Convergence Analysis and Applications to Wireless Communications
- Authors: Gourab Ghatak
- Abstract summary: We propose the weighted K-harmonic means (WKHM) clustering algorithm, a regularized variant of K-harmonic means designed to ensure numerical stability. We show that WKHM achieves a superior tradeoff between minimum signal strength and load fairness compared to principled clustering baselines, making it a tool for radio node placement and user association in wireless networks.
- Score: 7.586829158414637
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose the \emph{weighted K-harmonic means} (WKHM) clustering algorithm, a regularized variant of K-harmonic means designed to ensure numerical stability while enabling soft assignments through inverse-distance weighting. Unlike classical K-means and constrained K-means, WKHM admits a direct interpretation in wireless networks: its weights are exactly equivalent to fractional user association based on received signal strength. We establish rigorous convergence guarantees under both deterministic and stochastic settings, addressing key technical challenges arising from non-convexity and random initialization. Specifically, we prove monotone descent to a local minimum under fixed initialization, convergence in probability under Binomial Point Process (BPP) initialization, and almost sure convergence under mild decay conditions. These results provide the first stochastic convergence guarantees for harmonic-mean-based clustering. Finally, through extensive simulations with diverse user distributions, we show that WKHM achieves a superior tradeoff between minimum signal strength and load fairness compared to classical and modern clustering baselines, making it a principled tool for joint radio node placement and user association in wireless networks.
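The abstract describes WKHM as K-harmonic means with inverse-distance soft assignments plus a regularizer for numerical stability. The paper's exact objective and update rule are not reproduced on this page, so the following is only a minimal sketch of classical K-harmonic means (harmonic-mean objective with the standard inverse-distance update), with a distance floor `eps` standing in for the regularization; the names `khm_objective`, `khm_update`, and the power `p` are illustrative, not the paper's API.

```python
import numpy as np

def khm_objective(X, C, p=3, eps=1e-9):
    """Harmonic-mean objective: sum over points of k / sum_j d_ij^(-p).

    X: (n, dim) data points; C: (k, dim) centroids.
    eps floors the distances, mimicking a stability regularizer.
    """
    d = np.maximum(np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2), eps)
    return (C.shape[0] / (d ** -p).sum(axis=1)).sum()

def khm_update(X, C, p=3, eps=1e-9):
    """One centroid update with inverse-distance soft weights.

    Each point contributes to every centroid with weight
    q_ij = d_ij^(-p-2) / (sum_l d_il^(-p))^2, the classical KHM rule.
    """
    d = np.maximum(np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2), eps)
    q = d ** (-p - 2) / ((d ** -p).sum(axis=1, keepdims=True) ** 2)
    return (q.T @ X) / q.sum(axis=0)[:, None]  # weighted means, shape (k, dim)
```

Because every point influences every centroid (weights decay with distance but never vanish), the update avoids the empty-cluster problem of hard K-means, which is what makes the fractional user-association reading in the abstract natural.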
Related papers
- K-Means as a Radial Basis function Network: a Variational and Gradient-based Equivalence [41.99844472131922]
This work establishes a rigorous variational and gradient-based equivalence between the classical K-Means algorithm and differentiable Radial Basis Function (RBF) neural networks. We show that the RBF objective $\Gamma$-converges to the K-Means solution as the temperature parameter vanishes. We also propose the integration of entmax-1.5, which ensures stable convergence while preserving the underlying Voronoi partition structure.
arXiv Detail & Related papers (2026-03-04T21:41:50Z) - Stability and Generalization of Push-Sum Based Decentralized Optimization over Directed Graphs [55.77845440440496]
Push-sum-based decentralized communication enables optimization over communication networks where information exchange may be asymmetric. We develop a unified uniform-stability framework for the Stochastic Gradient Push (SGP) algorithm. A key technical ingredient is an imbalance-aware generalization bound expressed through two quantities.
arXiv Detail & Related papers (2026-02-24T05:32:03Z) - Statistical Inference for Fuzzy Clustering [7.416766339318596]
Fuzzy $c$-means (FCM) allows mixed memberships and better captures uncertainty and gradual transitions. We develop a new framework for weighted fuzzy $c$-means (WFCM) for settings with potential cluster-size imbalance.
arXiv Detail & Related papers (2026-01-06T02:11:01Z) - Silhouette-Guided Instance-Weighted k-means [2.56711111236449]
K-Sil is a silhouette-guided refinement of the k-means algorithm that weights points based on their silhouette scores. It prioritizes well-clustered instances while suppressing borderline or noisy regions. These results establish K-Sil as a principled alternative for applications demanding high-quality, well-separated clusters.
arXiv Detail & Related papers (2025-06-15T15:09:05Z) - K*-Means: A Parameter-free Clustering Algorithm [55.20132267309382]
k*-means is a novel clustering algorithm that eliminates the need to set k or any other parameters. It uses the minimum description length principle to automatically determine the optimal number of clusters, k*, by splitting and merging clusters. We prove that k*-means is guaranteed to converge and demonstrate experimentally that it significantly outperforms existing methods in scenarios where k is unknown.
arXiv Detail & Related papers (2025-05-17T08:41:07Z) - Random Normed k-Means: A Paradigm-Shift in Clustering within Probabilistic Metric Spaces [0.7864304771129751]
We introduce the first k-means variant in the literature that operates within a probabilistic metric space. By adopting a probabilistic perspective, our method not only introduces a fresh paradigm but also establishes a rigorous theoretical framework. Our proposed random normed k-means (RNKM) algorithm exhibits a remarkable ability to identify nonlinearly separable structures.
arXiv Detail & Related papers (2025-04-04T20:48:43Z) - Interaction-Aware Gaussian Weighting for Clustered Federated Learning [58.92159838586751]
Federated Learning (FL) has emerged as a decentralized paradigm to train models while preserving privacy. We propose a novel clustered FL method, FedGWC (Federated Gaussian Weighting Clustering), which groups clients based on their data distribution. Our experiments on benchmark datasets show that FedGWC outperforms existing FL algorithms in cluster quality and classification accuracy.
arXiv Detail & Related papers (2025-02-05T16:33:36Z) - Self-Supervised Graph Embedding Clustering [70.36328717683297]
The K-means one-step dimensionality reduction clustering method has made progress in addressing the curse of dimensionality in clustering tasks.
We propose a unified framework that integrates manifold learning with K-means, resulting in a self-supervised graph embedding framework.
arXiv Detail & Related papers (2024-09-24T08:59:51Z) - Adaptively Robust and Sparse K-means Clustering [5.535948428518607]
This paper proposes adaptively robust and sparse K-means clustering (ARSK) to address these practical limitations of the standard K-means algorithm.
For robustness, we introduce a redundant error component for each observation, and this additional parameter is penalized using a group sparse penalty.
To accommodate the impact of high-dimensional noisy variables, the objective function is modified by incorporating weights and implementing a penalty to control the sparsity of the weight vector.
arXiv Detail & Related papers (2024-07-09T15:20:41Z) - Fuzzy K-Means Clustering without Cluster Centroids [21.256564324236333]
Fuzzy K-Means clustering is a critical technique in unsupervised data analysis.
This paper proposes a novel Fuzzy $K$-Means clustering algorithm that entirely eliminates the reliance on cluster centroids.
arXiv Detail & Related papers (2024-04-07T12:25:03Z) - Rethinking Clustered Federated Learning in NOMA Enhanced Wireless Networks [60.09912912343705]
This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap that measures the degree of non-IID in the data distribution is presented.
Solutions to address the challenges posed by non-IID conditions are proposed with the analysis of the properties.
arXiv Detail & Related papers (2024-03-05T17:49:09Z) - Wireless Federated $k$-Means Clustering with Non-coherent Over-the-Air Computation [14.087062902871212]
The OAC scheme relies on an encoder that exploits the representation of a number in a balanced number system.
A reinitialization method for ineffectively used centroids is proposed to improve the performance of the proposed method under heterogeneous data distributions.
arXiv Detail & Related papers (2023-08-11T20:12:26Z) - On the Convergence of Stochastic Extragradient for Bilinear Games with Restarted Iteration Averaging [96.13485146617322]
We present an analysis of the Stochastic ExtraGradient (SEG) method with constant step size, along with variations of the method that yield favorable convergence.
We prove that when augmented with averaging, SEG provably converges to the Nash equilibrium, and this rate is provably accelerated by incorporating a scheduled restarting procedure.
arXiv Detail & Related papers (2021-06-30T17:51:36Z) - An improved spectral clustering method for community detection under the degree-corrected stochastic blockmodel [1.0965065178451106]
We propose an improved spectral clustering (ISC) approach under the degree-corrected stochastic block model (DCSBM).
ISC provides a significant improvement on two weak-signal networks, Simmons and Caltech, with error rates of 121/1137 and 96/590, respectively.
arXiv Detail & Related papers (2020-11-12T13:35:11Z)