Dynamic Clustering in Federated Learning
- URL: http://arxiv.org/abs/2012.03788v1
- Date: Mon, 7 Dec 2020 15:30:07 GMT
- Title: Dynamic Clustering in Federated Learning
- Authors: Yeongwoo Kim, Ezeddin Al Hakim, Johan Haraldson, Henrik Eriksson,
José Mairton B. da Silva Jr., Carlo Fischione
- Abstract summary: We propose a three-phased data clustering algorithm, namely: generative adversarial network-based clustering, cluster calibration, and cluster division.
Our algorithm improves the performance of forecasting models, including cellular network handover prediction, by 43%.
- Score: 15.37652170495055
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In the resource management of wireless networks, Federated Learning has
been used to predict handovers. However, non-independent and identically
distributed (non-IID) data degrade the accuracy of such predictions. To overcome
this problem, Federated Learning can leverage data clustering algorithms and build
a machine learning model for each cluster. However, traditional data clustering
algorithms, when applied to handover prediction, exhibit three main limitations:
the risk of data privacy breach, the fixed shape of clusters, and the non-adaptive
number of clusters. To overcome these limitations, in this paper we propose a
three-phased data clustering algorithm, namely: generative adversarial
network-based clustering, cluster calibration, and cluster division. We show that
the generative adversarial network-based clustering preserves privacy. The cluster
calibration deals with dynamic environments by modifying clusters. Moreover, the
divisive clustering explores different numbers of clusters by repeatedly selecting
a cluster and dividing it into multiple clusters. A baseline algorithm and our
algorithm are tested on a time series forecasting task. We show that our algorithm
improves the performance of forecasting models, including cellular network
handover prediction, by 43%.
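To make the three phases concrete, here is a minimal sketch of the overall loop. It assumes a hypothetical discriminator_score() that stands in for the per-cluster GAN discriminators, and the reassignment and division rules are simplified placeholders rather than the paper's exact criteria; only client-side summaries, never raw data, reach the server in this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def discriminator_score(center, client_data):
    """Hypothetical stand-in for a per-cluster GAN discriminator:
    higher means the cluster's generative model fits the client's local
    data better. Here we simply use negative distance to a cluster center."""
    return -np.linalg.norm(client_data.mean(axis=0) - center)

def assign_clients(clients, centers):
    """Phases 1-2 (clustering + calibration): each client joins the cluster
    whose (stand-in) discriminator scores its local data highest, so raw
    data never leaves the client."""
    return [int(np.argmax([discriminator_score(c, x) for c in centers]))
            for x in clients]

def maybe_divide(clients, labels, centers, spread_threshold=1.0):
    """Phase 3 (cluster division): split the cluster with the largest
    within-cluster spread into two, exploring a larger number of clusters."""
    spreads = []
    for k, c in enumerate(centers):
        members = [clients[i] for i, l in enumerate(labels) if l == k]
        if not members:
            spreads.append(0.0)
            continue
        means = np.stack([m.mean(axis=0) for m in members])
        spreads.append(float(np.linalg.norm(means - c, axis=1).mean()))
    worst = int(np.argmax(spreads))
    if spreads[worst] > spread_threshold:
        jitter = rng.normal(scale=0.1, size=centers[worst].shape)
        centers = np.vstack([centers, centers[worst] + jitter])
    return centers

# Toy federated setting: 8 clients, each with a small local time-series window.
clients = [rng.normal(loc=i % 2, size=(20, 3)) for i in range(8)]
centers = rng.normal(size=(1, 3))           # start from a single cluster
for _ in range(5):                          # a few calibration/division rounds
    labels = assign_clients(clients, centers)
    # recompute each center from client summaries (not raw data)
    for k in range(len(centers)):
        members = [clients[i].mean(axis=0) for i, l in enumerate(labels) if l == k]
        if members:
            centers[k] = np.mean(members, axis=0)
    centers = maybe_divide(clients, labels, centers)
print("final number of clusters:", len(centers))
```

The point of the loop is that calibration can move a client between clusters as its data drift, while division lets the number of clusters grow instead of being fixed in advance.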
Related papers
- A3S: A General Active Clustering Method with Pairwise Constraints [66.74627463101837]
A3S performs strategic active clustering adjustment on an initial clustering obtained by an adaptive clustering algorithm.
In extensive experiments across diverse real-world datasets, A3S achieves the desired clustering with significantly fewer human queries.
arXiv Detail & Related papers (2024-07-14T13:37:03Z) - Dynamically Weighted Federated k-Means [0.0]
Federated clustering enables multiple data sources to collaboratively cluster their data, maintaining decentralization and preserving privacy.
We introduce a novel federated clustering algorithm named Dynamically Weighted Federated k-means (DWF k-means) based on Lloyd's method for k-means clustering.
We conduct experiments on multiple datasets and data distribution settings to evaluate the performance of our algorithm in terms of clustering score, accuracy, and v-measure.
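The DWF k-means entry above is summarized only at a high level; the sketch below shows one plausible way a Lloyd-style federated k-means round could weight client contributions, using per-cluster point counts as the weights. The weighting rule and function names are assumptions for illustration, not the algorithm from the paper.

```python
import numpy as np

def local_kmeans_step(data, centers):
    """One Lloyd iteration on a single client's private data: assign points
    to the nearest center and return only per-cluster sums and counts."""
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    k, d = centers.shape
    sums, counts = np.zeros((k, d)), np.zeros(k)
    for j in range(k):
        mask = labels == j
        counts[j] = mask.sum()
        sums[j] = data[mask].sum(axis=0)
    return sums, counts

def federated_round(client_datasets, centers):
    """Server aggregates client statistics, weighting each client's
    contribution to a center by how many of its points fell in that cluster."""
    k, d = centers.shape
    total_sums, total_counts = np.zeros((k, d)), np.zeros(k)
    for data in client_datasets:
        sums, counts = local_kmeans_step(data, centers)
        total_sums += sums
        total_counts += counts
    nonempty = total_counts > 0
    centers = centers.copy()
    centers[nonempty] = total_sums[nonempty] / total_counts[nonempty][:, None]
    return centers

rng = np.random.default_rng(1)
clients = [rng.normal(loc=c, size=(50, 2)) for c in (0.0, 0.0, 5.0)]
centers = rng.normal(size=(2, 2))
for _ in range(10):
    centers = federated_round(clients, centers)
print(centers)
```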
arXiv Detail & Related papers (2023-10-23T12:28:21Z) - Privacy-preserving Continual Federated Clustering via Adaptive Resonance
Theory [11.190614418770558]
In the clustering domain, various algorithms with a federated learning framework (i.e., federated clustering) have been actively studied.
This paper proposes a privacy-preserving continual federated clustering algorithm.
Experimental results with synthetic and real-world datasets show that the proposed algorithm has superior clustering performance.
arXiv Detail & Related papers (2023-09-07T05:45:47Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback for the clustering actions, a clustering-oriented reward function is proposed that increases the cohesion within each cluster and the separation between different clusters.
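The reward in the summary above is described only as promoting within-cluster cohesion and between-cluster separation; below is a generic sketch of such a reward over node embeddings, not the paper's exact formulation.

```python
import numpy as np

def clustering_reward(embeddings, labels):
    """Generic cohesion-minus-dispersion reward: higher when points in the
    same cluster are close and cluster centroids are far apart.
    Illustrative only, not the paper's formula."""
    clusters = np.unique(labels)
    centroids = np.stack([embeddings[labels == c].mean(axis=0) for c in clusters])
    # cohesion: negative mean distance of points to their own centroid
    cohesion = -np.mean([
        np.linalg.norm(embeddings[labels == c] - centroids[i], axis=1).mean()
        for i, c in enumerate(clusters)
    ])
    # separation: mean pairwise distance between distinct centroids
    if len(clusters) > 1:
        diffs = centroids[:, None, :] - centroids[None, :, :]
        pairwise = np.linalg.norm(diffs, axis=2)
        separation = pairwise[np.triu_indices(len(clusters), k=1)].mean()
    else:
        separation = 0.0
    return cohesion + separation

rng = np.random.default_rng(2)
emb = np.vstack([rng.normal(0, 0.3, (30, 8)), rng.normal(3, 0.3, (30, 8))])
good = np.repeat([0, 1], 30)          # labels matching the true structure
bad = rng.integers(0, 2, size=60)     # random labels
print(clustering_reward(emb, good), ">", clustering_reward(emb, bad))
```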
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - ClusterNet: A Perception-Based Clustering Model for Scattered Data [16.326062082938215]
Cluster separation in scatterplots is a task that is typically tackled by widely used clustering techniques.
We propose a learning strategy that operates directly on scattered data.
We train ClusterNet, a point-based deep learning model, to reflect human perception of cluster separability.
arXiv Detail & Related papers (2023-04-27T13:41:12Z) - Hard Regularization to Prevent Deep Online Clustering Collapse without
Data Augmentation [65.268245109828]
Online deep clustering refers to the joint use of a feature extraction network and a clustering model to assign cluster labels to each new data point or batch as it is processed.
While faster and more versatile than offline methods, online clustering can easily reach the collapsed solution where the encoder maps all inputs to the same point and all are put into a single cluster.
We propose a method that does not require data augmentation and that, unlike existing methods, regularizes the hard assignments.
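The collapse described above (the encoder maps everything to one cluster) can be illustrated with a generic penalty on uneven cluster usage computed from hard assignments; this is only a stand-in to show the failure mode, not the regularizer proposed in the paper.

```python
import numpy as np

def hard_assign(logits):
    """Hard cluster assignment: each sample goes to its argmax cluster."""
    return logits.argmax(axis=1)

def usage_penalty(labels, num_clusters):
    """Generic regularizer on hard assignments (illustrative only):
    penalize deviation of cluster usage from uniform, which is largest
    in the collapsed case where one cluster receives everything."""
    counts = np.bincount(labels, minlength=num_clusters).astype(float)
    usage = counts / counts.sum()
    uniform = np.full(num_clusters, 1.0 / num_clusters)
    return float(np.abs(usage - uniform).sum())

k = 4
rng = np.random.default_rng(3)
balanced = rng.normal(size=(100, k))        # logits spread over clusters
collapsed = np.zeros((100, k))
collapsed[:, 0] = 1.0                       # everything lands in cluster 0
print("balanced penalty :", usage_penalty(hard_assign(balanced), k))
print("collapsed penalty:", usage_penalty(hard_assign(collapsed), k))
```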
arXiv Detail & Related papers (2023-03-29T08:23:26Z) - Deep Clustering: A Comprehensive Survey [53.387957674512585]
Clustering analysis plays an indispensable role in machine learning and data mining.
Deep clustering, which can learn clustering-friendly representations using deep neural networks, has been broadly applied in a wide range of clustering tasks.
Existing surveys of deep clustering mainly focus on single-view settings and network architectures, ignoring the complex application scenarios of clustering.
arXiv Detail & Related papers (2022-10-09T02:31:32Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep
Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Learning Statistical Representation with Joint Deep Embedded Clustering [2.1267423178232407]
StatDEC is an unsupervised framework for joint statistical representation learning and clustering.
Our experiments show that using these representations, one can considerably improve results on imbalanced image clustering across a variety of image datasets.
arXiv Detail & Related papers (2021-09-11T09:26:52Z) - Decorrelating Adversarial Nets for Clustering Mobile Network Data [0.7034976835586089]
Deep clustering, a subfield of deep learning, could be a valuable tool for many network automation use cases.
Most state-of-the-art clustering algorithms target image datasets, which makes them hard to apply to mobile network data.
We propose a new algorithm, DANCE, intended to be a reliable deep clustering method which also performs well when applied to network automation use-cases.
arXiv Detail & Related papers (2021-03-11T15:26:26Z) - Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.