Federated Deep Subspace Clustering
- URL: http://arxiv.org/abs/2501.00230v2
- Date: Thu, 16 Jan 2025 02:28:47 GMT
- Title: Federated Deep Subspace Clustering
- Authors: Yupei Zhang, Ruojia Feng, Yifei Wang, Xuequn Shang
- Abstract summary: This paper introduces FDSC, a privacy-protected subspace clustering (SC) approach with a federated learning (FL) schema.
In each client, a deep subspace clustering network groups the isolated local data; it is composed of an encoder network, a self-expressive layer, and a decoder network.
Experiments test FDSC on public datasets and compare it with other clustering methods, demonstrating the effectiveness of FDSC.
- Abstract: This paper introduces FDSC, a privacy-protected subspace clustering (SC) approach with a federated learning (FL) schema. In each client, a deep subspace clustering network groups the isolated local data; it is composed of an encoder network, a self-expressive layer, and a decoder network. Federation is achieved by uploading each client's encoder network to the server to communicate with the other clients. Besides, FDSC is further enhanced by preserving the local neighborhood relationship in each client. With the combined effects of federated learning and locality preservation, the data features learned by the encoder are boosted, which strengthens the self-expressiveness learning and results in better clustering performance. Experiments test FDSC on public datasets and compare it with other clustering methods, demonstrating the effectiveness of FDSC.
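The architecture described above hinges on the self-expressive layer: each encoded sample is reconstructed as a linear combination of the other samples, and the coefficient matrix induces the clustering affinity. A minimal NumPy sketch of this principle (a ridge-regularized linear self-expression, not the paper's trained layer; `lam` and the diagonal zeroing are illustrative simplifications):

```python
import numpy as np

def self_expressive_coefficients(Z, lam=0.1):
    """Ridge-regularized self-expression: minimize
    ||Z - Z C||_F^2 + lam * ||C||_F^2 over C, where each column of Z
    is one encoded sample. Closed form: C = (Z^T Z + lam I)^{-1} Z^T Z.
    Zeroing the diagonal afterwards (a simplification of the usual
    constraint diag(C) = 0) stops samples from explaining themselves."""
    n = Z.shape[1]
    G = Z.T @ Z                                   # (n, n) Gram matrix
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)
    return C

def affinity(C):
    """Symmetric affinity matrix passed to spectral clustering."""
    return np.abs(C) + np.abs(C).T
```

Points drawn from the same subspace receive large mutual coefficients, so the affinity matrix is approximately block-diagonal and spectral clustering recovers the groups.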
Related papers
- Decoupled Subgraph Federated Learning [57.588938805581044]
We address the challenge of federated learning on graph-structured data distributed across multiple clients.
We present a novel framework for this scenario, named FedStruct, that harnesses deep structural dependencies.
We validate the effectiveness of FedStruct through experimental results conducted on six datasets for semi-supervised node classification.
arXiv Detail & Related papers (2024-02-29T13:47:23Z) - CCFC: Bridging Federated Clustering and Contrastive Learning [9.91610928326645]
We propose a new federated clustering method named cluster-contrastive federated clustering (CCFC).
CCFC shows superior performance in handling device failures from a practical viewpoint.
arXiv Detail & Related papers (2024-01-12T15:26:44Z) - Personalized Federated Learning with Attention-based Client Selection [57.71009302168411]
We propose FedACS, a new PFL algorithm with an Attention-based Client Selection mechanism.
FedACS integrates an attention mechanism to enhance collaboration among clients with similar data distributions.
Experiments on CIFAR10 and FMNIST validate FedACS's superiority.
arXiv Detail & Related papers (2023-12-23T03:31:46Z) - Privacy-Preserving Federated Deep Clustering based on GAN [12.256298398007848]
We present a novel approach to Federated Deep Clustering based on Generative Adversarial Networks (GANs).
Each client trains a generative adversarial network (GAN) locally and uploads the synthetic data to the server.
The server applies a deep clustering network on the synthetic data to establish $k$ cluster centroids, which are then downloaded to the clients for cluster assignment.
arXiv Detail & Related papers (2022-11-30T13:20:11Z) - Federated clustering with GAN-based data synthesis [12.256298398007848]
Federated clustering (FC) is an extension of centralized clustering in federated settings.
We propose a new federated clustering framework, named synthetic data aided federated clustering (SDA-FC).
It trains a generative adversarial network locally in each client and uploads the generated synthetic data to the server, where k-means (KM) or fuzzy c-means (FCM) is performed on the synthetic data.
The synthetic data can make the model immune to the non-IID problem and enable us to capture the global similarity characteristics more effectively without sharing private data.
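The server-side portion of this pipeline can be sketched without the GAN: assume each client has already produced a synthetic batch, and the server simply pools the batches and runs k-means (the KM option). The farthest-first seeding and the function names here are illustrative choices, not from the paper:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal Lloyd's k-means with deterministic farthest-first seeding."""
    centers = [X[0]]
    for _ in range(k - 1):                        # spread out the seeds
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # assign each point to its nearest center, then recompute means
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def server_aggregate(synthetic_batches, k):
    """Pool the clients' uploaded synthetic batches, cluster them, and
    return the centroids that would be broadcast back for local assignment."""
    pooled = np.vstack(synthetic_batches)
    centers, _ = kmeans(pooled, k)
    return centers
```

Because only synthetic samples leave the clients, the raw private data never reaches the server, which is the privacy argument sketched in the summary above.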
arXiv Detail & Related papers (2022-10-29T07:42:11Z) - Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces [59.33965805898736]
Clustered federated learning has been shown to produce promising results by grouping clients into clusters.
Existing FL algorithms essentially try to group together clients with similar data distributions.
However, prior FL algorithms attempt to infer these similarities only indirectly during training.
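The similarity measure in this paper's title, principal angles between client data subspaces, has a standard computation (the Björck–Golub method): orthonormalize a basis for each subspace, then take the SVD of the product of the two orthonormal bases. A small NumPy sketch:

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians, ascending) between the subspaces spanned
    by the columns of A and B, via the Bjorck-Golub method: orthonormalize
    each basis with QR, then take the SVD of Q_A^T Q_B."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(cosines, -1.0, 1.0))
```

Small angles indicate clients whose data occupy nearly the same subspace, which is the signal used to group them into clusters.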
arXiv Detail & Related papers (2022-09-21T17:37:54Z) - Secure Federated Clustering [18.37669220755388]
SecFC is a secure federated clustering algorithm that simultaneously achieves universal performance, i.e., clustering performance unaffected by how the data are distributed across the clients.
Each client's private data and the cluster centers are not leaked to the other clients or the server.
arXiv Detail & Related papers (2022-05-31T06:47:18Z) - On the Convergence of Clustered Federated Learning [57.934295064030636]
In a federated learning system, the clients, e.g., mobile devices and participating organizations, usually have different personal preferences or behavior patterns.
This paper proposes a novel weighted client-based clustered FL algorithm that leverages the clients' group structure and each individual client within a unified optimization framework.
arXiv Detail & Related papers (2022-02-13T02:39:19Z) - Two-Bit Aggregation for Communication Efficient and Differentially Private Federated Learning [79.66767935077925]
In federated learning (FL), a machine learning model is trained on multiple nodes in a decentralized manner, while keeping the data local and not shared with other nodes.
The information sent from the nodes to the server may reveal some details about each node's local data, thus raising privacy concerns.
A novel two-bit aggregation algorithm is proposed with guaranteed differential privacy and reduced uplink communication overhead.
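As an illustration of the communication-efficiency side, here is a generic unbiased 2-bit stochastic quantizer. It is a hedged sketch of the idea, not the paper's exact aggregation scheme, and it omits the differential-privacy noise the paper adds on top:

```python
import numpy as np

def two_bit_quantize(x, rng):
    """Unbiased 2-bit stochastic quantizer: snap each coordinate to one of
    four evenly spaced levels spanning [min(x), max(x)], rounding up with
    probability equal to the fractional position so E[q] recovers x."""
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:                                  # constant-vector edge case
        return np.zeros(x.shape, dtype=np.int8), lo, 0.0
    step = (hi - lo) / 3.0                        # 4 levels -> 3 intervals
    t = (x - lo) / step                           # position in [0, 3]
    floor = np.floor(t)
    q = floor + (rng.random(x.shape) < (t - floor))
    return np.clip(q, 0, 3).astype(np.int8), lo, step

def dequantize(q, lo, step):
    """Server-side reconstruction from the 2-bit codes."""
    return lo + q * step
```

Each coordinate now costs 2 bits on the uplink (plus the two scalars `lo` and `step` per vector), and the stochastic rounding keeps the aggregate unbiased.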
arXiv Detail & Related papers (2021-10-06T19:03:58Z) - Overcomplete Deep Subspace Clustering Networks [80.16644725886968]
Experimental results on four benchmark datasets show the effectiveness of the proposed method over DSC and other clustering methods in terms of clustering error.
Our method is also less dependent than DSC on when pre-training is stopped to obtain the best performance, and is more robust to noise.
arXiv Detail & Related papers (2020-11-16T22:07:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.