Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments
- URL: http://arxiv.org/abs/2505.19699v1
- Date: Mon, 26 May 2025 08:52:49 GMT
- Title: Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments
- Authors: Junming Liu, Yanting Gao, Siyuan Meng, Yifei Sun, Aoqi Wu, Yufei Jin, Yirong Chen, Ding Wang, Guosun Zeng
- Abstract summary: Federated Learning (FL) is a decentralized machine learning paradigm that enables clients to collaboratively train models while preserving data privacy. We propose Mosaic, a novel data-free knowledge distillation framework tailored for heterogeneous distributed environments. Mosaic forms a Mixture-of-Experts (MoE) from client models based on their specialized knowledge, and distills it into a global model using the generated data.
- Score: 8.494154839146622
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) is a decentralized machine learning paradigm that enables clients to collaboratively train models while preserving data privacy. However, the coexistence of model and data heterogeneity gives rise to inconsistent representations and divergent optimization dynamics across clients, ultimately hindering robust global performance. To transcend these challenges, we propose Mosaic, a novel data-free knowledge distillation framework tailored for heterogeneous distributed environments. Mosaic first trains local generative models to approximate each client's personalized distribution, enabling synthetic data generation that safeguards privacy through strict separation from real data. Subsequently, Mosaic forms a Mixture-of-Experts (MoE) from client models based on their specialized knowledge, and distills it into a global model using the generated data. To further enhance the MoE architecture, Mosaic integrates expert predictions via a lightweight meta model trained on a few representative prototypes. Extensive experiments on standard image classification benchmarks demonstrate that Mosaic consistently outperforms state-of-the-art approaches under both model and data heterogeneity. The source code has been published at https://github.com/Wings-Of-Disaster/Mosaic.
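The abstract describes a three-step pipeline: train per-client generators that approximate each client's local distribution, fuse the client models into a Mixture-of-Experts via a lightweight meta model, and distill the fused predictions into a global model on synthetic data only. The PyTorch sketch below illustrates that data flow under simplifying assumptions (MLP classifiers over flattened images, a small conditional generator, and a meta gate assumed to be pre-trained on a few representative prototypes); all class and function names are illustrative and not the authors' implementation.

```python
# Illustrative sketch only -- not the authors' code. Assumes classifiers map
# flattened 28x28 images to n_classes logits and that each client has already
# trained a small conditional generator approximating its local distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPClassifier(nn.Module):
    def __init__(self, img_dim=784, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(img_dim, 128), nn.ReLU(),
                                 nn.Linear(128, n_classes))
    def forward(self, x):
        return self.net(x)

class ConditionalGenerator(nn.Module):
    """Approximates one client's personalized data distribution."""
    def __init__(self, z_dim=64, n_classes=10, img_dim=784):
        super().__init__()
        self.n_classes = n_classes
        self.net = nn.Sequential(nn.Linear(z_dim + n_classes, 256), nn.ReLU(),
                                 nn.Linear(256, img_dim), nn.Tanh())
    def forward(self, z, y):
        y_onehot = F.one_hot(y, self.n_classes).float()
        return self.net(torch.cat([z, y_onehot], dim=1))

class MetaGate(nn.Module):
    """Lightweight meta model producing per-sample weights over the experts."""
    def __init__(self, img_dim=784, n_experts=3):
        super().__init__()
        self.gate = nn.Linear(img_dim, n_experts)
    def forward(self, x, expert_logits):                 # expert_logits: [B, E, C]
        w = torch.softmax(self.gate(x), dim=1)           # [B, E]
        return (w.unsqueeze(-1) * expert_logits).sum(1)  # fused teacher logits [B, C]

def distill_moe_into_global(client_models, generators, meta_gate, global_model,
                            n_classes=10, z_dim=64, steps=200, batch=64):
    """Distill the client-model MoE into the global model using synthetic data only."""
    for m in client_models:
        m.eval()
    meta_gate.eval()  # assumed pre-trained on a few representative prototypes
    opt = torch.optim.Adam(global_model.parameters(), lr=1e-3)
    for _ in range(steps):
        # Draw synthetic samples from a randomly chosen client generator.
        gen = generators[torch.randint(len(generators), (1,)).item()]
        y = torch.randint(n_classes, (batch,))
        x = gen(torch.randn(batch, z_dim), y).detach()
        # Fuse the frozen expert predictions into a single MoE teacher.
        with torch.no_grad():
            expert_logits = torch.stack([m(x) for m in client_models], dim=1)
            teacher = meta_gate(x, expert_logits)
        # KL distillation from the fused teacher to the global student.
        loss = F.kl_div(F.log_softmax(global_model(x), dim=1),
                        F.softmax(teacher, dim=1), reduction="batchmean")
        opt.zero_grad(); loss.backward(); opt.step()
    return global_model

if __name__ == "__main__":
    clients = [MLPClassifier() for _ in range(3)]
    gens = [ConditionalGenerator() for _ in range(3)]
    distill_moe_into_global(clients, gens, MetaGate(), MLPClassifier())
```

The actual Mosaic pipeline adds details the sketch omits, such as how the generators are trained so that synthetic data stays strictly separated from real data and how the representative prototypes for the meta model are selected.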
Related papers
- Boosting Generalization Performance in Model-Heterogeneous Federated Learning Using Variational Transposed Convolution [0.27309692684728615]
Federated learning (FL) is a pioneering machine learning paradigm that enables distributed clients to process local data effectively. Traditional model-homogeneous approaches mainly involve debiasing the local training procedures with regularization or dynamically adjusting client weights in aggregation. We propose a model-heterogeneous FL framework that can improve clients' generalization performance over unseen data without model aggregation.
arXiv Detail & Related papers (2025-08-03T08:55:18Z) - Federated Gaussian Mixture Models [0.0]
FedGenGMM is a novel one-shot federated learning approach for unsupervised learning scenarios. It allows local GMM models, trained independently on client devices, to be aggregated through a single communication round. It consistently achieves performance comparable to non-federated and iterative federated methods.
arXiv Detail & Related papers (2025-06-02T15:23:53Z) - Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in the suboptimal predictive ability. We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z) - Not All Clients Are Equal: Personalized Federated Learning on Heterogeneous Multi-Modal Clients [52.14230635007546]
Foundation models have shown remarkable capabilities across diverse multi-modal tasks, but their centralized training raises privacy concerns and induces high transmission costs. To meet the growing demand for personalizing AI models to different user purposes, personalized federated learning (PFL) has emerged. PFL allows each client to leverage the knowledge of other clients for further adaptation to individual user preferences, again without the need to share data.
arXiv Detail & Related papers (2025-05-20T09:17:07Z) - FedMoE: Personalized Federated Learning via Heterogeneous Mixture of Experts [4.412721048192925]
We present FedMoE, an efficient personalized Federated Learning framework to address data heterogeneity.
FedMoE is composed of two fine-tuning stages. In the first stage, FedMoE simplifies the problem by conducting a search based on observed activation patterns, which identifies a submodel for each client.
In the second stage, these submodels are distributed to clients for further training and then returned to the server for aggregation.
arXiv Detail & Related papers (2024-08-21T03:16:12Z) - An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under a wide range of heterogeneity sources.
It achieves substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z) - pFedMoE: Data-Level Personalization with Mixture of Experts for Model-Heterogeneous Personalized Federated Learning [35.72303739409116]
We propose pFedMoE, a model-heterogeneous personalized Federated learning method with a Mixture of Experts.
It assigns a shared homogeneous small feature extractor and a local gating network to each client's local heterogeneous large model.
Overall, pFedMoE enhances local model personalization at a fine-grained data level.
arXiv Detail & Related papers (2024-02-02T12:09:20Z) - Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling this data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z) - One-Shot Heterogeneous Federated Learning with Local Model-Guided Diffusion Models [40.83058938096914]
FedLMG is a one-shot Federated learning method with Local Model-Guided diffusion models. Clients do not need access to any foundation models but only train and upload their local models.
arXiv Detail & Related papers (2023-11-15T11:11:25Z) - Federated Learning with Neural Graphical Models [2.2721854258621064]
Federated Learning (FL) addresses the need to create models based on proprietary data.
We develop an FL framework which maintains a global NGM model that learns the averaged information from the local NGM models.
We experimentally demonstrate the use of FedNGMs for extracting insights from CDC's Infant Mortality dataset.
arXiv Detail & Related papers (2023-09-20T23:24:22Z) - Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraints.
We propose a data-free knowledge distillation method to fine-tune the global model on the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
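The FedFTG entry above, like Mosaic itself, centers on server-side data-free knowledge distillation: a generator synthesizes pseudo-samples on which the aggregated global model is fine-tuned to match an ensemble of client models. The sketch below shows one common variant of this idea, an adversarial generator that seeks samples where the global student and the client ensemble disagree; the function names and loss choices are assumptions for illustration, not the exact FedFTG formulation.

```python
# Illustrative sketch of server-side data-free KD fine-tuning (in the spirit of
# FedFTG above). All names and losses are assumptions, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    def __init__(self, z_dim=64, img_dim=784):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, img_dim), nn.Tanh())
    def forward(self, z):
        return self.net(z)

def ensemble_logits(client_models, x):
    """Average the frozen client models' logits to form the teacher."""
    with torch.no_grad():
        return torch.stack([m(x) for m in client_models], dim=0).mean(dim=0)

def finetune_global(global_model, client_models, generator,
                    z_dim=64, rounds=100, batch=64):
    g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
    s_opt = torch.optim.Adam(global_model.parameters(), lr=1e-3)
    kd = lambda student_logits, teacher_logits: F.kl_div(
        F.log_softmax(student_logits, dim=1),
        F.softmax(teacher_logits, dim=1), reduction="batchmean")
    for _ in range(rounds):
        z = torch.randn(batch, z_dim)
        # 1) Generator step: push toward pseudo-samples where the global student
        #    and the client-ensemble teacher disagree (hard-sample mining).
        x = generator(z)
        g_loss = -kd(global_model(x), ensemble_logits(client_models, x))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()
        # 2) Student step: fine-tune the global model to match the ensemble on
        #    the freshly generated (detached) pseudo-samples.
        x = generator(z).detach()
        s_loss = kd(global_model(x), ensemble_logits(client_models, x))
        s_opt.zero_grad(); s_loss.backward(); s_opt.step()
    return global_model

if __name__ == "__main__":
    make_clf = lambda: nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
    finetune_global(make_clf(), [make_clf() for _ in range(3)], Generator())
```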