FEDEXCHANGE: Bridging the Domain Gap in Federated Object Detection for Free
- URL: http://arxiv.org/abs/2509.10503v1
- Date: Mon, 01 Sep 2025 17:39:25 GMT
- Title: FEDEXCHANGE: Bridging the Domain Gap in Federated Object Detection for Free
- Authors: Haolin Yuan, Jingtao Li, Weiming Zhuang, Chen Chen, Lingjuan Lyu,
- Abstract summary: Federated Object Detection (FOD) enables clients to collaboratively train a global object detection model without accessing their local data from diverse domains. Existing FOD methods often overlook the hardware constraints of edge devices and introduce local training regularizations that incur high computational costs. We propose FEDEXCHANGE, a novel FOD framework that bridges domain gaps without introducing additional local computational overhead.
- Score: 58.34974215853841
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Object Detection (FOD) enables clients to collaboratively train a global object detection model without accessing their local data from diverse domains. However, significant variations in environment, weather, and other domain-specific factors hinder performance, making cross-domain generalization a key challenge. Existing FOD methods often overlook the hardware constraints of edge devices and introduce local training regularizations that incur high computational costs, limiting real-world applicability. In this paper, we propose FEDEXCHANGE, a novel FOD framework that bridges domain gaps without introducing additional local computational overhead. FEDEXCHANGE employs a server-side dynamic model exchange strategy that enables each client to gain insights from other clients' domain data without direct data sharing. Specifically, FEDEXCHANGE allows the server to alternate between model aggregation and model exchange. During aggregation rounds, the server aggregates all local models as usual. In exchange rounds, FEDEXCHANGE clusters and exchanges local models based on distance measures, allowing local models to learn from a variety of domains. Because all operations are performed on the server side, clients achieve improved cross-domain utility without any additional computational overhead. Extensive evaluations demonstrate that FEDEXCHANGE enhances FOD performance, achieving 1.6X better mean average precision in challenging domains, such as rainy conditions, while requiring only 0.8X the computational resources compared to baseline methods.
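The server-side alternation described in the abstract can be pictured with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the k-means clustering on flattened weights, the round-robin donor assignment, the FedAvg-style averaging, and the `exchange_every` schedule are all placeholders, since the abstract only specifies that the server alternates between aggregating all local models and clustering/exchanging them by a distance measure.

```python
import numpy as np
from sklearn.cluster import KMeans

def flatten(state_dict):
    """Concatenate all parameters into one vector for distance computation."""
    return np.concatenate([np.asarray(v).ravel() for v in state_dict.values()])

def aggregate(models):
    """FedAvg-style aggregation: element-wise mean over client models (assumed)."""
    return {k: np.mean([np.asarray(m[k]) for m in models], axis=0) for k in models[0]}

def exchange(models, n_clusters=2):
    """Cluster client models by parameter distance, then hand each client a model
    from a different cluster so it is next fine-tuned on another domain's knowledge.
    The k-means clustering and round-robin donor pick are illustrative choices,
    not the paper's exact rule."""
    weights = np.stack([flatten(m) for m in models])
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(weights)
    reassigned = []
    for i, lbl in enumerate(labels):
        donors = [j for j, l in enumerate(labels) if l != lbl]
        reassigned.append(models[donors[i % len(donors)]] if donors else models[i])
    return reassigned

def server_round(models, round_idx, exchange_every=2):
    """Alternate: exchange every `exchange_every`-th round, aggregate otherwise.
    The schedule itself is an assumption; only the alternation is specified."""
    if round_idx % exchange_every == 0 and round_idx > 0:
        return exchange(models)
    global_model = aggregate(models)
    return [dict(global_model) for _ in models]  # broadcast the aggregated model
```

Since every step runs on the server, this sketch also shows why clients incur no extra local computation: from a client's perspective, each round still consists of receiving one model and training it locally.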
Related papers
- FeDecider: An LLM-Based Framework for Federated Cross-Domain Recommendation [75.50721642765994]
Large language model (LLM)-based recommendation models have demonstrated impressive performance. We propose an LLM-based framework for federated cross-domain recommendation, FeDecider. Extensive experiments across diverse datasets validate the effectiveness of our proposed FeDecider.
arXiv Detail & Related papers (2026-02-17T21:42:28Z) - Federated Domain Generalization with Latent Space Inversion [42.37530136140609]
Federated domain generalization (FedDG) addresses distribution shifts among clients in a federated learning framework. FedDG methods aggregate the parameters of locally trained client models to form a global model that generalizes to unseen clients while preserving data privacy. Our solution addresses this problem by contributing new ways to perform local client training and model aggregation.
arXiv Detail & Related papers (2025-12-11T02:17:03Z) - Learning Latent Spaces for Domain Generalization in Time Series Forecasting [60.29403194508811]
Time series forecasting is vital in many real-world applications, yet developing models that generalize well on unseen relevant domains remains underexplored. We propose a framework for domain generalization in time series forecasting by mining the latent factors that govern temporal dependencies across domains. Our approach uses a decomposition-based architecture with a new Conditional $\beta$-Variational Autoencoder (VAE), wherein time series data is first decomposed into trend-cyclical and seasonal components.
arXiv Detail & Related papers (2024-12-15T12:41:53Z) - PARDON: Privacy-Aware and Robust Federated Domain Generalization [5.584498171854557]
Federated Learning (FL) shows promise in preserving privacy and enabling collaborative learning. A significant challenge arises when client data comes from diverse domains, leading to poor performance on unseen domains. Existing Federated Domain Generalization approaches address this problem but assume each client holds data for an entire domain. We introduce FISC, a novel FedDG paradigm designed to robustly handle more complicated domain distributions.
arXiv Detail & Related papers (2024-10-30T00:50:23Z) - FedCCRL: Federated Domain Generalization with Cross-Client Representation Learning [4.703379311088474]
Domain Generalization (DG) aims to train models that can effectively generalize to unseen domains.
In Federated Learning (FL), where clients collaboratively train a model without directly sharing their data, most existing DG algorithms are not directly applicable to the FL setting.
We propose FedCCRL, a lightweight federated domain generalization method that significantly improves the model's generalization ability while preserving privacy.
arXiv Detail & Related papers (2024-10-15T04:44:21Z) - Feature Diversification and Adaptation for Federated Domain Generalization [27.646565383214227]
In real-world applications, local clients often operate within their limited domains, leading to a 'domain shift' across clients.
We introduce the concept of federated feature diversification, which helps local models learn client-invariant representations while preserving privacy.
Our resultant global model shows robust performance on unseen test domain data.
arXiv Detail & Related papers (2024-07-11T07:45:10Z) - Federated and Generalized Person Re-identification through Domain and Feature Hallucinating [88.77196261300699]
We study the problem of federated domain generalization (FedDG) for person re-identification (re-ID).
We propose a novel method, called "Domain and Feature Hallucinating (DFH)", to produce diverse features for learning generalized local and global models.
Our method achieves state-of-the-art performance for FedDG on four large-scale re-ID benchmarks.
arXiv Detail & Related papers (2022-03-05T09:15:13Z) - Federated Multi-Target Domain Adaptation [99.93375364579484]
Federated learning methods enable us to train machine learning models on distributed user data while preserving its privacy.
We consider a more practical scenario where the distributed client data is unlabeled, and a centralized labeled dataset is available on the server.
We propose an effective DualAdapt method to address the new challenges.
arXiv Detail & Related papers (2021-08-17T17:53:05Z) - A Bayesian Federated Learning Framework with Online Laplace Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
arXiv Detail & Related papers (2021-02-03T08:36:58Z)