Balanced Multi-modal Federated Learning via Cross-Modal Infiltration
- URL: http://arxiv.org/abs/2401.00894v1
- Date: Sun, 31 Dec 2023 05:50:15 GMT
- Title: Balanced Multi-modal Federated Learning via Cross-Modal Infiltration
- Authors: Yunfeng Fan, Wenchao Xu, Haozhao Wang, Jiaqi Zhu, and Song Guo
- Abstract summary: Federated learning (FL) underpins advancements in privacy-preserving distributed computing.
We propose a novel Cross-Modal Infiltration Federated Learning (FedCMI) framework.
- Score: 19.513099949266156
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) underpins advancements in privacy-preserving
distributed computing by collaboratively training neural networks without
exposing clients' raw data. Current FL paradigms primarily focus on uni-modal
data, while exploiting the knowledge from distributed multimodal data remains
largely unexplored. Existing multimodal FL (MFL) solutions are mainly designed
for statistical or modality heterogeneity on the input side, but have yet to
address the fundamental issue of "modality imbalance" in distributed
settings, which can lead to inadequate information exploitation and
heterogeneous knowledge aggregation across modalities. In this paper, we
propose a novel Cross-Modal Infiltration Federated Learning (FedCMI) framework
that effectively alleviates modality imbalance and knowledge heterogeneity via
knowledge transfer from the global dominant modality. To avoid losing
information in the weak modality by merely imitating the behavior of the
dominant modality, we design a two-projector module that integrates knowledge
from the dominant modality while still promoting local feature exploitation in
the weak modality. In addition, we introduce a class-wise
temperature adaptation scheme to achieve fair performance across different
classes. Extensive experiments on popular datasets confirm that the proposed
framework fully exploits the information of each modality in MFL.
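
The abstract only sketches the two-projector module and the class-wise
temperature adaptation, so the following is a minimal, illustrative sketch
rather than the authors' implementation. It assumes a PyTorch-style setup, and
the names (TwoProjectorHead, transfer_proj, local_proj, classwise_kd_loss) are
hypothetical: one projector of the weak-modality head is distilled toward the
dominant-modality features, the other preserves local weak-modality features,
and the distillation softmax uses a per-class temperature.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoProjectorHead(nn.Module):
    """Hypothetical sketch of the two-projector idea from the abstract:
    `transfer_proj` is meant to be aligned with the dominant modality via
    distillation, while `local_proj` keeps the weak modality's own features."""

    def __init__(self, weak_dim: int, dominant_dim: int, num_classes: int):
        super().__init__()
        self.transfer_proj = nn.Linear(weak_dim, dominant_dim)  # distilled toward dominant modality
        self.local_proj = nn.Linear(weak_dim, dominant_dim)     # preserves local weak-modality features
        self.classifier = nn.Linear(2 * dominant_dim, num_classes)

    def forward(self, weak_feat: torch.Tensor):
        z_transfer = self.transfer_proj(weak_feat)
        z_local = self.local_proj(weak_feat)
        logits = self.classifier(torch.cat([z_transfer, z_local], dim=-1))
        # Return the transfer branch so it can be matched against dominant-modality features.
        return logits, z_transfer


def classwise_kd_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temps: torch.Tensor) -> torch.Tensor:
    """Knowledge distillation with a per-class temperature (one illustrative
    reading of 'class-wise temperature adaptation'); `temps` holds one
    temperature per class and is indexed by each sample's ground-truth label."""
    t = temps[labels].unsqueeze(1)                                   # (B, 1) per-sample temperature
    p_teacher = F.softmax(teacher_logits / t, dim=1)
    log_p_student = F.log_softmax(student_logits / t, dim=1)
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=1)  # per-sample KL
    return (kl * t.squeeze(1) ** 2).mean()                           # standard T^2 scaling, averaged
```

In this sketch the teacher logits would come from the (globally aggregated)
dominant-modality branch and the temperatures would be adapted during local
training; both details are assumptions, as the abstract does not specify the
exact formulation.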