MQFL-FHE: Multimodal Quantum Federated Learning Framework with Fully Homomorphic Encryption
- URL: http://arxiv.org/abs/2412.01858v4
- Date: Fri, 14 Feb 2025 09:45:24 GMT
- Title: MQFL-FHE: Multimodal Quantum Federated Learning Framework with Fully Homomorphic Encryption
- Authors: Siddhant Dutta, Nouhaila Innan, Sadok Ben Yahia, Muhammad Shafique, David Esteban Bernal Neira
- Abstract summary: The integration of fully homomorphic encryption (FHE) in federated learning (FL) has led to significant advances in data privacy. We propose a novel multimodal quantum federated learning framework that utilizes quantum computing to counteract the performance drop resulting from FHE. Our results also demonstrate that the quantum-enhanced approach mitigates the performance degradation associated with FHE and improves classification accuracy across diverse datasets, validating the potential of quantum interventions in enhancing privacy in FL.
- Score: 5.713063730561454
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The integration of fully homomorphic encryption (FHE) in federated learning (FL) has led to significant advances in data privacy. However, during the aggregation phase, it often results in performance degradation of the aggregated model, hindering the development of robust representational generalization. In this work, we propose a novel multimodal quantum federated learning framework that utilizes quantum computing to counteract the performance drop resulting from FHE. For the first time in FL, our framework combines a multimodal quantum mixture of experts (MQMoE) model with FHE, incorporating multimodal datasets for enriched representation and task-specific learning. Our MQMoE framework enhances performance on multimodal datasets, including combined genomics and brain MRI scans, especially for underrepresented categories. Our results also demonstrate that the quantum-enhanced approach mitigates the performance degradation associated with FHE and improves classification accuracy across diverse datasets, validating the potential of quantum interventions in enhancing privacy in FL.
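The abstract's central mechanism is that clients encrypt their model updates so the server can average them without ever decrypting. The sketch below makes this concrete; it is a minimal illustration assuming the TenSEAL CKKS library, and the paper's actual FHE backend, encryption parameters, model, and aggregation pipeline are not specified here and may differ.

```python
# Minimal sketch of FHE-based federated aggregation (not the paper's code).
# Assumes the TenSEAL library (pip install tenseal); backend and parameters
# are illustrative choices, not taken from the paper.
import tenseal as ts
import numpy as np

# CKKS context shared by the clients; the server never holds the secret key.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40

def client_update(weights: np.ndarray) -> ts.CKKSVector:
    """Each client encrypts its locally trained weight vector."""
    return ts.ckks_vector(context, weights.tolist())

def server_aggregate(encrypted_updates):
    """FedAvg entirely on ciphertexts: homomorphic sum, then scaling
    by a plaintext scalar. The server never sees plaintext weights."""
    agg = encrypted_updates[0]
    for enc in encrypted_updates[1:]:
        agg = agg + enc                        # ciphertext + ciphertext
    return agg * (1.0 / len(encrypted_updates))  # ciphertext * plaintext

# Toy round with 3 clients and 4 model parameters each.
clients = [client_update(np.random.randn(4)) for _ in range(3)]
global_enc = server_aggregate(clients)
print(global_enc.decrypt())  # only key holders recover the new global model
```

Because CKKS arithmetic is approximate, each encrypted round injects small numerical error into the aggregated model, which is one plausible source of the FHE-induced degradation the abstract describes and that the quantum-enhanced MQMoE is intended to offset.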
Related papers
- Outlier-Robust Multi-Model Fitting on Quantum Annealers [29.24367815462826]
Multi-model fitting (MMF) presents a significant challenge in Computer Vision.
Existing quantum-based approaches for model fitting are either limited to a single model or consider multi-model scenarios within outlier-free datasets.
This paper introduces the robust quantum multi-model fitting (R-QuMF) algorithm, a novel approach that handles outliers effectively.
arXiv Detail & Related papers (2025-04-18T17:59:53Z)
- ClusMFL: A Cluster-Enhanced Framework for Modality-Incomplete Multimodal Federated Learning in Brain Imaging Analysis [28.767460351377462]
In the context of brain imaging analysis, modality incompleteness presents a significant challenge.
We propose ClusMFL, a novel MFL framework that leverages feature clustering for cross-institutional brain imaging analysis.
ClusMFL achieves state-of-the-art performance compared to various baseline methods across varying levels of modality incompleteness.
arXiv Detail & Related papers (2025-02-14T09:33:59Z)
- FedMLLM: Federated Fine-tuning MLLM on Multimodal Heterogeneity Data [64.50893177169996]
Fine-tuning Multimodal Large Language Models (MLLMs) with Federated Learning (FL) allows for expanding the training data scope by including private data sources.
We introduce a benchmark for evaluating various downstream tasks in the federated fine-tuning of MLLMs within multimodal heterogeneous scenarios.
We develop a general FedMLLM framework that integrates four representative FL methods alongside two modality-agnostic strategies.
arXiv Detail & Related papers (2024-11-22T04:09:23Z)
- FactorLLM: Factorizing Knowledge via Mixture of Experts for Large Language Models [50.331708897857574]
We introduce FactorLLM, a novel approach that decomposes well-trained dense FFNs into sparse sub-networks without requiring any further modifications.
FactorLLM achieves performance comparable to the source model, retaining up to 85% of its performance while obtaining over a 30% increase in inference speed.
arXiv Detail & Related papers (2024-08-15T16:45:16Z)
- Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts [54.529880848937104]
We develop a unified MLLM with the MoE architecture, named Uni-MoE, that can handle a wide array of modalities.
Specifically, it features modality-specific encoders with connectors for a unified multimodal representation.
We evaluate the instruction-tuned Uni-MoE on a comprehensive set of multimodal datasets.
arXiv Detail & Related papers (2024-05-18T12:16:01Z)
- FedQNN: Federated Learning using Quantum Neural Networks [3.9554540293311864]
This study explores the innovative domain of Quantum Federated Learning (QFL) as a framework for training Quantum Machine Learning (QML) models via distributed networks.
Our proposed Federated Quantum Neural Network (FedQNN) framework emerges as a cutting-edge solution, integrating the singular characteristics of QML with the principles of classical federated learning (a minimal variational-circuit sketch appears after this list).
arXiv Detail & Related papers (2024-03-16T08:58:03Z)
- Cross-Modal Prototype based Multimodal Federated Learning under Severely Missing Modality [31.727012729846333]
Multimodal Federated Cross Prototype Learning (MFCPL) is a novel approach for MFL under severely missing modalities.
MFCPL provides diverse modality knowledge at the modality-shared level through cross-modal regularization and at the modality-specific level through a cross-modal contrastive mechanism.
Our approach introduces the cross-modal alignment to provide regularization for modality-specific features, thereby enhancing overall performance.
arXiv Detail & Related papers (2024-01-25T02:25:23Z)
- Balanced Multi-modal Federated Learning via Cross-Modal Infiltration [19.513099949266156]
Federated learning (FL) underpins advancements in privacy-preserving distributed computing.
We propose a novel Cross-Modal Infiltration Federated Learning (FedCMI) framework.
arXiv Detail & Related papers (2023-12-31T05:50:15Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
- Learning Multiscale Consistency for Self-supervised Electron Microscopy Instance Segmentation [48.267001230607306]
We propose a pretraining framework that enhances multiscale consistency in EM volumes.
Our approach leverages a Siamese network architecture, integrating strong and weak data augmentations.
It effectively captures voxel and feature consistency, showing promise for learning transferable representations for EM analysis.
arXiv Detail & Related papers (2023-08-19T05:49:13Z)
- Unimodal Training-Multimodal Prediction: Cross-modal Federated Learning with Hierarchical Aggregation [16.308470947384134]
HA-Fedformer is a novel transformer-based model that enables unimodal training, requiring only a unimodal dataset at each client.
We develop an uncertainty-aware aggregation method for the local encoders with layer-wise Markov Chain Monte Carlo sampling.
Our experiments on popular sentiment analysis benchmarks, CMU-MOSI and CMU-MOSEI, demonstrate that HA-Fedformer significantly outperforms state-of-the-art multimodal models.
arXiv Detail & Related papers (2023-03-27T07:07:33Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
Specifically, we construct synthetic sets of data on each client to locally match the loss landscape of the original data.
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Quantization Robust Federated Learning for Efficient Inference on Heterogeneous Devices [18.1568276196989]
Federated Learning (FL) is a paradigm for training machine learning models in a distributed manner on decentralized data that remains on-device.
We introduce multiple variants of federated averaging algorithm that train neural networks robust to quantization.
Our results demonstrate that integrating quantization robustness results in FL models that are significantly more robust to different bit-widths during quantized on-device inference.
arXiv Detail & Related papers (2022-06-22T05:11:44Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
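The quantum federated learning entries above (FedQNN, FedQLSTM, BERT-QTC) all train variational quantum circuits as the client-side model. As a rough illustration only, and not code from any of these papers, the following sketch uses PennyLane (an assumed library; the qubit count, embedding, ansatz, and observable are illustrative choices) to define the kind of parameterized circuit whose weights a QFL server would aggregate.

```python
# Hedged sketch of a variational quantum classifier of the kind trained on
# each client in QFL frameworks such as FedQNN. Assumes PennyLane; all
# architectural choices here are illustrative, not taken from the papers.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, features):
    # Encode classical features into single-qubit rotation angles.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # Trainable entangling layers -- the parameters FL would aggregate.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Read out a single expectation value as the classifier score.
    return qml.expval(qml.PauliZ(0))

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.random(shape, requires_grad=True)
print(circuit(weights, np.array([0.1, 0.2, 0.3, 0.4])))
```

In a federated round, each client would optimize `weights` locally and transmit them (encrypted, in the MQFL-FHE setting) for server-side averaging.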