Enhancing Gradient Variance and Differential Privacy in Quantum Federated Learning
- URL: http://arxiv.org/abs/2509.05377v1
- Date: Thu, 04 Sep 2025 15:29:52 GMT
- Title: Enhancing Gradient Variance and Differential Privacy in Quantum Federated Learning
- Authors: Duc-Thien Phan, Minh-Duong Nguyen, Quoc-Viet Pham, Huilong Pi,
- Abstract summary: Quantum Federated Learning (QFL) with a Quantum Neural Network (QNN) as the local model has recently confronted notable challenges. We propose a new QFL technique that incorporates differential privacy and introduces a dedicated noise estimation strategy. We show that our algorithm effectively balances convergence, reduces communication costs, and mitigates the adverse effects of intermediate quantum noise.
- Score: 5.608916223269914
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Upon integrating a Quantum Neural Network (QNN) as the local model, Quantum Federated Learning (QFL) has recently confronted notable challenges. Firstly, exploration over sharp minima is hindered, decreasing learning performance. Secondly, the steady gradient descent results in more stable and predictable model transmissions over wireless channels, making the model more susceptible to attacks from adversarial entities. Additionally, the local QFL model is vulnerable to noise produced by the quantum device's intermediate noise states, since training requires the use of quantum gates and circuits. This local noise becomes intertwined with learning parameters during training, impairing model precision and convergence rate. To address these issues, we propose a new QFL technique that incorporates differential privacy and introduces a dedicated noise estimation strategy to quantify and mitigate the impact of intermediate quantum noise. Furthermore, we design an adaptive noise generation scheme to alleviate privacy threats associated with the vanishing gradient variance phenomenon of QNNs and to enhance robustness against device noise. Experimental results demonstrate that our algorithm effectively balances convergence, reduces communication costs, and mitigates the adverse effects of intermediate quantum noise while maintaining strong privacy protection. Using real-world datasets, we achieved test accuracy of up to 98.47% on the MNIST dataset and 83.85% on the CIFAR-10 dataset while maintaining fast execution times.
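The mechanism the abstract describes, clipping local updates and adding Gaussian noise whose amplitude adapts when gradient variance vanishes, can be illustrated with a minimal sketch. This is a standard Gaussian DP perturbation, not the paper's exact algorithm; the function name, clipping bound, base noise multiplier, and the variance-based adaptation rule are all assumptions for illustration.

```python
import numpy as np

def dp_perturb_update(update, clip_norm=1.0, base_sigma=0.8,
                      grad_var=None, var_floor=1e-3):
    """Clip a local model update and add Gaussian noise (illustrative DP sketch).

    If the observed gradient variance is small (the vanishing-variance regime
    the abstract mentions), the noise amplitude is raised so transmitted
    updates stay unpredictable to adversaries.
    """
    update = np.asarray(update, dtype=float)
    # Clip the update to bound each client's sensitivity.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    # Adapt the noise scale: lower observed gradient variance -> larger noise
    # (an assumed adaptation rule, not the paper's).
    sigma = base_sigma * clip_norm
    if grad_var is not None:
        sigma *= max(1.0, float(np.sqrt(var_floor / max(grad_var, 1e-12))))
    noisy = clipped + np.random.normal(0.0, sigma, size=clipped.shape)
    return noisy, sigma
```

A client would call this on its model delta before transmission; the server averages the noisy updates as usual, trading a controlled accuracy loss for privacy.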
Related papers
- Continual Quantum Architecture Search with Tensor-Train Encoding: Theory and Applications to Signal Processing [68.35481158940401]
CL-QAS is a continual quantum architecture search framework. It mitigates the challenges of costly amplitude encoding and forgetting in variational quantum circuits. It achieves controllable robustness and expressivity, sample-efficient generalization, and smooth convergence without barren plateaus.
arXiv Detail & Related papers (2026-01-10T02:36:03Z) - Differentially Private Federated Quantum Learning via Quantum Noise [9.540961602976965]
Quantum federated learning (QFL) enables collaborative training of quantum machine learning (QML) models across distributed quantum devices without raw data exchange. QFL remains vulnerable to adversarial attacks, where shared QML model updates can be exploited to undermine information privacy. This paper explores a novel DP mechanism that harnesses quantum noise to safeguard quantum models throughout the QFL process.
arXiv Detail & Related papers (2025-08-27T22:56:16Z) - Sporadic Federated Learning Approach in Quantum Environment to Tackle Quantum Noise [1.2026018242953707]
SpoQFL dynamically adjusts training strategies based on noise fluctuations. Experiments on real-world datasets demonstrate that SpoQFL significantly outperforms conventional QFL approaches.
arXiv Detail & Related papers (2025-07-15T20:30:11Z) - Provably Robust Training of Quantum Circuit Classifiers Against Parameter Noise [49.97673761305336]
Noise remains a major obstacle to achieving reliable quantum algorithms. We present a provably noise-resilient training theory and algorithm to enhance the robustness of parameterized quantum circuit classifiers.
arXiv Detail & Related papers (2025-05-24T02:51:34Z) - Do we really have to filter out random noise in pre-training data for language models? [38.30911856420568]
Pre-training text data curated from the Internet inevitably contains random noise caused by decoding errors or unregulated web content. We show that the resulting increase in the loss of next-token prediction (NTP) was significantly lower than the proportion of random noise, even when the model was scaled up to 2.7B. We introduce a novel plug-and-play Local Gradient Matching loss, which explicitly enhances the denoising capability of the downstream task head by aligning the gradients of normal and perturbed features without requiring knowledge of the model's parameters.
arXiv Detail & Related papers (2025-02-10T16:01:55Z) - Noise-resistant adaptive Hamiltonian learning [30.632260870411177]
An adaptive Hamiltonian learning (AHL) model for data analysis and quantum state simulation is proposed to overcome problems such as low efficiency. A noise-resistant quantum neural network (RQNN) based on AHL is developed, which improves the noise robustness of the quantum neural network.
arXiv Detail & Related papers (2025-01-14T11:12:59Z) - Bayesian Quantum Amplitude Estimation [46.03321798937855]
We present BAE, a problem-tailored and noise-aware Bayesian algorithm for quantum amplitude estimation. In a fault-tolerant scenario, BAE is capable of saturating the Heisenberg limit; if device noise is present, BAE can dynamically characterize it and self-adapt. We propose a benchmark for amplitude estimation algorithms and use it to test BAE against other approaches.
arXiv Detail & Related papers (2024-12-05T18:09:41Z) - Compressed-sensing Lindbladian quantum tomography with trapped ions [44.99833362998488]
Characterizing the dynamics of quantum systems is a central task for the development of quantum information processors.
We propose two different improvements of Lindbladian quantum tomography (LQT) that alleviate previous shortcomings.
arXiv Detail & Related papers (2024-03-12T09:58:37Z) - Reconfigurable Intelligent Surface (RIS)-Assisted Entanglement Distribution in FSO Quantum Networks [62.87033427172205]
Quantum networks (QNs) relying on free-space optical (FSO) quantum channels can support quantum applications in environments where establishing an optical fiber infrastructure is challenging and costly.
A reconfigurable intelligent surface (RIS)-assisted FSO-based QN is proposed as a cost-efficient framework providing a virtual line-of-sight between users for entanglement distribution.
arXiv Detail & Related papers (2024-01-19T17:16:40Z) - Noise-Agnostic Quantum Error Mitigation with Data Augmented Neural Models [9.023862258563893]
We build a neural model that achieves quantum error mitigation without prior knowledge of the noise and without training on noise-free data. Our approach applies to quantum circuits and to the dynamics of many-body and continuous-variable quantum systems.
arXiv Detail & Related papers (2023-11-03T05:52:14Z) - Amplitude-Varying Perturbation for Balancing Privacy and Utility in Federated Learning [86.08285033925597]
This paper presents a new DP perturbation mechanism with a time-varying noise amplitude to protect the privacy of federated learning.
We derive an online refinement of the series to prevent FL from premature convergence resulting from excessive perturbation noise.
The contribution of the new DP mechanism to the convergence and accuracy of privacy-preserving FL is corroborated, compared to the state-of-the-art Gaussian noise mechanism with a persistent noise amplitude.
arXiv Detail & Related papers (2023-03-07T22:52:40Z) - QuantumNAT: Quantum Noise-Aware Training with Noise Injection, Quantization and Normalization [19.822514659801616]
Parameterized Quantum Circuits (PQC) are promising towards quantum advantage on near-term quantum hardware. However, due to large quantum noise (errors), the performance of PQC models degrades severely on real quantum devices. We present QuantumNAT, a PQC-specific framework to perform noise-aware optimizations in both training and inference stages to improve robustness.
arXiv Detail & Related papers (2021-10-21T17:59:19Z)
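Several of the entries above (SpoQFL, QuantumNAT, the provably robust training paper) share a common idea: inject device-like noise during training so the learned parameters sit in regions that are flat with respect to that noise. A minimal hypothetical sketch of parameter-noise injection, with an assumed noise scale and toy objective, not any of these papers' actual algorithms:

```python
import numpy as np

def noise_aware_train_step(params, grad_fn, lr=0.1, noise_std=0.05, rng=None):
    """One gradient step with Gaussian parameter-noise injection (illustrative).

    The gradient is evaluated at a noise-perturbed copy of the parameters, so
    optimization favors solutions that remain good under device-like noise.
    """
    rng = rng or np.random.default_rng(0)
    perturbed = params + rng.normal(0.0, noise_std, size=params.shape)
    grad = grad_fn(perturbed)   # gradient at the noisy parameter copy
    return params - lr * grad   # update the clean parameters

# Toy example: minimizing f(p) = ||p||^2 stays stable despite injected noise.
params = np.array([1.0, -2.0])
for _ in range(50):
    params = noise_aware_train_step(params, lambda p: 2 * p)
```

The same pattern extends to quantum settings by perturbing circuit rotation angles with a hardware-calibrated noise model instead of a fixed Gaussian.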
This list is automatically generated from the titles and abstracts of the papers in this site.