Federated Quantum-Train with Batched Parameter Generation
- URL: http://arxiv.org/abs/2409.02763v1
- Date: Wed, 4 Sep 2024 14:39:11 GMT
- Title: Federated Quantum-Train with Batched Parameter Generation
- Authors: Chen-Yu Liu, Samuel Yen-Chi Chen
- Abstract summary: We introduce the Federated Quantum-Train (QT) framework, which integrates the QT model into federated learning.
Our approach significantly reduces qubit usage from 19 to as low as 8 qubits while reducing generalization error.
- Score: 3.697453416360906
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we introduce the Federated Quantum-Train (QT) framework, which integrates the QT model into federated learning to leverage quantum computing for distributed learning systems. Quantum client nodes employ Quantum Neural Networks (QNNs) and a mapping model to generate local target model parameters, which are updated and aggregated at a central node. Testing with a VGG-like convolutional neural network on the CIFAR-10 dataset, our approach significantly reduces qubit usage from 19 to as low as 8 qubits while reducing generalization error. The QT method mitigates overfitting observed in classical models, aligning training and testing accuracy and improving performance in highly compressed models. Notably, the Federated QT framework does not require a quantum computer during inference, enhancing practicality given current quantum hardware limitations. This work highlights the potential of integrating quantum techniques into federated learning, paving the way for advancements in quantum machine learning and distributed learning systems.
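A minimal sketch of the workflow described in the abstract, under stated assumptions: each quantum client trains a small QNN parameter vector plus a mapping-model parameter vector, the central node averages them FedAvg-style, and the compressed target-model weights are then regenerated purely classically, so no quantum computer is needed at inference. Every name, dimension, and update rule below is an illustrative placeholder, not the authors' implementation.

```python
# Hedged sketch of the Federated Quantum-Train workflow from the abstract.
# qnn_probs, mapping_model, local_update, fed_avg, and all sizes/round counts
# are assumptions for illustration only.
import numpy as np

N_QUBITS = 8          # the paper reports qubit usage as low as 8
DIM = 2 ** N_QUBITS   # number of measurement outcomes / weights per batch
rng = np.random.default_rng(0)
FEATURE_MAP = rng.standard_normal((DIM, 32))  # fixed stand-in for the circuit structure


def qnn_probs(theta: np.ndarray) -> np.ndarray:
    """Placeholder QNN: map a small parameter vector to a 2^n measurement distribution."""
    logits = FEATURE_MAP @ np.tanh(theta)
    p = np.exp(logits - logits.max())
    return p / p.sum()


def mapping_model(probs: np.ndarray, phi: np.ndarray) -> np.ndarray:
    """Placeholder mapping model: turn measurement probabilities into target weights."""
    return probs * phi


def local_update(theta, phi, lr=0.01):
    """Mock local step on a client; a real client would use gradients of the
    target-model loss on its local data w.r.t. (theta, phi)."""
    theta = theta - lr * rng.standard_normal(theta.shape)
    phi = phi - lr * rng.standard_normal(phi.shape)
    return theta, phi


def fed_avg(client_params):
    """Central node: element-wise average of the clients' (theta, phi)."""
    return tuple(np.mean(np.stack(p), axis=0) for p in zip(*client_params))


theta, phi = rng.standard_normal(32), rng.standard_normal(DIM)
for _ in range(3):                                                        # communication rounds
    updates = [local_update(theta.copy(), phi.copy()) for _ in range(4)]  # 4 quantum clients
    theta, phi = fed_avg(updates)

# After training, the target-model weights are generated classically from the
# (aggregated) QNN output and mapping model, so inference is quantum-free.
target_weights = mapping_model(qnn_probs(theta), phi)
print(target_weights.shape)  # (256,)
```

Only the small (theta, phi) parameter sets are communicated and aggregated, which is what makes the scheme compatible with standard federated averaging.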
Related papers
- Quantum Convolutional Neural Network: A Hybrid Quantum-Classical Approach for Iris Dataset Classification [0.0]
We present a hybrid quantum-classical machine learning model for classification tasks, integrating a 4-qubit quantum circuit with a classical neural network.
The model was trained for 20 epochs, reaching 100% accuracy on the Iris test set by epoch 16.
This work contributes to the growing body of research on hybrid quantum-classical models and their applicability to real-world datasets.
arXiv Detail & Related papers (2024-10-21T13:15:12Z) - Quantum-Train with Tensor Network Mapping Model and Distributed Circuit Ansatz [0.8192907805418583]
Quantum-Train (QT) is a hybrid quantum-classical machine learning framework.
It maps quantum state measurements to classical neural network weights.
The traditional QT framework employs a multi-layer perceptron (MLP) as this mapping model, but it struggles with scalability and interpretability (a minimal sketch of the mapping appears after this list).
We introduce a distributed circuit ansatz designed for large-scale quantum machine learning with multiple small quantum processing unit nodes.
arXiv Detail & Related papers (2024-09-11T03:51:34Z) - CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
The Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy gains of up to 40% on CIFAR-10 binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z) - Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective [7.7063925534143705]
We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with machine learning algorithms.
QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model.
arXiv Detail & Related papers (2024-05-18T14:35:57Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
arXiv Detail & Related papers (2023-11-23T05:06:43Z) - Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes through a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint that sparsifies the nodes' connections.
QuanGCN is functionally comparable to, or even better than, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that builds on developments in quantum computing to explore large, complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
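The Quantum-Train entries above describe mapping quantum state measurements to classical network weights through an MLP. Below is a minimal sketch of one plausible reading of that mapping: each basis-state bitstring plus its measurement probability is fed to a small MLP that emits a single target weight, so n qubits can cover roughly 2^n weights (n ≈ ⌈log₂ m⌉ for m weights). The per-basis-state scheme and all shapes here are assumptions for illustration, not the papers' exact construction.

```python
# Hedged sketch of a measurement-to-weight mapping in the Quantum-Train style.
# The MLP shapes and the per-basis-state scheme are illustrative assumptions.
import numpy as np


def mlp_mapping(x: np.ndarray, W1, b1, W2, b2) -> np.ndarray:
    """Tiny MLP: (n_qubits + 1)-dim input -> one scalar target weight."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2


def generate_weights(probs: np.ndarray, n_qubits: int, params) -> np.ndarray:
    """Map each basis-state probability to one classical NN weight."""
    W1, b1, W2, b2 = params
    weights = []
    for i, p in enumerate(probs):
        bits = [(i >> k) & 1 for k in range(n_qubits)]  # basis-state bitstring of outcome i
        x = np.array(bits + [p], dtype=float)           # input: bits plus probability
        weights.append(mlp_mapping(x, W1, b1, W2, b2).item())
    return np.array(weights)


n_qubits = 8                                   # 2^8 = 256 weights generated per batch
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(2 ** n_qubits))  # stand-in for QNN measurement probabilities
params = (rng.standard_normal((n_qubits + 1, 16)), np.zeros(16),
          rng.standard_normal((16, 1)), np.zeros(1))
print(generate_weights(probs, n_qubits, params).shape)  # (256,)
```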