Strategic Data Re-Uploads: A Pathway to Improved Quantum Classification / Data Re-Uploading Strategies for Improved Quantum Classifier Performance
- URL: http://arxiv.org/abs/2405.09377v2
- Date: Mon, 04 Nov 2024 21:27:44 GMT
- Title: Strategic Data Re-Uploads: A Pathway to Improved Quantum Classification / Data Re-Uploading Strategies for Improved Quantum Classifier Performance
- Authors: S. Aminpour, Y. Banad, S. Sharif
- Abstract summary: Re-uploading classical information into quantum states multiple times can enhance the accuracy of quantum classifiers.
We demonstrate our approach on two classification patterns: a linear classification pattern (LCP) and a non-linear classification pattern (NLCP).
- Abstract: Quantum machine learning (QML) is a promising field that explores the applications of quantum computing to machine learning tasks. A significant hurdle in the advancement of quantum machine learning lies in the development of efficient and resilient quantum classifiers capable of accurately mapping input data to specific, discrete target outputs. In this paper, we propose a novel approach to improve quantum classifier performance by using a data re-uploading strategy. Re-uploading classical information into quantum states multiple times can enhance the accuracy of quantum classifiers. We investigate the effects of different cost functions, such as fidelity and trace distance, on the optimization process and the classification results. We demonstrate our approach on two classification patterns: a linear classification pattern (LCP) and a non-linear classification pattern (NLCP). We evaluate the efficacy of our approach by benchmarking it against four distinct optimization techniques: L-BFGS-B, COBYLA, Nelder-Mead, and SLSQP. Additionally, we study the differing impacts of fixed and random datasets. Our results show that our approach achieves high classification accuracy and robustness and outperforms existing quantum classifier models.
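As a rough illustration of the mechanics described in the abstract (not the authors' implementation), the sketch below trains a single-qubit data re-uploading classifier with a fidelity-based cost and compares the four SciPy optimizers named above. The layer structure, the |0>/|1> label states, and the toy linearly separable dataset are assumptions made purely for illustration.

```python
# Minimal sketch of a single-qubit data re-uploading classifier.
# Illustrative only: the layer structure, label states, and toy dataset
# are assumptions, not the paper's exact setup.
import numpy as np
from scipy.optimize import minimize

def ry(t):
    # Rotation about the Y axis of the Bloch sphere.
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    # Rotation about the Z axis of the Bloch sphere.
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

def classifier_state(x, params, layers=3):
    """Re-upload the 2-feature point x once per layer, interleaved with
    trainable weights and biases, starting from |0>."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for l in range(layers):
        w = params[4 * l: 4 * (l + 1)]
        psi = ry(w[0] * x[0] + w[1]) @ psi   # re-upload feature 0
        psi = rz(w[2] * x[1] + w[3]) @ psi   # re-upload feature 1
    return psi

# Assumed label states: class 0 -> |0>, class 1 -> |1>.
LABEL_STATES = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0])}

def fidelity_cost(params, X, y, layers=3):
    """1 - average fidelity between output states and their label states."""
    loss = 0.0
    for x, label in zip(X, y):
        psi = classifier_state(x, params, layers)
        loss += 1.0 - np.abs(np.vdot(LABEL_STATES[label], psi)) ** 2
    return loss / len(X)

# Toy linearly separable data, standing in for the paper's LCP pattern.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

layers = 3
theta0 = rng.normal(size=4 * layers)
for method in ["L-BFGS-B", "COBYLA", "Nelder-Mead", "SLSQP"]:
    res = minimize(fidelity_cost, theta0, args=(X, y, layers), method=method)
    preds = [int(np.abs(classifier_state(x, res.x, layers)[1]) ** 2 > 0.5) for x in X]
    print(f"{method:12s}  cost={res.fun:.3f}  train_acc={np.mean(np.array(preds) == y):.2f}")
```

A trace-distance cost can be swapped in by replacing the fidelity term with the corresponding distance between the output state and the label state; the optimizer loop stays the same.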
Related papers
- Benchmarking quantum machine learning kernel training for classification tasks [0.0]
This work performs a benchmark study of Quantum Kernel Estimation (QKE) and Quantum Kernel Training (QKT) with a focus on classification tasks.
Two quantum feature mappings, namely ZZFeatureMap and CovariantFeatureMap, are analyzed in this context (see the kernel-estimation sketch after this list).
Experimental results indicate that quantum methods exhibit varying performance across different datasets.
arXiv Detail & Related papers (2024-08-17T10:53:06Z)
- Bayesian Parameterized Quantum Circuit Optimization (BPQCO): A task and hardware-dependent approach [49.89480853499917]
Variational quantum algorithms (VQA) have emerged as a promising quantum alternative for solving optimization and machine learning problems.
In this paper, we experimentally demonstrate the influence of the circuit design on the performance obtained for two classification problems.
We also study the degradation of the obtained circuits in the presence of noise when simulating real quantum computers.
arXiv Detail & Related papers (2024-04-17T11:00:12Z)
- Quantum Data Encoding: A Comparative Analysis of Classical-to-Quantum Mapping Techniques and Their Impact on Machine Learning Accuracy [0.0]
This research explores the integration of quantum data embedding techniques into classical machine learning (ML) algorithms.
Our findings reveal that quantum data embedding contributes to improved classification accuracy and F1 scores.
arXiv Detail & Related papers (2023-11-17T08:00:08Z)
- QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Active Learning on a Programmable Photonic Quantum Processor [6.762439942352232]
Training a quantum machine learning model requires a large labeled dataset, which incurs high labeling and computational costs.
To reduce such costs, a selective training strategy, called active learning (AL), chooses only a subset of the original dataset to learn.
Here, we design and implement two AL-empowered variational quantum classifiers to investigate the potential applications and effectiveness of AL in quantum machine learning.
arXiv Detail & Related papers (2022-08-03T14:34:12Z)
- Variational Quantum Approximate Support Vector Machine With Inference Transfer [0.8057006406834467]
A kernel-based quantum machine learning technique for hyperlinear classification of complex data is presented.
A support vector machine can be realized inherently and explicitly on quantum circuits.
The accuracy of iris data classification reached 98.8%.
arXiv Detail & Related papers (2022-06-29T09:56:59Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive experimental results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Binary classifiers for noisy datasets: a comparative study of existing quantum machine learning frameworks and some new approaches [0.0]
We apply Quantum Machine Learning frameworks to improve binary classification models for noisy datasets, which are prevalent in financial data.
The new models exhibit better learning characteristics under asymmetrical noise in the dataset.
arXiv Detail & Related papers (2021-11-05T10:29:05Z)
- Quantum Machine Learning with SQUID [64.53556573827525]
We present the Scaled QUantum IDentifier (SQUID), an open-source framework for exploring hybrid Quantum-Classical algorithms for classification problems.
We provide examples of using SQUID in a standard binary classification problem from the popular MNIST dataset.
arXiv Detail & Related papers (2021-04-30T21:34:11Z)
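The kernel-benchmarking entry above names Qiskit's ZZFeatureMap. As a rough, hedged illustration of quantum kernel estimation (not the cited paper's setup), the sketch below builds a Gram matrix from exact statevector overlaps and feeds it to a classical SVM; the toy dataset and the precomputed-kernel SVM are assumptions made for illustration.

```python
# Minimal sketch of quantum kernel estimation with Qiskit's ZZFeatureMap.
# Illustrative only: kernel entries are computed by exact statevector overlap,
# and the toy dataset/classifier are assumptions, not the cited benchmark.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit.quantum_info import Statevector
from sklearn.svm import SVC

feature_map = ZZFeatureMap(feature_dimension=2, reps=2)

def embed(x):
    # Statevector prepared by the feature map for one data point.
    return Statevector.from_instruction(feature_map.assign_parameters(x))

def kernel_matrix(A, B):
    # K[i, j] = |<phi(a_i)|phi(b_j)>|^2, the quantum kernel entry.
    sa = [embed(a) for a in A]
    sb = [embed(b) for b in B]
    return np.array([[np.abs(np.vdot(u.data, v.data)) ** 2 for v in sb] for u in sa])

# Toy two-feature dataset (an assumption for illustration).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 2 * np.pi, size=(40, 2))
y = (np.sin(X[:, 0]) * np.cos(X[:, 1]) > 0).astype(int)

K = kernel_matrix(X, X)
svm = SVC(kernel="precomputed").fit(K, y)
print("train accuracy:", svm.score(K, y))
```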
This list is automatically generated from the titles and abstracts of the papers on this site.