Benchmarking Quantum Surrogate Models on Scarce and Noisy Data
- URL: http://arxiv.org/abs/2306.05042v3
- Date: Sat, 9 Dec 2023 12:26:10 GMT
- Title: Benchmarking Quantum Surrogate Models on Scarce and Noisy Data
- Authors: Jonas Stein, Michael Poppel, Philip Adamczyk, Ramona Fabry, Zixin Wu,
Michael Kölle, Jonas Nüßlein, Daniëlle Schuman, Philipp Altmann,
Thomas Ehmer, Vijay Narasimhan, Claudia Linnhoff-Popien
- Abstract summary: We show that quantum neural networks (QNNs) have the potential to outperform their classical analogs in the presence of scarce and noisy data.
Our contribution presents the first application-centered approach to using QNNs as surrogate models on higher-dimensional, real-world data.
- Score: 4.3956739705582635
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Surrogate models are ubiquitously used in industry and academia to
efficiently approximate given black box functions. As state-of-the-art methods
from classical machine learning frequently struggle to solve this problem
accurately for the often scarce and noisy data sets in practical applications,
investigating novel approaches is of great interest. Motivated by recent
theoretical results indicating that quantum neural networks (QNNs) have the
potential to outperform their classical analogs in the presence of scarce and
noisy data, we benchmark their qualitative performance for this scenario
empirically. Our contribution presents the first application-centered approach
to using QNNs as surrogate models on higher-dimensional, real-world data. When
compared to a classical artificial neural network with a similar number of
parameters, our QNN demonstrates significantly better results for noisy and
scarce data, and thus motivates future work to explore this potential quantum
advantage in surrogate modelling. Finally, we demonstrate the performance of
current NISQ hardware experimentally and estimate the gate fidelities necessary
to replicate our simulation results.
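To make the benchmarked setup concrete, below is a minimal sketch of a QNN surrogate regressor trained on scarce, noisy samples of a black-box function. The framework (PennyLane), the circuit ansatz, the toy target function, and all hyperparameters are illustrative assumptions rather than the paper's exact configuration; the classical baseline network with a comparable parameter count is omitted for brevity.

```python
# Illustrative sketch only: a small QNN fitted as a surrogate to scarce,
# noisy samples of a black-box function. Not the paper's exact model.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(weights, x):
    # Angle-encode the scalar input on every wire, then apply a trainable
    # entangling ansatz; a single Pauli-Z expectation is the surrogate output.
    for q in range(n_qubits):
        qml.RY(x, wires=q)
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

# Scarce, noisy training data: 20 samples of a toy black-box function.
np.random.seed(0)
X = np.random.uniform(-1, 1, 20)
y = np.sin(np.pi * X) + np.random.normal(0, 0.2, 20)  # additive label noise

def mse(weights):
    return sum((qnn(weights, x) - t) ** 2 for x, t in zip(X, y)) / len(X)

shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=n_qubits)
weights = np.random.normal(0, 0.1, size=shape)

opt = qml.AdamOptimizer(stepsize=0.05)
for _ in range(100):
    weights = opt.step(mse, weights)
```

In the benchmark described above, such a model would be compared on held-out data against a classical artificial neural network with a similar number of trainable parameters.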
Related papers
- Data re-uploading in Quantum Machine Learning for time series: application to traffic forecasting [1.2885961238169932]
We present the first application of quantum data re-uploading in the context of transport forecasting.
This technique allows quantum models to better capture complex patterns, such as traffic dynamics, by repeatedly encoding classical data into a quantum state.
Our results show that hybrid models achieve competitive accuracy with state-of-the-art classical methods, especially when the number of qubits and re-uploading blocks is increased.
arXiv Detail & Related papers (2025-01-22T10:21:00Z)
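As a concrete illustration of the re-uploading technique summarized above, here is a minimal sketch (assuming PennyLane; the ansatz, shapes, and gate choices are illustrative, not the traffic-forecasting paper's model): the same classical features are encoded repeatedly, with a trainable block after each encoding.

```python
# Illustrative data re-uploading circuit: encoding and trainable blocks
# alternate, letting the model represent richer functions of the input.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_blocks = 3, 4  # more re-uploading blocks -> richer model class
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def reuploading_model(params, x):
    # params has shape (n_blocks, n_qubits, 2): one rotation pair per
    # qubit per block; x is a length-n_qubits classical feature vector.
    for block in range(n_blocks):
        # (Re-)encode the same classical features...
        qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
        # ...then apply a trainable variational block.
        for q in range(n_qubits):
            qml.RZ(params[block, q, 0], wires=q)
            qml.RY(params[block, q, 1], wires=q)
        for q in range(n_qubits - 1):
            qml.CNOT(wires=[q, q + 1])
    return qml.expval(qml.PauliZ(0))
```

Increasing n_qubits or n_blocks enlarges the accessible function class, which mirrors the summary's observation that accuracy improves with more qubits and re-uploading blocks.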
- Learning Density Functionals from Noisy Quantum Data [0.0]
Noisy intermediate-scale quantum (NISQ) devices are used to generate training data for machine learning (ML) models.
We show that a neural-network ML model can successfully generalize from small datasets subject to noise typical of NISQ algorithms.
Our findings suggest a promising pathway for leveraging NISQ devices in practical quantum simulations.
arXiv Detail & Related papers (2024-09-04T17:59:55Z)
- Coherent Feed Forward Quantum Neural Network [2.1178416840822027]
Quantum machine learning, focusing on quantum neural networks (QNNs), remains a vastly uncharted field of study.
We introduce a bona fide QNN model that matches the versatility of a traditional feed-forward neural network (FFNN) in terms of its adaptable intermediate layers and nodes.
We test our proposed model on various benchmarking datasets such as the diagnostic breast cancer (Wisconsin) and credit card fraud detection datasets.
arXiv Detail & Related papers (2024-02-01T15:13:26Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that quantum circuit Born machines (QCBMs) are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
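For reference, here is a minimal sketch of the QCBM model family named above: a parameterized circuit whose Born distribution over measured bitstrings is fitted to an empirical data distribution. The PennyLane ansatz and the plain squared-error loss are illustrative assumptions; the framework itself evaluates models with more refined generalization-oriented metrics.

```python
# Illustrative quantum circuit Born machine (QCBM): the measured output
# distribution over bitstrings is the generative model being trained.
import pennylane as qml
from pennylane import numpy as np
import numpy as onp  # plain NumPy for the (non-differentiated) data statistics

n_qubits = 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qcbm(weights):
    # The Born distribution over all 2**n_qubits bitstrings is the model.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

# Tiny training set of bitstrings (as integers), mimicking the
# data-limited regime discussed above.
samples = onp.array([0b101, 0b101, 0b010, 0b111])
counts = onp.bincount(samples, minlength=2**n_qubits)
target = counts / counts.sum()

def loss(weights):
    # Squared error between the model and empirical distributions.
    return np.sum((qcbm(weights) - target) ** 2)

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.normal(0, 0.1, size=shape)
opt = qml.AdamOptimizer(stepsize=0.1)
for _ in range(200):
    weights = opt.step(loss, weights)
```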
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
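In the spirit of the didactic single-qubit paper above, here is a minimal sketch of a single-qubit data re-uploading circuit built with Qiskit (the SDK the summary names); the gate sequence, layer count, and parameter values are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative single-qubit data re-uploading circuit in Qiskit:
# the classical input x is encoded once per layer, with trainable
# rotations in between.
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter, ParameterVector

n_layers = 3
x = Parameter("x")                              # classical input, re-uploaded each layer
theta = ParameterVector("theta", 2 * n_layers)  # trainable angles

qc = QuantumCircuit(1)
for layer in range(n_layers):
    qc.ry(x, 0)                       # (re-)encode the data
    qc.rz(theta[2 * layer], 0)        # trainable processing
    qc.ry(theta[2 * layer + 1], 0)
qc.measure_all()

# Bind a concrete input and (here arbitrary) parameter values before running:
values = {x: 0.7}
values.update({theta[i]: 0.1 for i in range(2 * n_layers)})
bound = qc.assign_parameters(values)
```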
- Binary classifiers for noisy datasets: a comparative study of existing quantum machine learning frameworks and some new approaches [0.0]
We apply Quantum Machine Learning frameworks to improve binary classification.
Noisy datasets are common in practice, for instance in finance.
The new models exhibit better learning characteristics under asymmetrical noise in the dataset.
arXiv Detail & Related papers (2021-11-05T10:29:05Z)
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which leads to poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Quantum Self-Supervised Learning [22.953284192004034]
We propose a hybrid quantum-classical neural network architecture for contrastive self-supervised learning.
We apply our best quantum model to classify unseen images on the ibmq_paris quantum computer.
arXiv Detail & Related papers (2021-03-26T18:00:00Z)
- Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z)
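To make the hybrid idea above concrete, here is a minimal sketch (plain NumPy, with hypothetical names) of algorithm unrolling, one common model-based deep learning pattern: gradient steps on a known measurement model are interleaved with a stand-in for a learned prior.

```python
# Illustrative model-based/data-driven hybrid: an unrolled iterative solver
# combining a principled physics update with a placeholder learned component.
import numpy as np

def learned_prior_step(x, w):
    # Placeholder for a trained network (e.g., a denoiser); here a
    # soft-threshold whose threshold w would be learned from data.
    return np.sign(x) * np.maximum(np.abs(x) - w, 0.0)

def unrolled_solver(y, A, w, step, n_iters=10):
    # Model-based part: gradient descent on ||A x - y||^2 using the known
    # measurement model A; data-driven part: learned_prior_step.
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = x - step * A.T @ (A @ x - y)   # principled model-based update
        x = learned_prior_step(x, w)       # learned regularization
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [1.0, -0.5, 2.0]     # sparse ground-truth signal
y = A @ x_true + 0.01 * rng.normal(size=20)

step = 1.0 / np.linalg.norm(A, 2) ** 2     # safe step size for the quadratic
x_hat = unrolled_solver(y, A, w=0.05, step=step)
```

The design point is that the forward model A stays fixed and interpretable, while only the prior/regularization component is learned from data, combining the advantages of both approaches as the summary describes.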