Benchmarking Quantum Models for Time-series Forecasting
- URL: http://arxiv.org/abs/2412.13878v1
- Date: Wed, 18 Dec 2024 14:17:17 GMT
- Title: Benchmarking Quantum Models for Time-series Forecasting
- Authors: Caitlin Jones, Nico Kraus, Pallavi Bhardwaj, Maximilian Adler, Michael Schrödl-Baumann, David Zambrano Manrique
- Abstract summary: We compare classical and quantum models for time series forecasting. Most of the quantum models were able to achieve comparable results. These results serve as a useful point of comparison for the field of forecasting with quantum machine learning.
- Score: 0.3806074545662052
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series forecasting is a valuable tool for many applications, such as stock price prediction, demand forecasting, or logistical optimization. There are many well-established statistical and machine learning models that are used for this purpose. Recently, in the field of quantum machine learning, many candidate models for forecasting have been proposed; however, in the absence of theoretical grounds for advantage, thorough benchmarking is essential for scientific evaluation. To this end, we performed a benchmarking study of various quantum models, both gate-based and annealing-based, on real data, comparing them to state-of-the-art classical approaches, including extensive hyperparameter optimization. Overall we found that the best classical models outperformed the best quantum models. Most of the quantum models were able to achieve comparable results, and for one data set two quantum models outperformed the classical ARIMA model. These results serve as a useful point of comparison for the field of forecasting with quantum machine learning.
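A benchmark like the one described in the abstract boils down to fitting each model on a training split and scoring one-step-ahead forecasts on a held-out tail. The following is a minimal illustrative sketch, not the paper's code: it compares a least-squares AR(1) model (a toy stand-in for ARIMA) against a naive last-value forecast using mean absolute error; all function names are hypothetical.

```python
# Illustrative benchmark loop: AR(1) vs. naive last-value forecast,
# scored by mean absolute error (MAE) on a held-out tail of the series.

def ar1_fit(train):
    """Fit y_t = a * y_{t-1} + b by ordinary least squares."""
    x, y = train[:-1], train[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    return a, my - a * mx

def mae(pred, true):
    return sum(abs(p - t) for p, t in zip(pred, true)) / len(true)

def benchmark(series, n_test):
    """Score both models on the last n_test points of the series."""
    train, test = series[:-n_test], series[-n_test:]
    a, b = ar1_fit(train)
    # one-step-ahead forecasts, each conditioned on the previous *observed* value
    prev = [train[-1]] + test[:-1]
    ar1_pred = [a * p + b for p in prev]
    naive_pred = prev  # naive baseline: repeat the last observation
    return {"AR(1)": mae(ar1_pred, test), "naive": mae(naive_pred, test)}
```

On a trending series the AR(1) model beats the naive baseline; a full study like the paper's would repeat this loop over many models, data sets, and hyperparameter settings.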
Related papers
- Quantum vs. classical: A comprehensive benchmark study for predicting time series with variational quantum machine learning [0.0]
Variational quantum machine learning algorithms have been proposed as promising tools for time series prediction.
We present a benchmark study comparing a range of variational quantum algorithms and classical machine learning models for time series forecasting.
arXiv Detail & Related papers (2025-04-16T18:29:00Z) - Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models.
This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space.
The results demonstrate an advantage for the quantum version, as evidenced by better metrics for its generated images.
arXiv Detail & Related papers (2025-01-19T21:24:02Z) - Application of time-series quantum generative model to financial data [1.2289361708127877]
A time-series generative model was applied as a quantum generative model to actual financial data.
It was observed that fewer parameter values were required compared with the classical method.
arXiv Detail & Related papers (2024-05-20T05:29:45Z) - Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z) - Quantum Machine Learning for Credit Scoring [0.0]
We explore the use of quantum machine learning (QML) applied to credit scoring for small and medium-sized enterprises (SMEs).
A quantum/classical hybrid approach has been used with several models, activation functions, epochs and other parameters.
We observe significantly more efficient training for the quantum models than for the classical models: the quantum model reaches comparable prediction performance after 350 epochs versus 3500 epochs.
arXiv Detail & Related papers (2023-08-07T13:27:30Z) - Benchmarking Quantum Surrogate Models on Scarce and Noisy Data [4.3956739705582635]
We show that quantum neural networks (QNNs) have the potential to outperform their classical analogs in the presence of scarce and noisy data.
Our contribution presents the first application-centered approach to using QNNs as surrogate models on higher-dimensional, real-world data.
arXiv Detail & Related papers (2023-06-08T08:49:58Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - Predictive Models from Quantum Computer Benchmarks [0.0]
Holistic benchmarks for quantum computers are essential for testing and summarizing the performance of quantum hardware.
We introduce a general framework for building predictive models from benchmarking data using capability models.
Our case studies use data from cloud-accessible quantum computers and simulations of noisy quantum computers.
arXiv Detail & Related papers (2023-05-15T17:00:23Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
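The single-qubit data re-uploading idea above can be illustrated with a plain classical simulation: the same input x is encoded repeatedly through trainable rotation gates. This is a minimal sketch under my own naming, not code from the paper; `reupload_model` and its `(w, b)` parameterization are illustrative assumptions.

```python
import math

def ry(theta):
    """Single-qubit Ry rotation as a real 2x2 matrix."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -s], [s, c]]

def apply(gate, state):
    """Apply a 2x2 gate to a 2-component state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def reupload_model(x, params):
    """Data re-uploading with a single qubit.

    params is a list of (w, b) pairs; each layer applies Ry(w*x + b),
    so the same input x is 're-uploaded' once per layer.
    """
    state = [1.0, 0.0]           # start in |0>
    for w, b in params:
        state = apply(ry(w * x + b), state)
    return abs(state[0]) ** 2    # probability of measuring |0>
```

Training then amounts to adjusting the (w, b) pairs so that the measured probability tracks the target labels; on real hardware the matrix product is replaced by gate execution and sampling.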
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Sparse MoEs meet Efficient Ensembles [49.313497379189315]
We study the interplay of two popular classes of such models: ensembles of neural networks and sparse mixtures of experts (sparse MoEs).
We present Efficient Ensemble of Experts (E$^3$), a scalable and simple ensemble of sparse MoEs that takes the best of both classes of models, while using up to 45% fewer FLOPs than a deep ensemble.
arXiv Detail & Related papers (2021-10-07T11:58:35Z) - Nearest Centroid Classification on a Trapped Ion Quantum Computer [57.5195654107363]
We design a quantum Nearest Centroid classifier, using techniques for efficiently loading classical data into quantum states and performing distance estimations.
We experimentally demonstrate it on an 11-qubit trapped-ion quantum machine, matching the accuracy of classical nearest centroid classifiers for the MNIST handwritten digits dataset and achieving up to 100% accuracy for 8-dimensional synthetic data.
arXiv Detail & Related papers (2020-12-08T01:10:30Z)
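For reference, the classical baseline that the quantum nearest centroid classifier above is matched against is very simple: compute one mean vector per class, then assign each query to the class with the closest centroid. A minimal sketch (illustrative, not from any of the listed papers):

```python
# Classical nearest-centroid classification.

def fit_centroids(X, y):
    """Compute the per-class mean (centroid) of the training vectors."""
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        if yi not in sums:
            sums[yi], counts[yi] = [0.0] * len(xi), 0
        sums[yi] = [s + v for s, v in zip(sums[yi], xi)]
        counts[yi] += 1
    return {c: [s / counts[c] for s in sums[c]] for c in sums}

def predict(centroids, x):
    """Assign x to the class whose centroid is closest in Euclidean distance."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda c: dist2(centroids[c], x))
```

The quantum version replaces the distance estimation step with a quantum subroutine after loading the classical vectors into quantum states.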
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.