Quantum Time-Series Learning with Evolutionary Algorithms
- URL: http://arxiv.org/abs/2412.17580v1
- Date: Mon, 23 Dec 2024 13:53:35 GMT
- Title: Quantum Time-Series Learning with Evolutionary Algorithms
- Authors: Vignesh Anantharamakrishnan, Márcio M. Taddei
- Abstract summary: Variational quantum circuits have arisen as an important method in quantum computing.
We explore the use of evolutionary algorithms for such optimization, specifically for time-series forecasting.
- Score: 0.0
- License:
- Abstract: Variational quantum circuits have arisen as an important method in quantum computing. A crucial step is parameter optimization, which is typically tackled through gradient-descent techniques. We instead explore the use of evolutionary algorithms for such optimization, specifically for time-series forecasting. We perform a comparison, for diverse instances of real-world data, between gradient-descent parameter optimization and the covariance matrix adaptation evolution strategy (CMA-ES). We observe that in all tested datasets gradient descent becomes permanently trapped in local minima that the evolutionary algorithm avoids, reaching up to a six-fold decrease in prediction error. Finally, the combined use of evolutionary and gradient-based techniques is explored, aiming to retain the advantages of both. The results are particularly applicable in scenarios sensitive to gains in accuracy.
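The comparison can be made concrete with a minimal sketch (not the authors' code). A two-qubit variational circuit forecasts the next point of a toy sine series from a two-point window, and the same mean-squared-error cost is minimized once with CMA-ES (from the open-source `cma` package) and once with plain finite-difference gradient descent. The circuit layout, angle encoding, window size, and data are all assumptions made for this example.

```python
import numpy as np
import cma  # pip install cma -- reference CMA-ES implementation

# --- minimal 2-qubit state-vector simulator ---------------------------------
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def predict(params, window):
    """Angle-encode a 2-point window, apply one variational layer, return <Z_0>."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0
    state = np.kron(ry(window[0]), ry(window[1])) @ state  # data encoding
    state = np.kron(ry(params[0]), ry(params[1])) @ state  # trainable rotations
    state = CNOT @ state                                   # entangling gate
    state = np.kron(ry(params[2]), ry(params[3])) @ state  # trainable rotations
    probs = np.abs(state) ** 2
    return probs[0] + probs[1] - probs[2] - probs[3]       # <Z> on qubit 0

# --- toy time series (stand-in for the paper's real-world datasets) ----------
t = np.linspace(0.0, 4.0 * np.pi, 64)
series = 0.8 * np.sin(t)
windows = np.stack([series[:-2], series[1:-1]], axis=1)
targets = series[2:]

def mse(params):
    preds = np.array([predict(params, w) for w in windows])
    return float(np.mean((preds - targets) ** 2))

# --- CMA-ES: population-based, gradient-free ---------------------------------
es = cma.CMAEvolutionStrategy(4 * [0.0], 0.5, {'maxiter': 150, 'verbose': -9})
while not es.stop():
    cands = es.ask()                         # sample candidate parameter vectors
    es.tell(cands, [mse(c) for c in cands])  # rank them by forecasting error
theta_es = np.array(es.result.xbest)
print("CMA-ES MSE:", mse(theta_es))

# --- plain gradient descent with central finite differences ------------------
rng = np.random.default_rng(0)
theta_gd, eps = rng.uniform(-0.1, 0.1, 4), 1e-4
for _ in range(300):
    grad = np.array([(mse(theta_gd + eps * e) - mse(theta_gd - eps * e)) / (2 * eps)
                     for e in np.eye(4)])
    theta_gd -= 0.2 * grad
print("GD     MSE:", mse(theta_gd))
```

The paper's hybrid variant could be imitated by warm-starting the gradient-descent loop at `theta_es`, letting the evolutionary search choose the basin and gradient descent refine within it.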
Related papers
- Gradient-Variation Online Learning under Generalized Smoothness [56.38427425920781]
Gradient-variation online learning aims to achieve regret guarantees that scale with the variations in the gradients of the online functions.
Recent efforts in neural network optimization suggest a generalized smoothness condition, allowing smoothness to correlate with gradient norms.
We provide applications to fast-rate convergence in games and to extended adversarial optimization.
arXiv Detail & Related papers (2024-08-17T02:22:08Z)
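As a toy illustration of the quantity the entry above scales with (our example; the quadratic losses are an assumption): for f_t(x) = ½‖x − c_t‖², the gradient variation Σ_t sup_x ‖∇f_t(x) − ∇f_{t−1}(x)‖² reduces exactly to Σ_t ‖c_t − c_{t−1}‖², so slowly drifting optima mean small gradient variation.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d, lr = 200, 3, 0.3
c = np.cumsum(0.05 * rng.standard_normal((T, d)), axis=0)  # slowly drifting optima

x, losses, V_T = np.zeros(d), [], 0.0
for t in range(T):
    losses.append(0.5 * np.sum((x - c[t]) ** 2))
    x -= lr * (x - c[t])                       # online gradient descent step
    if t > 0:
        V_T += np.sum((c[t] - c[t - 1]) ** 2)  # exact sup over x for quadratics

x_star = c.mean(axis=0)                        # best fixed point in hindsight
regret = sum(losses) - 0.5 * np.sum((c - x_star) ** 2)
# regret can even be negative here: OGD tracks the drift better than any fixed point
print(f"regret = {regret:.3f}, gradient variation V_T = {V_T:.3f}")
```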
- Variational quantum algorithm for enhanced continuous variable optical phase sensing [0.0]
Variational quantum algorithms (VQAs) are hybrid quantum-classical approaches used for tackling a wide range of problems on noisy quantum devices.
We implement a variational algorithm designed for optimized parameter estimation on a continuous variable platform based on squeezed light.
arXiv Detail & Related papers (2023-12-21T14:11:05Z)
- Learning to learn with an evolutionary strategy applied to variational quantum algorithms [0.0]
Variational Quantum Algorithms (VQAs) employ parameterized quantum circuits $U(\theta)$, optimized using classical methods to minimize a cost function.
In this article, we introduce a novel optimization approach named "Learning to Learn with an Evolutionary Strategy" (LLES).
LLES treats optimization as a learning problem, utilizing recurrent neural networks to iteratively propose VQA parameters.
arXiv Detail & Related papers (2023-10-26T13:55:01Z)
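A schematic of the loop described in the entry above (our sketch; the cell sizes, inputs, and cost are assumptions). In LLES the recurrent weights would themselves be meta-trained with an evolutionary strategy; here they are left random to show only the data flow.

```python
import numpy as np

rng = np.random.default_rng(1)
n_params, n_hidden = 4, 16
W_in = 0.1 * rng.standard_normal((n_hidden, n_params + 1))
W_h = 0.1 * rng.standard_normal((n_hidden, n_hidden))
W_out = 0.1 * rng.standard_normal((n_params, n_hidden))

def cost(theta):
    # stand-in for a VQA cost; a real run would evaluate a quantum circuit
    return float(np.sum(np.sin(theta) ** 2))

theta = rng.uniform(-np.pi, np.pi, n_params)
h = np.zeros(n_hidden)
for step in range(50):
    inp = np.concatenate(([cost(theta)], theta))  # feed current cost and params
    h = np.tanh(W_in @ inp + W_h @ h)             # recurrent state update
    theta = theta + W_out @ h                     # the RNN proposes new parameters
print("final cost:", cost(theta))
```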
- Lottery Tickets in Evolutionary Optimization: On Sparse Backpropagation-Free Trainability [0.0]
We study gradient descent (GD)-based sparse training and evolution strategies (ES).
We find that ES explore diverse and flat local optima and do not preserve linear mode connectivity across sparsity levels and independent runs.
arXiv Detail & Related papers (2023-05-31T15:58:54Z)
- Improving Gradient Methods via Coordinate Transformations: Applications to Quantum Machine Learning [0.0]
Machine learning algorithms heavily rely on gradient-based optimization methods, such as gradient descent and the like.
Their overall performance depends on the appearance of local minima and barren plateaus, which slow down calculations and lead to non-optimal solutions.
In this paper we introduce a generic strategy to accelerate and improve the overall performance of such methods, alleviating the effect of barren plateaus and local minima.
arXiv Detail & Related papers (2023-04-13T18:26:05Z)
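A sketch of the general idea in the entry above (ours, not the paper's specific transformation): re-express the parameters as θ = Qφ for an orthogonal Q and run gradient descent in φ; refreshing Q changes the descent directions, which can help leave regions where the raw coordinates yield vanishing gradients.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 6

def cost(theta):  # stand-in for a quantum-machine-learning cost landscape
    return float(np.sum(np.sin(theta) ** 2) + 0.1 * np.sum(theta ** 2))

def num_grad(f, x, eps=1e-5):
    return np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                     for e in np.eye(len(x))])

theta = rng.uniform(-np.pi, np.pi, d)
for epoch in range(5):
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # fresh orthogonal frame
    phi = Q.T @ theta                                 # coordinates in that frame
    for _ in range(100):
        phi -= 0.1 * (Q.T @ num_grad(cost, Q @ phi))  # chain rule: grad w.r.t. phi
    theta = Q @ phi
print("final cost:", cost(theta))
```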
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient in terms of both iteration complexity and function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
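A minimal sketch of the two ingredients the entry above names, zeroth-order estimation plus coordinate importance sampling (our reading; the score heuristic and weighting are hypothetical, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(3)
d, mu, lr, q = 8, 1e-4, 0.2, 3   # dimension, smoothing, step size, queries per step

def f(x):                        # black box: only function values are available
    return float(np.sum((x - 1.0) ** 2) + 0.5 * np.sum(np.abs(x)))

x = np.zeros(d)
score = np.ones(d)               # running per-coordinate importance scores
for step in range(400):
    p = score / score.sum()
    idx = rng.choice(d, size=q, replace=False, p=p)  # importance-sampled coords
    g = np.zeros(d)
    for i in idx:
        e = np.zeros(d); e[i] = 1.0
        gi = (f(x + mu * e) - f(x - mu * e)) / (2 * mu)  # coordinate-wise FD
        g[i] = gi / (q * p[i])                     # importance-weighted estimate
        score[i] = 0.9 * score[i] + 0.1 * abs(gi)  # steep coords get sampled more
    x -= lr * g
print("f(x) =", f(x), "| function queries:", 400 * 2 * q)
```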
- Natural Evolutionary Strategies for Variational Quantum Computation [0.7874708385247353]
Natural evolutionary strategies (NES) are a family of gradient-free black-box optimization algorithms.
This study illustrates their use for the optimization of randomly-initialized parametrized quantum circuits (PQCs) in the region of vanishing gradients.
arXiv Detail & Related papers (2020-11-30T21:23:38Z)
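A sketch of the standard NES search-gradient estimator on a cost with a deliberately flat region (the cost function and hyperparameters are our assumptions): sample Gaussian perturbations, rank-normalize their costs, and step against the correlation between perturbations and cost. Only cost evaluations are needed, so the estimator keeps producing useful directions where analytic gradients nearly vanish.

```python
import numpy as np

rng = np.random.default_rng(4)
d, sigma, lr, pop = 6, 0.2, 0.3, 20

def cost(theta):  # stand-in for a PQC cost; nearly flat away from the optimum
    return float(1.0 - np.prod(np.cos(theta / 2) ** 2))

theta = rng.uniform(2.5, 3.0, d)           # start in the nearly flat region
for step in range(200):
    eps = rng.standard_normal((pop, d))    # Gaussian search directions
    costs = np.array([cost(theta + sigma * e) for e in eps])
    shaped = (costs - costs.mean()) / (costs.std() + 1e-8)  # fitness shaping
    g = (eps.T @ shaped) / (pop * sigma)   # NES search-gradient estimate
    theta -= lr * g                        # move against the estimated gradient
print("final cost:", cost(theta))
```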
arXiv Detail & Related papers (2020-11-30T21:23:38Z) - Adaptive Gradient Method with Resilience and Momentum [120.83046824742455]
We propose an Adaptive Gradient Method with Resilience and Momentum (AdaRem).
AdaRem adjusts the parameter-wise learning rate according to whether the direction of a parameter's past changes is aligned with the direction of the current gradient.
Our method outperforms previous adaptive learning-rate algorithms in terms of training speed and test error.
arXiv Detail & Related papers (2020-10-21T14:49:00Z)
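A toy reading of the rule the entry above describes (hypothetical; not the paper's exact update): keep a per-parameter moving average of past updates and amplify or damp each learning rate according to whether that history agrees in sign with the current descent direction.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 10
A = rng.standard_normal((d, d))
A = A.T @ A + np.eye(d)            # random convex quadratic: f(x) = 0.5 x^T A x

x = rng.standard_normal(d)
m = np.zeros(d)                    # moving average of past parameter updates
base_lr, beta = 0.02, 0.9
for step in range(300):
    g = A @ x                      # gradient of the quadratic
    agree = np.sign(m) == np.sign(-g)          # history vs. current descent dir
    lr = base_lr * np.where(agree, 1.5, 0.5)   # amplify aligned, damp oscillating
    update = -lr * g
    m = beta * m + (1 - beta) * update
    x += update
print("distance to optimum:", np.linalg.norm(x))
```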
arXiv Detail & Related papers (2020-10-21T14:49:00Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
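A sketch of that staged scheme on a stand-in cost (our illustration; PECT's actual subset selection and ansatz updates are more involved): partition the parameters and run one small optimization per stage, freezing the rest.

```python
import numpy as np

rng = np.random.default_rng(6)
d = 12

def cost(theta):  # separable stand-in for a variational-circuit cost
    return float(np.sum(np.sin(theta - 0.7) ** 2))

def partial_grad(f, x, idx, eps=1e-5):
    g = np.zeros(len(x))
    for i in idx:                  # only active parameters get a gradient
        e = np.zeros(len(x)); e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

theta = rng.uniform(-np.pi, np.pi, d)
stages = rng.permutation(d).reshape(4, 3)      # partition parameters into stages
for stage, active in enumerate(stages):
    for _ in range(150):                       # a small variational run per stage
        theta -= 0.2 * partial_grad(cost, theta, active)
    print(f"stage {stage}: cost = {cost(theta):.4f}")
```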
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm provably achieves the best-available convergence rate for objectives without the Polyak-Łojasiewicz (PL) condition while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z) - Towards Better Understanding of Adaptive Gradient Algorithms in
Generative Adversarial Nets [71.05306664267832]
Adaptive algorithms perform gradient updates using the history of gradients and are ubiquitous in training deep neural networks.
In this paper we analyze a variant of the Optimistic Adagrad algorithm for non-concave min-max problems.
Our experiments show that the advantage of adaptive over non-adaptive gradient algorithms in GAN training can be observed empirically.
arXiv Detail & Related papers (2019-12-26T22:10:10Z)
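A toy combination of the two ingredients the entry above names, optimism and per-coordinate adaptivity (our illustration, not the paper's algorithm or analysis), on the bilinear saddle min_x max_y xy, where plain simultaneous gradient steps spiral outward:

```python
import numpy as np

x, y = 1.0, 1.0
gx_prev, gy_prev = 0.0, 0.0
sx, sy, lr, eps = 0.0, 0.0, 0.1, 1e-8
for t in range(500):
    gx, gy = y, -x                   # descent grads for min player x, max player y
    sx += gx * gx                    # Adagrad-style accumulators
    sy += gy * gy
    # optimistic step: 2*current - previous gradient, adaptively scaled
    x -= lr * (2 * gx - gx_prev) / (np.sqrt(sx) + eps)
    y -= lr * (2 * gy - gy_prev) / (np.sqrt(sy) + eps)
    gx_prev, gy_prev = gx, gy
print("distance to the saddle (0, 0):", np.hypot(x, y))
```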
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.