Characterizing the loss landscape of variational quantum circuits
- URL: http://arxiv.org/abs/2008.02785v2
- Date: Tue, 2 Mar 2021 08:16:29 GMT
- Title: Characterizing the loss landscape of variational quantum circuits
- Authors: Patrick Huembeli, Alexandre Dauphin
- Abstract summary: We introduce a way to compute the Hessian of the loss function of VQCs.
We show how this information can be interpreted and compared to classical neural networks.
- Score: 77.34726150561087
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning techniques enhanced by noisy intermediate-scale quantum
(NISQ) devices and especially variational quantum circuits (VQC) have recently
attracted much interest and have already been benchmarked for certain problems.
Inspired by classical deep learning, VQCs are trained by gradient descent
methods which allow for efficient training over large parameter spaces. For
NISQ-sized circuits, such methods show good convergence. There are however still
many open questions related to the convergence of the loss function and to the
trainability of these circuits in situations of vanishing gradients.
Furthermore, it is not clear how "good" the minima are in terms of
generalization and stability against perturbations of the data and there is,
therefore, a need for tools to quantitatively study the convergence of the
VQCs. In this work, we introduce a way to compute the Hessian of the loss
function of VQCs and show how to characterize the loss landscape with it. The
eigenvalues of the Hessian give information on the local curvature and we
discuss how this information can be interpreted and compared to classical
neural networks. We benchmark our results on several examples, starting with a
simple analytic toy model to provide some intuition about the behavior of the
Hessian, then moving to larger circuits, and also training VQCs on data. Finally,
we show how the Hessian can be used to adjust the learning rate for faster
convergence during the training of variational circuits.
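To make the method concrete, here is a minimal, self-contained sketch of the Hessian-based analysis described in the abstract. It is not the authors' code: the two-qubit ansatz, the single-expectation-value "loss", and the step-size heuristic eta = 1/|lambda_max| are illustrative assumptions; only the double parameter-shift rule for second derivatives and the use of Hessian eigenvalues as local curvature follow the paper's description.

```python
import numpy as np

# --- toy two-qubit ansatz (illustrative, not from the paper) ----------------
def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rx(t):
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]], dtype=complex)

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
Z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))        # Pauli-Z on qubit 0

def loss(theta):
    """Loss = <psi(theta)| Z_0 |psi(theta)> for the toy ansatz."""
    psi = np.zeros(4, dtype=complex)
    psi[0] = 1.0                                      # start in |00>
    psi = np.kron(ry(theta[0]), ry(theta[1])) @ psi
    psi = CNOT @ psi
    psi = np.kron(rx(theta[2]), rx(theta[3])) @ psi
    return float(np.real(psi.conj() @ Z0 @ psi))

# --- Hessian via the double parameter-shift rule ----------------------------
def hessian(f, theta, s=np.pi / 2):
    """Apply the parameter-shift rule twice (valid for Pauli-generated gates)."""
    n = len(theta)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i], np.eye(n)[j]
            H[i, j] = (f(theta + s * (ei + ej)) - f(theta + s * (ei - ej))
                       - f(theta - s * (ei - ej)) + f(theta - s * (ei + ej))) / 4
    return H

theta = np.random.uniform(0, 2 * np.pi, size=4)
H = hessian(loss, theta)
eigs = np.linalg.eigvalsh(H)                          # local curvature spectrum
eta = 1.0 / max(np.abs(eigs).max(), 1e-6)             # heuristic learning rate
print("Hessian eigenvalues:", np.round(eigs, 3))
print("suggested learning rate:", round(eta, 3))
```

In the paper the Hessian is taken of the full training loss over the data; the single expectation value above merely keeps the sketch short.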
Related papers
- On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- Improving Parameter Training for VQEs by Sequential Hamiltonian Assembly [4.646930308096446]
A central challenge in quantum machine learning is the design and training of parameterized quantum circuits (PQCs).
We propose Sequential Hamiltonian Assembly, which iteratively approximates the loss function using local components.
Our approach outperforms conventional parameter training by 29.99% and the empirical state of the art, Layerwise Learning, by 5.12% in mean accuracy.
arXiv Detail & Related papers (2023-12-09T11:47:32Z)
- Backpropagation scaling in parameterised quantum circuits [0.0]
We introduce circuits that are not known to be classically simulable and admit gradient estimation with significantly fewer circuits.
Specifically, these circuits allow for fast estimation of the gradient, higher order partial derivatives and the Fisher information matrix.
In a toy classification problem on 16 qubits, such circuits show competitive performance with other methods, while reducing the training cost by about two orders of magnitude.
arXiv Detail & Related papers (2023-06-26T18:00:09Z)
- Weight Re-Mapping for Variational Quantum Algorithms [54.854986762287126]
We introduce the concept of weight re-mapping for variational quantum circuits (VQCs).
We employ seven distinct weight re-mapping functions to assess their impact on eight classification datasets.
Our results indicate that weight re-mapping can enhance the convergence speed of the VQC (a toy sketch of the idea follows this entry).
arXiv Detail & Related papers (2023-06-09T09:42:21Z)
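A hedged sketch of the weight re-mapping idea summarized above: unbounded trainable weights are passed through a bounded map before being used as rotation angles. The map phi(w) = pi * tanh(w), the one-qubit "circuit", and the squared-error loss are illustrative choices, not the seven functions or the datasets studied in the paper.

```python
import numpy as np

def remap(w):
    """Illustrative re-mapping: squash an unbounded weight into (-pi, pi)."""
    return np.pi * np.tanh(w)

def expectation_z(angle):
    """<Z> after RY(angle) applied to |0>: a one-qubit stand-in for a VQC."""
    return np.cos(angle)

def loss(w, target=-1.0):
    """Squared error between the circuit output and a target label."""
    return (expectation_z(remap(w)) - target) ** 2

# gradient descent on the unbounded weight, gradient by central differences
w, lr, eps = 0.5, 0.1, 1e-6
for _ in range(200):
    grad = (loss(w + eps) - loss(w - eps)) / (2 * eps)
    w -= lr * grad
print("remapped angle:", remap(w), "final loss:", loss(w))
```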
- Problem-Dependent Power of Quantum Neural Networks on Multi-Class Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of QNNs on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Theoretical Error Performance Analysis for Variational Quantum Circuit Based Functional Regression [83.79664725059877]
In this work, we put forth an end-to-end quantum neural network, namely, TTN-VQC, for dimensionality reduction and functional regression.
We also characterize the optimization properties of TTN-VQC by leveraging the Polyak-Lojasiewicz (PL) condition (its standard form is recalled after this entry).
arXiv Detail & Related papers (2022-06-08T06:54:07Z)
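For reference, the Polyak-Lojasiewicz condition invoked above has the following standard form; the constant mu > 0 and the optimal value L^* are generic here, and the specific constants derived for TTN-VQC are in the paper itself.

```latex
% Polyak-Lojasiewicz (PL) condition for a loss L with minimum value L^*
\frac{1}{2}\,\bigl\|\nabla L(\boldsymbol{\theta})\bigr\|^{2}
\;\ge\; \mu\,\bigl(L(\boldsymbol{\theta}) - L^{*}\bigr)
\qquad \text{for all } \boldsymbol{\theta},\ \mu > 0 .
```

Under this condition, gradient descent with a suitable step size converges linearly to the global minimum value, which is why it is a useful handle on VQC optimization.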
- Mode connectivity in the loss landscape of parameterized quantum circuits [1.7546369508217283]
Variational training of parameterized quantum circuits (PQCs) underpins many of the algorithms employed on near-term noisy intermediate-scale quantum (NISQ) devices.
We adapt the qualitative loss landscape characterization for neural networks introduced by Goodfellow et al. and Li et al., together with the connectivity tests of Draxler et al., to study loss landscape features in PQC training (a minimal interpolation sketch follows this entry).
arXiv Detail & Related papers (2021-11-09T18:28:46Z)
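The qualitative landscape characterization mentioned above boils down to evaluating the loss along a straight line between two parameter vectors. A minimal sketch, assuming a generic `loss(theta)` callable and two parameter sets `theta_a`, `theta_b` (all placeholders, not objects from the paper):

```python
import numpy as np

def loss_slice(loss, theta_a, theta_b, num=50):
    """Evaluate the loss on the segment theta(alpha) = (1-alpha)*theta_a + alpha*theta_b."""
    alphas = np.linspace(0.0, 1.0, num)
    values = [loss((1 - a) * theta_a + a * theta_b) for a in alphas]
    return alphas, np.array(values)

# Example with a stand-in quadratic loss and two random parameter vectors:
loss = lambda th: float(np.sum(th ** 2))
theta_a, theta_b = np.random.randn(8), np.random.randn(8)
alphas, values = loss_slice(loss, theta_a, theta_b)
print("barrier height along the path:", values.max() - max(values[0], values[-1]))
```

A flat or low-barrier profile between two trained minima is the "mode connectivity" probed in that paper; the same one-dimensional slice can be applied to any PQC loss.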
- Variational Quantum Classifiers Through the Lens of the Hessian [0.0]
In quantum computing, variational quantum algorithms (VQAs) are well suited for finding optimal combinations of parameters for a given task.
Training VQAs with gradient-descent optimization has shown good convergence.
Just as in classical deep learning, variational quantum circuits suffer from vanishing-gradient problems.
arXiv Detail & Related papers (2021-05-21T06:57:34Z)
- Quantum-enhanced data classification with a variational entangled sensor network [3.1083620257082707]
Supervised learning assisted by an entangled sensor network (SLAEN) is a distinct paradigm that harnesses VQCs trained by classical machine-learning algorithms.
Our work paves a new route for quantum-enhanced data processing and its applications in the NISQ era.
arXiv Detail & Related papers (2020-06-22T01:22:33Z)