A Review of Barren Plateaus in Variational Quantum Computing
- URL: http://arxiv.org/abs/2405.00781v1
- Date: Wed, 1 May 2024 18:00:10 GMT
- Title: A Review of Barren Plateaus in Variational Quantum Computing
- Authors: Martin Larocca, Supanut Thanasilp, Samson Wang, Kunal Sharma, Jacob Biamonte, Patrick J. Coles, Lukasz Cincio, Jarrod R. McClean, Zoƫ Holmes, M. Cerezo
- Abstract summary: Variational quantum computing offers a flexible computational paradigm with applications in diverse areas.
A key obstacle to realizing its potential is the Barren Plateau (BP) phenomenon.
This article provides a comprehensive review of the current understanding of the BP phenomenon.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational quantum computing offers a flexible computational paradigm with applications in diverse areas. However, a key obstacle to realizing its potential is the Barren Plateau (BP) phenomenon. When a model exhibits a BP, its parameter optimization landscape becomes exponentially flat and featureless as the problem size increases. Importantly, all the moving pieces of an algorithm -- choices of ansatz, initial state, observable, loss function and hardware noise -- can lead to BPs when ill-suited. Due to the significant impact of BPs on trainability, researchers have dedicated considerable effort to developing theoretical and heuristic methods to understand and mitigate their effects. As a result, the study of BPs has become a thriving area of research, influencing and cross-fertilizing other fields such as quantum optimal control, tensor networks, and learning theory. This article provides a comprehensive review of the current understanding of the BP phenomenon.
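The "exponentially flat" landscape the abstract describes can be observed numerically. Below is a minimal, self-contained sketch (assuming NumPy; the ansatz choice, depth scaling, and all helper names are illustrative assumptions, not taken from the review): it simulates a layered RY + CZ circuit, evaluates the gradient of a global Z⊗...⊗Z cost via the parameter-shift rule, and prints how the gradient variance over random parameters shrinks as the qubit count grows -- the barren plateau signature.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    # single-qubit Y rotation, exp(-i theta Y / 2), which is real-valued
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    # apply a 2x2 gate to one qubit of an n-qubit statevector
    state = state.reshape([2] * n)
    state = np.moveaxis(state, qubit, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cz(state, q1, q2, n):
    # controlled-Z: flip the sign of amplitudes where both qubits are |1>
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    state[tuple(idx)] *= -1
    return state.reshape(-1)

def cost(params, n, layers):
    # |0...0> -> layers of per-qubit RY plus a line of CZs;
    # cost is the expectation of the *global* observable Z (x) ... (x) Z
    state = np.zeros(2 ** n)
    state[0] = 1.0
    p = iter(params)
    for _ in range(layers):
        for q in range(n):
            state = apply_single(state, ry(next(p)), q, n)
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    diag = np.array([1.0, -1.0])
    for _ in range(n - 1):
        diag = np.kron(diag, np.array([1.0, -1.0]))
    return float(np.dot(state, diag * state))

def grad0(params, n, layers):
    # parameter-shift rule for the first RY angle
    plus, minus = params.copy(), params.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (cost(plus, n, layers) - cost(minus, n, layers))

for n in (2, 4, 6):
    layers = n  # depth grows with width, as in typical BP demonstrations
    grads = [grad0(rng.uniform(0, 2 * np.pi, n * layers), n, layers)
             for _ in range(200)]
    print(n, np.var(grads))  # variance drops sharply as n grows
```

This only mimics the qualitative behavior the review analyzes rigorously; the exact decay rate depends on the ansatz, the observable's locality, and the circuit depth.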
Related papers
- Estimates of loss function concentration in noisy parametrized quantum circuits [0.0]
We introduce a new analytical formulation that enables precise calculation of the variance in deep quantum circuits.
We show the emergence of a noise-induced absorption mechanism, a phenomenon that cannot arise in the purely reversible context of unitary quantum computing.
Our framework applies to both unitary and non-unitary dynamics, allowing us to establish a deeper connection between the noise resilience of PQCs and the potential to enhance their expressive power.
arXiv Detail & Related papers (2024-10-02T18:00:09Z)
- Deep Attentive Belief Propagation: Integrating Reasoning and Learning for Solving Constraint Optimization Problems [24.63675651321079]
Belief Propagation (BP) is an important message-passing algorithm for various reasoning tasks over graphical models.
We propose a novel self-supervised learning algorithm for DABP with a smoothed solution cost.
Our model significantly outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2022-09-24T13:03:46Z)
- Experimental validation of the Kibble-Zurek Mechanism on a Digital Quantum Computer [62.997667081978825]
The Kibble-Zurek mechanism captures the essential physics of nonequilibrium quantum phase transitions with symmetry breaking.
We experimentally tested the KZM for the simplest quantum case, a single qubit under the Landau-Zener evolution.
We report on extensive IBM-Q experiments on individual qubits embedded in different circuit environments and topologies.
arXiv Detail & Related papers (2022-08-01T18:00:02Z)
- Backpropagation at the Infinitesimal Inference Limit of Energy-Based Models: Unifying Predictive Coding, Equilibrium Propagation, and Contrastive Hebbian Learning [41.58529335439799]
How the brain performs credit assignment is a fundamental unsolved problem in neuroscience.
Many 'biologically plausible' algorithms have been proposed, which compute gradients that approximate those computed by backpropagation (BP).
arXiv Detail & Related papers (2022-05-31T20:48:52Z)
- A Theoretical View of Linear Backpropagation and Its Convergence [55.69505060636719]
Backpropagation (BP) is widely used for calculating gradients in deep neural networks (DNNs).
Recently, a linear variant of BP named LinBP was introduced for generating more transferable adversarial examples for performing black-box attacks.
We provide theoretical analyses on LinBP in neural-network-involved learning tasks, including adversarial attack and model training.
arXiv Detail & Related papers (2021-12-21T07:18:00Z)
- Where Should We Begin? A Low-Level Exploration of Weight Initialization Impact on Quantized Behaviour of Deep Neural Networks [93.4221402881609]
We present an in-depth, fine-grained ablation study of the effect of different weight initializations on the final distributions of weights and activations across different CNN architectures.
To the best of our knowledge, we are the first to perform such a low-level, in-depth quantitative analysis of weight initialization and its effect on quantized behaviour.
arXiv Detail & Related papers (2020-11-30T06:54:28Z)
- Higher Order Derivatives of Quantum Neural Networks with Barren Plateaus [0.0]
We show that the elements of the Hessian are exponentially suppressed in a Barren Plateau (BP).
BPs will impact optimization strategies that go beyond (first-order) gradient descent.
We prove novel, general formulas that can be used to analytically evaluate any high-order partial derivative on quantum hardware.
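As a hedged illustration of such higher-order derivative formulas (the toy cost and helper names below are assumptions for exposition, not the paper's actual constructions): for circuits built from Pauli rotations, a Hessian entry can be obtained by applying the standard π/2 parameter-shift rule twice. A toy cost cos(θ0)·cos(θ1) has exactly the trigonometric structure of such a circuit expectation, so the nested shifts reproduce its analytic mixed derivative:

```python
import numpy as np

def cost(theta):
    # stands in for a circuit expectation <psi(theta)| H |psi(theta)>
    # generated by two independent Pauli rotations
    return np.cos(theta[0]) * np.cos(theta[1])

def shift(theta, i, s):
    # return a copy of theta with parameter i shifted by s
    out = np.array(theta, dtype=float)
    out[i] += s
    return out

def hessian_entry(cost, theta, i, j):
    # nested parameter-shift rule: apply the (+pi/2, -pi/2) shift
    # once per differentiation index and combine the four evaluations
    s = np.pi / 2
    return 0.25 * (cost(shift(shift(theta, i, s), j, s))
                   - cost(shift(shift(theta, i, s), j, -s))
                   - cost(shift(shift(theta, i, -s), j, s))
                   + cost(shift(shift(theta, i, -s), j, -s)))

theta = np.array([0.3, 1.1])
# analytic d^2/(d theta0 d theta1) of cos(theta0)cos(theta1) is sin(theta0)sin(theta1)
print(hessian_entry(cost, theta, 0, 1), np.sin(0.3) * np.sin(1.1))
```

The same nesting also works on the diagonal (i = j), where the four terms collapse to [C(θ+π) - 2C(θ) + C(θ-π)]/4; only the circuit evaluations, not the combination rule, change on real hardware.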
arXiv Detail & Related papers (2020-08-17T16:27:23Z)
- Quantum Non-equilibrium Many-Body Spin-Photon Systems [91.3755431537592]
This dissertation concerns the quantum dynamics of strongly-correlated quantum systems in out-of-equilibrium states.
Our main results can be summarized in three parts: Signature of Critical Dynamics, Driven Dicke Model as a Test-bed of Ultra-Strong Coupling, and Beyond the Kibble-Zurek Mechanism.
arXiv Detail & Related papers (2020-07-23T19:05:56Z)
- Belief Propagation Neural Networks [103.97004780313105]
We introduce belief propagation neural networks (BPNNs).
BPNNs operate on factor graphs and generalize belief propagation (BP).
We show that BPNNs converge 1.7x faster on Ising models while providing tighter bounds.
On challenging model counting problems, BPNNs compute estimates hundreds of times faster than state-of-the-art handcrafted methods.
arXiv Detail & Related papers (2020-07-01T07:39:51Z)
- A Theoretical Framework for Target Propagation [75.52598682467817]
We analyze target propagation (TP), a popular but not yet fully understood alternative to backpropagation (BP).
Our theory shows that TP is closely related to Gauss-Newton optimization and thus substantially differs from BP.
We provide a first solution to this problem through a novel reconstruction loss that improves feedback weight training.
arXiv Detail & Related papers (2020-06-25T12:07:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.