Illustration of Barren Plateaus in Quantum Computing
- URL: http://arxiv.org/abs/2602.16558v1
- Date: Wed, 18 Feb 2026 15:56:54 GMT
- Title: Illustration of Barren Plateaus in Quantum Computing
- Authors: Gerhard Stenzel, Tobias Rohe, Michael Kölle, Leo Sünkel, Jonas Stein, Claudia Linnhoff-Popien
- Abstract summary: Variational Quantum Circuits (VQCs) have emerged as a promising paradigm for quantum machine learning in the NISQ era. This paper investigates how parameter sharing fundamentally alters the optimization landscape through deceptive gradients.
- Score: 2.6652671351756125
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Variational Quantum Circuits (VQCs) have emerged as a promising paradigm for quantum machine learning in the NISQ era. While parameter sharing in VQCs can reduce the parameter space dimensionality and potentially mitigate the barren plateau phenomenon, it introduces a complex trade-off that has been largely overlooked. This paper investigates how parameter sharing, despite creating better global optima with fewer parameters, fundamentally alters the optimization landscape through deceptive gradients -- regions where gradient information exists but systematically misleads optimizers away from global optima. Through systematic experimental analysis, we demonstrate that increasing degrees of parameter sharing generate more complex solution landscapes with heightened gradient magnitudes and measurably higher deceptiveness ratios. Our findings reveal that traditional gradient-based optimizers (Adam, SGD) show progressively degraded convergence as parameter sharing increases, with performance heavily dependent on hyperparameter selection. We introduce a novel gradient deceptiveness detection algorithm and a quantitative framework for measuring optimization difficulty in quantum circuits, establishing that while parameter sharing can improve circuit expressivity by orders of magnitude, this comes at the cost of significantly increased landscape deceptiveness. These insights provide important considerations for quantum circuit design in practical applications, highlighting the fundamental mismatch between classical optimization strategies and quantum parameter landscapes shaped by parameter sharing.
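The abstract's notion of "deceptive gradients" and a "deceptiveness ratio" can be illustrated with a toy 1-D sketch. This is a hypothetical stand-in of my own (a synthetic multimodal cost, not the paper's detection algorithm or circuit landscapes): a point is counted as deceptive when a gradient-descent step moves away from the global optimum.

```python
import numpy as np

# Toy 1-D cost landscape with several local minima around one global minimum.
# A made-up stand-in, NOT the paper's quantum-circuit cost function.
def cost(theta):
    return np.sin(3 * theta) * np.exp(-0.1 * theta**2) + 0.05 * theta**2

def numerical_grad(f, theta, eps=1e-5):
    # Central finite difference at a single point.
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

def deceptiveness_ratio(f, lo=-4.0, hi=4.0, n=2001):
    """Fraction of sampled points where a descent step moves AWAY from the
    global optimum -- a crude proxy for the paper's deceptiveness metric."""
    thetas = np.linspace(lo, hi, n)
    theta_star = thetas[np.argmin(f(thetas))]   # global optimum on the grid
    grads = np.array([numerical_grad(f, t) for t in thetas])
    # A descent step is -grad; it is deceptive if its sign points away
    # from theta_star.
    deceptive = np.sign(-grads) != np.sign(theta_star - thetas)
    deceptive &= np.abs(grads) > 1e-8           # ignore flat points
    return deceptive.mean()

print(f"deceptiveness ratio: {deceptiveness_ratio(cost):.2f}")
```

On a multimodal landscape like this one, the ratio sits strictly between 0 and 1: gradients near local minima mislead the optimizer, while gradients near the global basin do not.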
Related papers
- Hyperparameter Trajectory Inference with Conditional Lagrangian Optimal Transport [51.56484100374058]
Post-deployment, user preferences can evolve, making initial settings undesirable. We learn, from observed data, how an NN's conditional output distribution changes with its hyperparameters. We construct a surrogate model that approximates the NN at unobserved hyperparameters.
arXiv Detail & Related papers (2026-03-02T11:55:02Z) - Pattern or Not? QAOA Parameter Heuristics and Potentials of Parsimony [3.230880354632914]
Structured variational quantum algorithms such as the Quantum Approximate Optimisation Algorithm (QAOA) have emerged as leading candidates for exploiting advantages of near-term quantum hardware. We systematically investigate the role of classical parameters in QAOA performance through extensive numerical simulations. Our results demonstrate that: (i) optimal parameters often deviate substantially from expected patterns; (ii) QAOA performance becomes progressively less sensitive to specific parameter choices as depth increases; and (iii) iterative component-wise fixing performs on par with, and at shallow depth may even outperform, several established parameter-selection strategies.
arXiv Detail & Related papers (2025-10-09T12:35:30Z) - Efficient Hyperparameter Tuning via Trajectory Invariance Principle [35.90572735438328]
We identify a phenomenon we call trajectory invariance, where pre-training loss curves, gradient noise, and gradient norm exhibit invariance -- closely overlapping -- with respect to a quantity that combines learning rate and weight decay. This phenomenon effectively reduces the original two-dimensional hyperparameter space to one dimension, yielding an efficient tuning rule. Overall, our work proposes new principles for efficient tuning and inspires future research on scaling laws.
arXiv Detail & Related papers (2025-09-29T17:01:19Z) - Looking elsewhere: improving variational Monte Carlo gradients by importance sampling [41.94295877935867]
Neural-network quantum states (NQS) offer a powerful and expressive ansatz for representing quantum many-body wave functions. It is well known that some scenarios - such as sharply peaked wave functions emerging in quantum chemistry - lead to high-variance gradient estimators hindering the effectiveness of variational optimizations. In this work we investigate a systematic strategy to tackle those sampling issues by means of adaptively tuned importance sampling. Our approach can reduce the computational cost of vanilla VMC considerably, up to a factor of 100x when targeting highly peaked quantum chemistry wavefunctions.
arXiv Detail & Related papers (2025-07-07T18:00:03Z) - Transferring linearly fixed QAOA angles: performance and real device results [0.0]
We investigate a simplified approach that combines linear parameterization with parameter transferring, reducing the parameter space to just 4 dimensions regardless of the number of layers. We compare this combined approach with standard QAOA and other parameter setting strategies such as INTERP and FOURIER, which require computationally demanding incremental layer-by-layer optimization. Our experiments extend from classical simulation to actual quantum hardware implementation on IBM's Eagle processor, demonstrating the approach's viability on current NISQ devices.
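The 4-parameter reduction described above can be sketched as follows. This is a hypothetical illustration of a linear angle schedule (the function name and exact functional form are my own, not necessarily the paper's): every layer's angles are generated from four scalars, independent of circuit depth.

```python
import numpy as np

def linear_qaoa_angles(p, gamma_slope, gamma_intercept, beta_slope, beta_intercept):
    """Generate depth-p QAOA angles from 4 scalars via a linear schedule.
    A toy sketch of a linear parameterization; the paper's exact form
    may differ."""
    layers = np.arange(1, p + 1)
    gammas = gamma_intercept + gamma_slope * layers
    betas = beta_intercept + beta_slope * layers
    return gammas, betas

# The classical optimizer tunes only 4 numbers no matter how deep
# the circuit is: here, depth 8 yields 16 angles from 4 scalars.
gammas, betas = linear_qaoa_angles(8, 0.1, 0.05, -0.08, 0.7)
```

This is what makes transferring cheap: a 4-vector tuned on one instance (or depth) can be reapplied at any other depth by re-evaluating the schedule.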
arXiv Detail & Related papers (2025-04-17T04:17:51Z) - Compact Multi-Threshold Quantum Information Driven Ansatz For Strongly Interactive Lattice Spin Models [0.0]
We introduce a systematic procedure for ansatz building based on approximate Quantum Mutual Information (QMI).
Our approach generates a layered-structured ansatz, where each layer's qubit pairs are selected based on their QMI values, resulting in more efficient state preparation and optimization routines.
Our results show that the Multi-QIDA method reduces the computational complexity while maintaining high precision, making it a promising tool for quantum simulations in lattice spin models.
arXiv Detail & Related papers (2024-08-05T17:07:08Z) - Scaling Exponents Across Parameterizations and Optimizers [94.54718325264218]
We propose a new perspective on parameterization by investigating a key assumption in prior work.
Our empirical investigation includes tens of thousands of models trained with all combinations of the optimizers and parameterizations studied.
We find that the best learning rate scaling prescription would often have been excluded by the assumptions in prior work.
arXiv Detail & Related papers (2024-07-08T12:32:51Z) - Bayesian Parameterized Quantum Circuit Optimization (BPQCO): A task and hardware-dependent approach [49.89480853499917]
Variational quantum algorithms (VQA) have emerged as a promising quantum alternative for solving optimization and machine learning problems.
In this paper, we experimentally demonstrate the influence of the circuit design on the performance obtained for two classification problems.
We also study the degradation of the obtained circuits in the presence of noise when simulating real quantum computers.
arXiv Detail & Related papers (2024-04-17T11:00:12Z) - Dependency Structure Search Bayesian Optimization for Decision Making Models [29.95525433889418]
We propose a compact multi-layered architecture modeling the dynamics of agent interactions through the concept of role.
We show strong empirical results under malformed or sparse reward.
arXiv Detail & Related papers (2023-08-01T15:56:24Z) - Parsimonious Optimisation of Parameters in Variational Quantum Circuits [1.303764728768944]
We propose a novel Quantum-Gradient Sampling that requires the execution of at most two circuits per iteration to update the optimisable parameters.
Our proposed method achieves similar convergence rates to classical gradient descent, and empirically outperforms gradient coordinate descent, and SPSA.
arXiv Detail & Related papers (2023-06-20T18:50:18Z) - Symmetric Pruning in Quantum Neural Networks [111.438286016951]
Quantum neural networks (QNNs) exert the power of modern quantum machines.
QNNs with handcraft symmetric ansatzes generally experience better trainability than those with asymmetric ansatzes.
We propose the effective quantum neural tangent kernel (EQNTK) to quantify the convergence of QNNs towards the global optima.
arXiv Detail & Related papers (2022-08-30T08:17:55Z) - FLIP: A flexible initializer for arbitrarily-sized parametrized quantum circuits [105.54048699217668]
We propose a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs, and instead of relying on a generic set of initial parameters, it is tailored to learn the structure of successful parameters.
We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground state energies of 1D Fermi-Hubbard models.
arXiv Detail & Related papers (2021-03-15T17:38:33Z) - NTopo: Mesh-free Topology Optimization using Implicit Neural Representations [35.07884509198916]
We present a novel machine learning approach to tackle topology optimization problems.
We use multilayer perceptrons (MLPs) to parameterize both density and displacement fields.
As we show through our experiments, a major benefit of our approach is that it enables self-supervised learning of continuous solution spaces.
arXiv Detail & Related papers (2021-02-22T05:25:22Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate Scale Quantum devices.
We propose a strategy for training such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Large gradients via correlation in random parameterized quantum circuits [0.0]
The presence of exponentially vanishing gradients in cost function landscapes is an obstacle to optimization by gradient descent methods.
We prove that reducing the dimensionality of the parameter space can allow one to circumvent the vanishing gradient phenomenon.
arXiv Detail & Related papers (2020-05-25T16:15:53Z)
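The mechanism in the last entry, correlating (sharing) parameters to escape vanishing gradients, can be illustrated with a small numerical chain-rule check. The cost function below is a made-up smooth stand-in (not the paper's circuits): tying every angle to one shared value makes the shared-parameter gradient the sum of the per-angle gradients, so individually small contributions can accumulate into one larger gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12
w = rng.normal(size=n)

# Toy smooth cost over n angles (an illustrative stand-in, not a circuit cost).
def cost(thetas):
    return float(np.prod(np.cos(thetas)) + 0.01 * (w @ thetas))

def per_angle_grad(thetas, eps=1e-6):
    """Central-difference gradient with respect to each angle separately."""
    g = np.zeros(n)
    for i in range(n):
        d = np.zeros(n)
        d[i] = eps
        g[i] = (cost(thetas + d) - cost(thetas - d)) / (2 * eps)
    return g

def shared_grad(s, eps=1e-6):
    """Derivative of the cost when every angle is tied to one shared value s."""
    return (cost(np.full(n, s + eps)) - cost(np.full(n, s - eps))) / (2 * eps)

s = 0.3
# Chain rule: dC/ds equals the sum of per-angle derivatives at theta_i = s,
# i.e. reducing the parameter space correlates the contributions.
total = per_angle_grad(np.full(n, s)).sum()
print(abs(shared_grad(s) - total))  # agrees to numerical precision
```

The same identity underlies why a low-dimensional, correlated parameterization can exhibit gradients that do not vanish even when each per-angle derivative is exponentially small.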
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.