Role of stochastic noise and generalization error in the time
propagation of neural-network quantum states
- URL: http://arxiv.org/abs/2105.01054v3
- Date: Sun, 21 Nov 2021 19:49:27 GMT
- Title: Role of stochastic noise and generalization error in the time
propagation of neural-network quantum states
- Authors: Damian Hofmann, Giammarco Fabiani, Johan H. Mentink, Giuseppe Carleo,
Michael A. Sentef
- Abstract summary: Neural-network quantum states (NQS) have been shown to be a suitable variational ansatz to simulate out-of-equilibrium dynamics.
We show that stable and accurate time propagation can be achieved in regimes of sufficiently regularized variational dynamics.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural-network quantum states (NQS) have been shown to be a suitable
variational ansatz to simulate out-of-equilibrium dynamics in two-dimensional
systems using time-dependent variational Monte Carlo (t-VMC). In particular,
stable and accurate time propagation over long time scales has been observed in
the square-lattice Heisenberg model using the Restricted Boltzmann machine
architecture. However, achieving similar performance in other systems has
proven to be more challenging. In this article, we focus on the two-leg
Heisenberg ladder driven out of equilibrium by a pulsed excitation as a
benchmark system. We demonstrate that unmitigated noise is strongly amplified
by the nonlinear equations of motion for the network parameters, which causes
numerical instabilities in the time evolution. As a consequence, the achievable
accuracy of the simulated dynamics is a result of the interplay between network
expressiveness and measures required to remedy these instabilities. We show
that stability can be greatly improved by appropriate choice of regularization.
This is particularly useful as tuning of the regularization typically imposes
no additional computational cost. Inspired by machine learning practice, we
propose a validation-set-based diagnostic tool to help determine optimal
regularization hyperparameters for t-VMC based propagation schemes. For our
benchmark, we show that stable and accurate time propagation can be achieved in
regimes of sufficiently regularized variational dynamics.
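As a rough illustration of the two ideas in the abstract, the sketch below uses plain NumPy with synthetic stand-ins for the Monte Carlo estimates of the quantum geometric tensor S and the force vector F. It shows (i) a diagonal-shift regularized solve of the t-VMC equation of motion S theta_dot = -i F, and (ii) a validation-split scan over the shift hyperparameter that scores each candidate by the residual on held-out sample batches. All function names, tolerances, and the residual-based score are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def solve_regularized(S, F, diag_shift):
    """Solve S theta_dot = -i F with a diagonal-shift regularization,
    which damps noisy, nearly flat directions of the parameter manifold."""
    n = S.shape[0]
    return np.linalg.solve(S + diag_shift * np.eye(n), -1j * F)

def tdvp_residual(S, F, theta_dot):
    """Residual |S theta_dot + i F|^2, a proxy for how well the
    variational motion reproduces the projected exact dynamics."""
    r = S @ theta_dot + 1j * F
    return float(np.real(np.vdot(r, r)))

def validation_diagnostic(samples_S, samples_F, shifts, split=0.5):
    """Split per-batch estimates into a 'training' half used to compute
    theta_dot and a 'validation' half used to score it; return the shift
    with the lowest validation residual, plus all scores."""
    k = int(len(samples_S) * split)
    S_train, F_train = np.mean(samples_S[:k], axis=0), np.mean(samples_F[:k], axis=0)
    S_val, F_val = np.mean(samples_S[k:], axis=0), np.mean(samples_F[k:], axis=0)
    scores = {eps: tdvp_residual(S_val, F_val,
                                 solve_regularized(S_train, F_train, eps))
              for eps in shifts}
    return min(scores, key=scores.get), scores

# Synthetic stand-ins for per-batch Monte Carlo estimates of S and F.
rng = np.random.default_rng(1)
n, batches = 5, 20
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
S_true = A.conj().T @ A  # Hermitian positive semi-definite, like a QGT
F_true = rng.normal(size=n) + 1j * rng.normal(size=n)
samples_S = [S_true + 0.05 * rng.normal(size=(n, n)) for _ in range(batches)]
samples_F = [F_true + 0.05 * rng.normal(size=n) for _ in range(batches)]

best_shift, scores = validation_diagnostic(samples_S, samples_F,
                                           shifts=[1e-6, 1e-4, 1e-2, 1.0])
```

The design mirrors standard ML validation practice: the held-out residual penalizes both under-regularization (noise amplified along flat modes) and over-regularization (dynamics overly damped), so the minimizing shift is a plausible operating point.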
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z) - Neural Projected Quantum Dynamics: a systematic study [0.0]
We address the challenge of simulating unitary quantum dynamics in large systems using Neural Quantum States.
This work offers a comprehensive formalization of the projected time-dependent Variational Monte Carlo (p-tVMC) method.
arXiv Detail & Related papers (2024-10-14T17:01:33Z) - Oscillatory State-Space Models [61.923849241099184]
We propose Linear Oscillatory State-Space models (LinOSS) for efficiently learning on long sequences.
A stable discretization, integrated over time using fast associative parallel scans, yields the proposed state-space model.
We show that LinOSS is universal, i.e., it can approximate any continuous and causal operator mapping between time-varying functions.
arXiv Detail & Related papers (2024-10-04T22:00:13Z) - Unconditional stability of a recurrent neural circuit implementing divisive normalization [0.0]
We prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit.
We show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling.
arXiv Detail & Related papers (2024-09-27T17:46:05Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - Dynamics with autoregressive neural quantum states: application to
critical quench dynamics [41.94295877935867]
We present an alternative general scheme that enables one to capture long-time dynamics of quantum systems in a stable fashion.
We apply the scheme to time-dependent quench dynamics by investigating the Kibble-Zurek mechanism in the two-dimensional quantum Ising model.
arXiv Detail & Related papers (2022-09-07T15:50:00Z) - Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series using neural Stochastic Differential Equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.