Data-Driven Time Propagation of Quantum Systems with Neural Networks
- URL: http://arxiv.org/abs/2201.11647v1
- Date: Thu, 27 Jan 2022 17:08:30 GMT
- Title: Data-Driven Time Propagation of Quantum Systems with Neural Networks
- Authors: James Nelson, Luuk Coopmans, Graham Kells and Stefano Sanvito
- Abstract summary: We investigate the potential of supervised machine learning to propagate a quantum system in time.
We show that neural networks can work as time propagators at any time in the future and that they can be concatenated in time forming an autoregression.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate the potential of supervised machine learning to propagate a
quantum system in time. While Markovian dynamics can be learned easily, given a
sufficient amount of data, non-Markovian systems are non-trivial and their
description requires knowledge of past states (memory). Here we analyse the
features of such memory by taking a simple 1D Heisenberg model as the many-body
Hamiltonian, and construct a non-Markovian description by representing the
system over the single-particle reduced density matrix. The number of past
states required for this representation to reproduce the time-dependent
dynamics is found to grow exponentially with the number of spins and with the
density of the system spectrum. Most importantly, we demonstrate that neural
networks can work as time propagators at any time in the future and that they
can be concatenated in time forming an autoregression. Such neural-network
autoregression can be used to generate long-time and arbitrarily dense time
trajectories. Finally, we investigate the time resolution needed to represent
the system memory. We find two regimes: for fine memory samplings the memory
needed remains constant, while longer memories are required for coarse
samplings, although the total number of time steps remains constant. The
boundary between these two regimes is set by the period corresponding to the
highest frequency in the system spectrum, demonstrating that neural networks can
overcome the limitation set by the Shannon-Nyquist sampling theorem.
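As an illustration of the scheme described above, the following is a minimal sketch of a windowed, autoregressive neural-network propagator in PyTorch. This is not the authors' code: the window length M, the 4x4 reduced-density-matrix size, and all layer widths are illustrative assumptions. An MLP maps a window of M past single-particle reduced density matrices, flattened into real vectors, to the state one time step ahead; feeding each prediction back into the window concatenates the propagator in time.

```python
# Minimal sketch (assumptions, not the authors' code) of the windowed,
# autoregressive propagator described in the abstract above.
import torch
import torch.nn as nn

M = 8          # number of past states kept as memory (hypothetical)
D = 2 * 4 * 4  # real + imaginary entries of a 4x4 reduced density matrix

class PropagatorMLP(nn.Module):
    """Maps M flattened past states to the state one time step ahead."""
    def __init__(self, mem: int = M, dim: int = D, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(mem * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, M, D) -> predicted next state: (batch, D)
        return self.net(window.flatten(start_dim=1))

@torch.no_grad()
def autoregress(model: nn.Module, seed: torch.Tensor, steps: int) -> torch.Tensor:
    """Concatenate the propagator in time: each prediction is fed back into
    the memory window, generating an arbitrarily long trajectory."""
    window = seed.clone()                           # (1, M, D) known history
    traj = [window[:, i] for i in range(window.shape[1])]
    for _ in range(steps):
        nxt = model(window)                         # one time step ahead
        traj.append(nxt)
        window = torch.cat([window[:, 1:], nxt.unsqueeze(1)], dim=1)
    return torch.stack(traj, dim=1)                 # (1, M + steps, D)

model = PropagatorMLP()
seed = torch.randn(1, M, D)   # stand-in for M past reduced density matrices
print(autoregress(model, seed, steps=100).shape)    # torch.Size([1, 108, 32])
```

In practice such a model would first be trained on pairs of past-state windows and future states obtained from exact propagation; here an untrained network is used only to show the autoregressive loop.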
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Deep learning delay coordinate dynamics for chaotic attractors from
partial observable data [0.0]
We utilize deep artificial neural networks to learn discrete time maps and continuous time flows of the partial state.
We demonstrate the capacity of deep ANNs to predict chaotic behavior from a scalar observation on a manifold of dimension three via the Lorenz system.
arXiv Detail & Related papers (2022-11-20T19:25:02Z) - From Tensor Network Quantum States to Tensorial Recurrent Neural
Networks [0.0]
We show that any matrix product state (MPS) can be exactly represented by a recurrent neural network (RNN) with a linear memory update (a minimal sketch of such a linear update follows this list).
We generalize this RNN architecture to 2D lattices using a multilinear memory update.
arXiv Detail & Related papers (2022-06-24T16:25:36Z) - A quantum generative model for multi-dimensional time series using
Hamiltonian learning [0.0]
We propose using the inherent ability of quantum computers to simulate quantum dynamics as a technique to encode such time-series features.
We use the learned model to generate out-of-sample time series and show that it captures unique and complex features of the learned time series.
We experimentally demonstrate the proposed algorithm on an 11-qubit trapped-ion quantum machine.
arXiv Detail & Related papers (2022-04-13T03:06:36Z) - Deep recurrent networks predicting the gap evolution in adiabatic
quantum computing [0.0]
We explore the potential of deep learning for discovering a mapping from the parameters that fully identify a problem Hamiltonian to the parametric dependence of the gap.
We show that a long short-term memory network succeeds in predicting the gap when the parameter space scales linearly with system size.
Remarkably, we show that once this architecture is combined with a convolutional neural network to deal with the spatial structure of the model, the gap evolution can even be predicted for system sizes larger than the ones seen by the neural network during training.
arXiv Detail & Related papers (2021-09-17T12:08:57Z) - Astrocytes mediate analogous memory in a multi-layer neuron-astrocytic
network [52.77024349608834]
We show how a piece of information can be maintained as a robust activity pattern for several seconds and then completely disappear if no other stimuli arrive.
This kind of short-term memory can keep operative information for seconds, then completely forget it to avoid overlapping with forthcoming patterns.
We show how arbitrary patterns can be loaded, then stored for a certain interval of time, and retrieved if the appropriate clue pattern is applied to the input.
arXiv Detail & Related papers (2021-08-31T16:13:15Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Continuous and time-discrete non-Markovian system-reservoir
interactions: Dissipative coherent quantum feedback in Liouville space [62.997667081978825]
We investigate a quantum system simultaneously exposed to two structured reservoirs.
We employ a numerically exact quasi-2D tensor network combining both diagonal and off-diagonal system-reservoir interactions with a twofold memory for continuous and discrete retardation effects.
As a possible example, we study the non-Markovian interplay between discrete photonic feedback and structured acoustic phonon modes, resulting in emerging inter-reservoir correlations and long-lived population trapping within an initially excited two-level system.
arXiv Detail & Related papers (2020-11-10T12:38:35Z) - Quantum Long Short-Term Memory [3.675884635364471]
Long short-term memory (LSTM) is a kind of recurrent neural network (RNN) for modeling sequential data with temporal dependencies.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
arXiv Detail & Related papers (2020-09-03T16:41:09Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
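To make the linear memory update mentioned in the tensor-network entry above concrete, here is a minimal PyTorch sketch. It is an illustration under assumptions, not the paper's architecture; vocab, chi, and all names are hypothetical. The hidden state is updated purely by matrix multiplication, one learnable transfer matrix per local spin value, which is exactly how a matrix product state amplitude is contracted site by site.

```python
# Minimal sketch (assumptions, not the paper's code) of an RNN whose memory
# update is linear, mirroring an MPS contraction.
import torch
import torch.nn as nn

class LinearMemoryRNN(nn.Module):
    """h <- A[x_t] @ h: one transfer matrix per local spin value, with no
    nonlinearity in the memory update (exactly an MPS contraction)."""
    def __init__(self, vocab: int = 2, chi: int = 8):
        super().__init__()
        # One chi x chi matrix per local basis state, like the site tensors
        # of an MPS with bond dimension chi
        self.A = nn.Parameter(torch.randn(vocab, chi, chi) / chi**0.5)
        self.h0 = nn.Parameter(torch.randn(chi))   # left boundary vector
        self.out = nn.Parameter(torch.randn(chi))  # right boundary vector

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len,) integer spin configuration -> scalar amplitude
        h = self.h0
        for s in x:            # linear memory update, one site at a time
            h = self.A[s] @ h
        return self.out @ h    # contract with the right boundary

model = LinearMemoryRNN()
spins = torch.tensor([0, 1, 1, 0])  # a 4-site spin configuration
print(model(spins))                 # amplitude assigned to this configuration
```

A conventional RNN would apply a nonlinearity to each update; keeping the update linear is what makes the correspondence with an MPS of bond dimension chi exact.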
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.