Deep Time-Delay Reservoir Computing: Dynamics and Memory Capacity
- URL: http://arxiv.org/abs/2006.06322v2
- Date: Tue, 25 Aug 2020 09:11:43 GMT
- Title: Deep Time-Delay Reservoir Computing: Dynamics and Memory Capacity
- Authors: Mirko Goldmann, Felix Köster, Kathy Lüdge and Serhiy Yanchuk
- Abstract summary: We present how the dynamical properties of a deep Ikeda-based reservoir are related to its memory capacity (MC).
We show how the MC is related to the system's distance to bifurcations or to the magnitude of the conditional Lyapunov exponents.
Numerical simulations show resonances between the clock cycle and the delays of the layers in all degrees of the MC.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Deep Time-Delay Reservoir Computing concept utilizes unidirectionally
connected systems with time-delays for supervised learning. We present how the
dynamical properties of a deep Ikeda-based reservoir are related to its memory
capacity (MC) and how that can be used for optimization. In particular, we
analyze bifurcations of the corresponding autonomous system and compute
conditional Lyapunov exponents, which measure the generalized synchronization
between the input and the layer dynamics. We show how the MC is related to the
system's distance to bifurcations or to the magnitude of the conditional Lyapunov
exponent. The interplay of different dynamical regimes leads to an adjustable
distribution between linear and nonlinear MC. Furthermore, numerical
simulations show resonances between the clock cycle and the delays of the layers
in all degrees of the MC. Contrary to MC losses in single-layer reservoirs, these
resonances can boost separate degrees of the MC and can be used, e.g., to
design a system with maximum linear MC. Accordingly, we present two
configurations that enable either high nonlinear MC or long-time linear MC.
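As a rough illustration of the concept, the sketch below simulates a two-layer, unidirectionally coupled reservoir built from a discrete Ikeda-type map and estimates its linear MC with a ridge-regression readout. The map, input masks, layer sizes, and parameter values are illustrative assumptions, not the paper's continuous-time delay system.

```python
# Toy surrogate of a deep (two-layer) Ikeda-based reservoir + linear MC estimate.
import numpy as np

rng = np.random.default_rng(0)
T, n_virt = 4000, 50                        # time steps, virtual nodes per layer
u = rng.uniform(-1, 1, T)                   # scalar input sequence
mask1, mask2 = rng.uniform(-1, 1, (2, n_virt))

def ikeda_layer(drive, eta=0.8, phi=0.2):
    # Discrete Ikeda-type map per node: x_t = eta * sin^2(x_{t-1} + drive_t + phi)
    x = np.zeros_like(drive)
    for t in range(1, len(drive)):
        x[t] = eta * np.sin(x[t - 1] + drive[t] + phi) ** 2
    return x

# Unidirectional coupling: layer 2 is driven by layer 1's state, not by u directly.
x1 = ikeda_layer(np.outer(u, mask1))
x2 = ikeda_layer(np.outer(x1.mean(axis=1), mask2))
states = np.hstack([x1, x2])

def memory_capacity(states, u, max_lag=40, washout=200, reg=1e-6):
    mc = 0.0
    for k in range(1, max_lag + 1):
        S, y = states[washout:], np.roll(u, k)[washout:]    # target: u(t - k)
        w = np.linalg.solve(S.T @ S + reg * np.eye(S.shape[1]), S.T @ y)
        mc += np.corrcoef(S @ w, y)[0, 1] ** 2              # MC_k = squared correlation
    return mc

print("linear MC ~", round(memory_capacity(states, u), 2))
```

Summing the squared correlations over lags k gives the total linear MC; the nonlinear degrees of the MC would use products of delayed inputs as targets instead of u(t - k).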
Related papers
- Integrating Multimodal Data for Joint Generative Modeling of Complex Dynamics [6.848555909346641]
We provide an efficient framework to combine various sources of information for optimal reconstruction.
Our framework is fully generative, producing, after training, trajectories with the same geometrical and temporal structure as those of the ground-truth system.
arXiv Detail & Related papers (2022-12-15T15:21:28Z)
- Data-Driven Time Propagation of Quantum Systems with Neural Networks [0.0]
We investigate the potential of supervised machine learning to propagate a quantum system in time.
We show that neural networks can work as time propagators at any time in the future and that they can be concatenated in time, forming an autoregression.
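A minimal sketch of that autoregressive use of a one-step propagator, with a fixed rotation matrix standing in for the trained network (the propagator, initial state, and step count here are hypothetical stand-ins):

```python
# Autoregression: each output of the one-step propagator becomes the next input.
import numpy as np

def propagate(step, state0, n_steps):
    states = [state0]
    for _ in range(n_steps):
        states.append(step(states[-1]))    # feed the output back in
    return np.array(states)

theta = 0.1
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # stand-in for a trained network
trajectory = propagate(lambda s: P @ s, np.array([1.0, 0.0]), 100)
print(trajectory[-1])
```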
arXiv Detail & Related papers (2022-01-27T17:08:30Z)
- Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
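A toy sketch of an explicit duration count variable driving regime switches; the two regimes, noise level, and duration range are illustrative assumptions, not the paper's learned model:

```python
# The system stays in the current regime while the counter is positive and is
# forced to switch when it reaches zero (explicit, time-dependent switching).
import numpy as np

rng = np.random.default_rng(1)
regimes = [lambda x: 0.9 * x + 0.1, lambda x: -0.5 * x]   # two toy dynamics
x, z, count = 0.0, 0, rng.integers(5, 15)
trace = []
for t in range(60):
    x = regimes[z](x) + 0.01 * rng.standard_normal()
    count -= 1
    if count == 0:                     # duration exhausted: switch regime
        z = 1 - z
        count = rng.integers(5, 15)    # draw a new duration
    trace.append((t, z, round(x, 3)))
print(trace[:10])
```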
arXiv Detail & Related papers (2021-10-26T17:35:21Z)
- Master memory function for delay-based reservoir computers with single-variable dynamics [0.0]
We show that many delay-based reservoir computers can be characterized by a universal master memory function (MMF).
Once computed for two independent parameters, this function provides linear memory capacity for any delay-based single-variable reservoir with small inputs.
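To illustrate the small-input regime in which such a linear description applies, the sketch below simulates a linearized single-variable delay reservoir and sums squared input-state correlations over lags; the recurrence, delay, and slope values are illustrative assumptions, not the MMF itself:

```python
# Linearized reservoir: x_t = alpha * x_{t-tau} + eps * u_t (small-input regime).
import numpy as np

rng = np.random.default_rng(2)
T, tau = 6000, 7
u = rng.uniform(-1, 1, T)

def linear_mc(alpha, eps=1e-2, max_lag=30, washout=100):
    x = np.zeros(T)
    for t in range(tau, T):
        x[t] = alpha * x[t - tau] + eps * u[t]
    mc = 0.0
    for k in range(1, max_lag + 1):
        c = np.corrcoef(x[washout:], np.roll(u, k)[washout:])[0, 1]
        mc += c ** 2                   # memory of the input k steps back
    return mc

for alpha in (0.3, 0.6, 0.9):          # effective feedback slope, the key parameter
    print(alpha, round(linear_mc(alpha), 3))
```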
arXiv Detail & Related papers (2021-08-28T13:17:24Z)
- Adaptive Machine Learning for Time-Varying Systems: Low Dimensional Latent Space Tuning [91.3755431537592]
We present a recently developed method of adaptive machine learning for time-varying systems.
Our approach is to map very high-dimensional (N > 100k) inputs into a low-dimensional (N = 2) latent space at the output of the encoder section of an encoder-decoder CNN.
This method allows us to learn correlations within the data and to track their evolution in real time based on feedback, without interruptions.
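A linear, PCA-style autoencoder can stand in for the encoder-decoder CNN to show the mapping to a two-dimensional latent code; the data, dimensions, and linear encoder below are assumptions for illustration only:

```python
# High-dimensional samples -> 2D latent code -> reconstruction (linear stand-in).
import numpy as np

rng = np.random.default_rng(3)
X = 0.01 * rng.standard_normal((500, 1000))                 # 500 samples, N = 1000
X += np.outer(np.sin(np.linspace(0, 6, 500)), rng.standard_normal(1000))  # slow drift

mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
encode = lambda x: (x - mean) @ Vt[:2].T     # "encoder": N -> 2 latent coordinates
decode = lambda z: z @ Vt[:2] + mean         # "decoder": 2 -> N reconstruction

z = encode(X)                  # 2D latent trajectory that can be tracked and tuned
print(z.shape, float(np.mean((decode(z) - X) ** 2)))
```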
arXiv Detail & Related papers (2021-07-13T16:05:28Z)
- Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
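For contrast, the standard numerical baseline such a method competes with is a time-ordered product of short-time matrix exponentials; the qubit Hamiltonian and drive below are illustrative, and this sketch is not the paper's Dyson-expansion scheme:

```python
# Piecewise-constant propagation of a driven qubit (standard numerical baseline).
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def propagator(drive, dt=0.01):
    U = np.eye(2, dtype=complex)
    for eps in drive:
        H = 0.5 * sz + eps * sx          # H(t) = sz/2 + drive(t) * sx
        U = expm(-1j * H * dt) @ U       # time-ordered product, left to right
    return U

drive = 0.3 * np.sin(2.0 * np.linspace(0, 10, 1000))
print(np.round(propagator(drive), 3))
```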
arXiv Detail & Related papers (2020-12-16T21:43:38Z)
- Continuous and time-discrete non-Markovian system-reservoir interactions: Dissipative coherent quantum feedback in Liouville space [62.997667081978825]
We investigate a quantum system simultaneously exposed to two structured reservoirs.
We employ a numerically exact quasi-2D tensor network combining both diagonal and off-diagonal system-reservoir interactions with a twofold memory for continuous and discrete retardation effects.
As a possible example, we study the non-Markovian interplay between discrete photonic feedback and structured acoustic phonon modes, resulting in emerging inter-reservoir correlations and long-lived population trapping within an initially excited two-level system.
arXiv Detail & Related papers (2020-11-10T12:38:35Z)
- Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
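A minimal sketch of the composition idea, with exact flow maps of a toy linear system standing in for the trained neural-network time-steppers; the system, step sizes, and greedy decomposition of the horizon are assumptions:

```python
# Compose a coarse stepper (large dt) with a fine stepper (small dt).
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-1.0, -0.05]])   # damped oscillator
step_fine = expm(A * 0.01)                   # stand-in "network" trained at dt = 0.01
step_coarse = expm(A * 1.0)                  # stand-in "network" trained at dt = 1.0

def advance(x, t):
    # Greedy decomposition of the horizon t into coarse then fine steps.
    n_coarse, rem = divmod(t, 1.0)
    for _ in range(int(n_coarse)):
        x = step_coarse @ x
    for _ in range(int(round(rem / 0.01))):
        x = step_fine @ x
    return x

x0 = np.array([1.0, 0.0])
print(advance(x0, 3.57), expm(A * 3.57) @ x0)   # hierarchical vs exact flow map
```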
arXiv Detail & Related papers (2020-08-22T07:16:53Z)
- Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry; we call the resulting models TRS-ODENs.
We show that, even for systems that do not possess full time-reversal symmetry, TRS-ODENs achieve better predictive performance than baselines.
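A sketch of such a time-reversal-symmetry penalty, using a toy pendulum field in place of the trained ODE network: integrate forward, apply the reversing operator (flip the momentum), integrate forward again, flip back, and penalize the mismatch with the initial state.

```python
# TRS penalty: || R(Phi_T(R(Phi_T(s0)))) - s0 ||^2, zero for TRS-consistent dynamics.
import numpy as np

def f(s):                                  # s = (q, p); stand-in for the ODE network
    q, p = s
    return np.array([p, -np.sin(q)])

def rk4(s, dt, n):
    for _ in range(n):
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return s

R = np.array([1.0, -1.0])                  # reversing operator: (q, p) -> (q, -p)
s0 = np.array([0.5, 0.3])
s_back = R * rk4(R * rk4(s0, 0.01, 200), 0.01, 200)
print(float(np.sum((s_back - s0) ** 2)))   # small residual: pendulum is TRS
```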
arXiv Detail & Related papers (2020-07-22T12:19:40Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
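A rough sketch of a single liquid time-constant cell update with random, untrained weights, using the fused implicit-Euler step from the LTC formulation (sizes and parameters are illustrative):

```python
# LTC cell: the state relaxes with an input-dependent effective time constant.
import numpy as np

rng = np.random.default_rng(4)
n, m = 8, 3                                # hidden units, input size
W = 0.5 * rng.standard_normal((n, n))
Wu = rng.standard_normal((n, m))
b = rng.standard_normal(n)
A = rng.standard_normal(n)                 # per-unit equilibrium targets
tau, dt = 1.0, 0.1

def ltc_step(x, u):
    gate = 1.0 / (1.0 + np.exp(-(W @ x + Wu @ u + b)))   # f(x, u), bounded gate
    # Fused step: x' = (x + dt * f * A) / (1 + dt * (1/tau + f))
    return (x + dt * gate * A) / (1.0 + dt * (1.0 / tau + gate))

x = np.zeros(n)
for t in range(50):
    x = ltc_step(x, np.sin(0.1 * t) * np.ones(m))
print(np.round(x, 3))
```

The denominator keeps the update stable and bounded for any gate value, which is the mechanism behind the stability claim above.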
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.