Learning noise-induced transitions by multi-scaling reservoir computing
- URL: http://arxiv.org/abs/2309.05413v1
- Date: Mon, 11 Sep 2023 12:26:36 GMT
- Title: Learning noise-induced transitions by multi-scaling reservoir computing
- Authors: Zequn Lin, Zhaofan Lu, Zengru Di, Ying Tang
- Abstract summary: We develop a machine learning model, reservoir computing, a type of recurrent neural network, to learn noise-induced transitions.
The trained model generates accurate statistics of transition time and the number of transitions.
It is also aware of the asymmetry of the double-well potential, the rotational dynamics caused by non-detailed balance, and transitions in multi-stable systems.
- Score: 2.9170682727903863
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Noise is usually regarded as adversarial to extracting the effective dynamics
from time series, so conventional data-driven approaches typically aim
to learn the dynamics while mitigating the effect of noise. However, noise can
have a functional role of driving transitions between stable states underlying
many natural and engineered stochastic dynamics. To capture such stochastic
transitions from data, we find that a machine learning model, reservoir
computing, a type of recurrent neural network, can learn noise-induced
transitions. We develop a concise training protocol for tuning
hyperparameters, with a focus on a pivotal hyperparameter controlling the time
scale of the reservoir dynamics. The trained model generates accurate
statistics of transition time and the number of transitions. The approach is
applicable to a wide class of systems, including a bistable system under a
double-well potential, with either white noise or colored noise. It is also
aware of the asymmetry of the double-well potential, the rotational dynamics
caused by non-detailed balance, and transitions in multi-stable systems. For
the experimental data of protein folding, it learns the transition time between
folded states, providing a possibility of predicting transition statistics from
a small dataset. The results demonstrate the capability of machine-learning
methods in capturing noise-induced phenomena.
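The abstract's core ingredients — a leaky reservoir whose leak rate sets the time scale, trained by ridge regression on a noisy double-well trajectory — can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: every parameter value (reservoir size, leak rate `alpha`, spectral radius, noise strength) is an assumption chosen for the demo.

```python
# Hypothetical sketch: a leaky echo state network learning one-step dynamics
# of a noise-driven double-well system. The leak rate `alpha` controls the
# reservoir time scale -- the "pivotal hyperparameter" the abstract mentions.
import numpy as np

rng = np.random.default_rng(0)

# Training data: overdamped particle in V(x) = x^4/4 - x^2/2 with white
# noise, integrated by Euler-Maruyama. sigma is chosen large enough that
# the trajectory hops between the wells at x = -1 and x = +1.
dt, n_steps, sigma = 0.01, 10000, 0.5
noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps - 1)
x = np.empty(n_steps)
x[0] = -1.0
for t in range(n_steps - 1):
    drift = x[t] - x[t] ** 3                       # -V'(x)
    x[t + 1] = x[t] + drift * dt + noise[t]

# Leaky reservoir: r_{t+1} = (1 - alpha) r_t + alpha tanh(W r_t + W_in u_t).
n_res, alpha, rho = 200, 0.3, 0.9
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.standard_normal((n_res, n_res)) / np.sqrt(n_res)
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))    # set spectral radius

states = np.zeros((n_steps, n_res))
r = np.zeros(n_res)
for t in range(n_steps):
    r = (1 - alpha) * r + alpha * np.tanh(W @ r + W_in * x[t])
    states[t] = r

# Ridge-regress one-step-ahead targets after a washout period.
washout, beta = 200, 1e-6
X, Y = states[washout:-1], x[washout + 1:]
W_out = np.linalg.solve(X.T @ X + beta * np.eye(n_res), X.T @ Y)

rmse = np.sqrt(np.mean((X @ W_out - Y) ** 2))      # one-step sanity check
print(f"one-step RMSE: {rmse:.4f}")
```

In closed-loop use one would feed predictions back as input and compare the generated transition-time statistics against the data; per the abstract, the training protocol tunes the time-scale hyperparameter so the reservoir can track both the fast intra-well fluctuations and the slow inter-well hopping.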
Related papers
- Randomised benchmarking for characterizing and forecasting correlated processes [8.788375252357945]
We develop a method to learn the details of temporally correlated noise.
In particular, we can learn the time-independent evolution operator of system plus bath.
We exemplify this by implementing our method on a superconducting quantum processor.
arXiv Detail & Related papers (2023-12-11T01:55:44Z)
- Stochastic Latent Transformer: Efficient Modelling of Stochastically Forced Zonal Jets [0.0]
We present a novel deep probabilistic learning approach, the 'Stochastic Latent Transformer' (SLT).
The SLT accurately reproduces system dynamics across various integration periods, validated through quantitative diagnostics.
It achieves a five-order-of-magnitude speedup in emulating the zonally-averaged flow.
arXiv Detail & Related papers (2023-10-25T16:17:00Z)
- Value function estimation using conditional diffusion models for control [62.27184818047923]
We propose a simple algorithm called Diffused Value Function (DVF).
It learns a joint multi-step model of the environment-robot interaction dynamics using a diffusion model.
We show how DVF can be used to efficiently capture the state visitation measure for multiple controllers.
arXiv Detail & Related papers (2023-06-09T18:40:55Z)
- Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, avoiding the arbitrary tuning from a mini-batch of samples used in previous work.
arXiv Detail & Related papers (2023-02-19T15:24:37Z)
- Operator inference with roll outs for learning reduced models from scarce and low-quality data [0.0]
We propose to combine data-driven modeling via operator inference with the dynamic training via roll outs of neural ordinary differential equations.
We show that operator inference with roll outs provides predictive models from training trajectories even if data are sampled sparsely in time and polluted with noise of up to 10%.
arXiv Detail & Related papers (2022-12-02T19:41:31Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
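LMNT's details are beyond this summary, but the classical result it builds on — that adding small i.i.d. input noise while training a least-squares model is, on average, equivalent to a deterministic ridge penalty — is easy to demonstrate. The setup below is a toy illustration; all dimensions and values are assumptions, not anything from the paper.

```python
# Toy demonstration of noise-as-regularization: for linear least squares,
# averaging the normal equations over noisy copies of the inputs (variance
# s^2) converges to ridge regression with penalty n * s^2.
import numpy as np

rng = np.random.default_rng(2)

n, d, s = 500, 5, 0.1
X = rng.standard_normal((n, d))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.05 * rng.standard_normal(n)

# Stochastic side: average the normal equations over many noisy copies.
k = 2000
XtX_noisy = np.zeros((d, d))
Xty_noisy = np.zeros(d)
for _ in range(k):
    Xn = X + s * rng.standard_normal(X.shape)
    XtX_noisy += Xn.T @ Xn / k
    Xty_noisy += Xn.T @ y / k
w_noisy = np.linalg.solve(XtX_noisy, Xty_noisy)

# Deterministic equivalent: ridge with penalty n * s^2,
# since E[Xn.T @ Xn] = X.T @ X + n * s^2 * I and E[Xn.T @ y] = X.T @ y.
w_ridge = np.linalg.solve(X.T @ X + n * s**2 * np.eye(d), X.T @ y)

gap = np.max(np.abs(w_noisy - w_ridge))
print(f"max coefficient gap: {gap:.4f}")
```

The small gap shows why a deterministic (linearized) surrogate for many independent noise realizations, as LMNT proposes for trained forecast models, is plausible: in the linear case the average effect of the noise is exactly a quadratic penalty.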
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Extracting stochastic dynamical systems with $\alpha$-stable Lévy noise from data [14.230182518492311]
We propose a data-driven method to extract systems with $\alpha$-stable Lévy noise from short burst data.
More specifically, we first estimate the Lévy jump measure and noise intensity.
Then we approximate the drift coefficient by combining nonlocal Kramers-Moyal formulas with normalizing flows.
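The paper's nonlocal Kramers-Moyal formulas and normalizing flows are beyond a short sketch, but the basic (local, Gaussian-noise) Kramers-Moyal drift estimate — the conditional mean increment per unit time — can be demonstrated on synthetic data. The system and all parameters below are illustrative choices, not the paper's setup.

```python
# Estimate the drift of dX = (X - X^3) dt + sigma dW from a trajectory via
# the first Kramers-Moyal coefficient: a(x) ~ E[X_{t+dt} - X_t | X_t = x] / dt.
import numpy as np

rng = np.random.default_rng(1)

dt, n, sigma = 0.01, 200000, 0.5
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = x[t] + (x[t] - x[t] ** 3) * dt + noise[t]

# Bin the trajectory and average the increments within each bin.
dx = np.diff(x)
edges = np.linspace(-1.3, 1.3, 27)
centers = 0.5 * (edges[:-1] + edges[1:])
idx = np.digitize(x[:-1], edges) - 1
drift_est = np.full(len(centers), np.nan)
for i in range(len(centers)):
    mask = idx == i
    if mask.sum() > 50:                  # skip sparsely visited bins
        drift_est[i] = dx[mask].mean() / dt

# Compare with the true drift x - x^3 at the bin centers.
true_drift = centers - centers ** 3
err = np.nanmean(np.abs(drift_est - true_drift))
print(f"mean |drift error|: {err:.3f}")
```

For heavy-tailed $\alpha$-stable noise this conditional-mean estimator breaks down (the increments have no finite variance), which is precisely why the paper replaces it with nonlocal Kramers-Moyal formulas.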
arXiv Detail & Related papers (2021-09-30T06:57:42Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of noise in its success is still unclear.
We show that multiplicative noise commonly arises in the parameters due to variance in discrete updates.
A detailed analysis is conducted covering key factors, including step size and data, and we observe similar heavy-tailed results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.