Learning non-stationary Langevin dynamics from stochastic observations
of latent trajectories
- URL: http://arxiv.org/abs/2012.14944v1
- Date: Tue, 29 Dec 2020 21:22:21 GMT
- Title: Learning non-stationary Langevin dynamics from stochastic observations
of latent trajectories
- Authors: Mikhail Genkin, Owen Hughes, and Tatiana A. Engel
- Abstract summary: Inferring Langevin equations from data can reveal how transient dynamics of such systems give rise to their function.
Here we present a non-parametric framework for inferring the Langevin equation, which explicitly models the stochastic observation process and non-stationary latent dynamics.
Omitting any of these non-stationary components results in incorrect inference, in which erroneous features arise in the dynamics due to the non-stationary data distribution.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many complex systems operating far from equilibrium exhibit stochastic
dynamics that can be described by a Langevin equation. Inferring Langevin
equations from data can reveal how transient dynamics of such systems give rise
to their function. However, the dynamics are often not directly accessible and
can only be gleaned through a stochastic observation process, which makes the
inference challenging. Here we present a non-parametric framework for inferring
the Langevin equation, which explicitly models the stochastic observation
process and non-stationary latent dynamics. The framework accounts for the
non-equilibrium initial and final states of the observed system and for the
possibility that the system's dynamics define the duration of observations.
Omitting any of these non-stationary components results in incorrect inference,
in which erroneous features arise in the dynamics due to the non-stationary data
distribution. We illustrate the framework using models of neural dynamics
underlying decision making in the brain.
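The setting the abstract describes can be made concrete with a minimal simulation sketch: an overdamped Langevin equation integrated with the Euler-Maruyama scheme, with Gaussian measurement noise standing in for the stochastic observation process. All names, the double-well drift, and the noise levels below are illustrative choices, not the paper's model.

```python
import numpy as np

def simulate_langevin(drift, D, x0, dt, n_steps, rng):
    """Euler-Maruyama integration of dx = drift(x) dt + sqrt(2 D) dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        x[t + 1] = x[t] + drift(x[t]) * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
    return x

rng = np.random.default_rng(0)
# Double-well drift f(x) = x - x^3, i.e. potential U(x) = x^4/4 - x^2/2.
latent = simulate_langevin(lambda x: x - x**3, D=0.5, x0=0.0, dt=0.01, n_steps=5000, rng=rng)
# Stochastic observation process: the latent path is seen only through noise.
observed = latent + 0.2 * rng.standard_normal(latent.shape)
```

The inference problem is then to recover the drift and noise magnitude from `observed` alone, which is what makes explicit modeling of the observation process necessary.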
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
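Maximum-likelihood training with MCMC, as mentioned above, typically draws samples from the energy-based prior with short-run Langevin dynamics. Below is a hedged sketch of that sampling step only, on a toy quadratic energy; the names and step sizes are illustrative, not the paper's implementation.

```python
import numpy as np

def langevin_sample(grad_energy, z0, step, n_steps, rng):
    """Short-run Langevin MCMC: z <- z - step * dE/dz + sqrt(2 * step) * noise."""
    z = z0.copy()
    for _ in range(n_steps):
        z = z - step * grad_energy(z) + np.sqrt(2 * step) * rng.standard_normal(z.shape)
    return z

# Toy quadratic energy E(z) = ||z||^2 / 2, whose gradient is z; the chain
# should then equilibrate near a standard normal distribution.
rng = np.random.default_rng(0)
z = langevin_sample(lambda z: z, z0=np.zeros((1000, 2)), step=0.05, n_steps=500, rng=rng)
```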
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Inferring the Langevin Equation with Uncertainty via Bayesian Neural Networks [4.604003661048267]
We present a comprehensive framework that employs Bayesian neural networks for inferring Langevin equations in both overdamped and underdamped regimes.
By providing a distribution of predictions instead of a single value, our approach allows us to assess prediction uncertainties.
We demonstrate the effectiveness of our framework in inferring Langevin equations for various scenarios including a neuron model and microscopic engine.
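As a point of reference for what "inferring a Langevin equation" means in practice, here is a minimal non-Bayesian baseline (not this paper's method): a binned Kramers-Moyal estimate of the drift as the conditional mean increment, demonstrated on a simulated Ornstein-Uhlenbeck trajectory. Names and bin counts are illustrative.

```python
import numpy as np

def estimate_drift(x, dt, bins=20):
    """Binned Kramers-Moyal estimate: drift(x) ~ E[x_{t+1} - x_t | x_t] / dt."""
    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, bins - 1)
    drift = np.full(bins, np.nan)
    for b in range(bins):
        mask = idx == b
        if mask.any():
            drift[b] = dx[mask].mean() / dt
    return centers, drift

# Ornstein-Uhlenbeck trajectory: dx = -x dt + sqrt(2 * 0.5) dW, true drift(x) = -x.
rng = np.random.default_rng(1)
dt, n = 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + np.sqrt(2 * 0.5 * dt) * rng.standard_normal()
centers, drift = estimate_drift(x, dt)  # should recover drift(x) close to -x
```

Such point estimates carry no uncertainty; the Bayesian-neural-network approach above replaces them with a distribution over drift functions.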
arXiv Detail & Related papers (2024-02-02T11:47:56Z)
- An optimization-based equilibrium measure describes non-equilibrium steady state dynamics: application to edge of chaos [2.5690340428649328]
Understanding neural dynamics is a central topic in machine learning, non-linear physics and neuroscience.
The dynamics are non-linear and, in particular, non-gradient, i.e., the driving force cannot be written as the gradient of a potential.
arXiv Detail & Related papers (2024-01-18T14:25:32Z)
- Causal Modeling with Stationary Diffusions [89.94899196106223]
We learn differential equations whose stationary densities model a system's behavior under interventions.
We show that they generalize to unseen interventions on their variables, often better than classical approaches.
Our inference method is based on a new theoretical result that expresses a stationarity condition on the diffusion's generator in a reproducing kernel Hilbert space.
arXiv Detail & Related papers (2023-10-26T14:01:17Z)
- Reservoir Computing with Error Correction: Long-term Behaviors of Stochastic Dynamical Systems [5.815325960286111]
We propose a data-driven framework combining Reservoir Computing and Normalizing Flow to study this issue.
We verify the effectiveness of the proposed framework in several experiments, including the Van der Pol oscillator, a simplified El Niño-Southern Oscillation model, and the Lorenz system.
arXiv Detail & Related papers (2023-05-01T05:50:17Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Discrete Lagrangian Neural Networks with Automatic Symmetry Discovery [3.06483729892265]
We introduce a framework to learn a discrete Lagrangian along with its symmetry group from discrete observations of motions.
The learning process does not restrict the form of the Lagrangian, does not require velocity or momentum observations or predictions and incorporates a cost term.
arXiv Detail & Related papers (2022-11-20T00:46:33Z)
- Slow semiclassical dynamics of a two-dimensional Hubbard model in disorder-free potentials [77.34726150561087]
We show that the introduction of harmonic and spin-dependent linear potentials sufficiently validates fTWA for longer times.
In particular, we focus on a finite two-dimensional system and show that at intermediate linear potential strength, the addition of a harmonic potential and spin dependence of the tilt results in subdiffusive dynamics.
arXiv Detail & Related papers (2022-10-03T16:51:25Z)
- Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence [0.0]
Recent work has focused on data-driven learning of the evolution of unknown systems via deep neural networks (DNNs).
This paper presents a computational technique to learn the fine-scale dynamics from such coarsely observed data.
arXiv Detail & Related papers (2022-06-03T20:28:52Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
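The "forced linear system" view builds on dynamic mode decomposition: fit a single linear map carrying one snapshot to the next. A minimal sketch of exact DMD on a toy near-periodic signal follows; the forcing term and ensembling of the paper are omitted, and all names are illustrative.

```python
import numpy as np

def dmd_fit(X):
    """Exact DMD: least-squares linear map A with x_{t+1} ~ A x_t,
    fit from the snapshot matrix X of shape (n_features, n_times)."""
    X0, X1 = X[:, :-1], X[:, 1:]
    return X1 @ np.linalg.pinv(X0)

# Toy near-periodic signal: a pure rotation, so the fitted map should have
# eigenvalues on the unit circle (sustained oscillation, no growth or decay).
t = 0.1 * np.arange(200)
X = np.vstack([np.cos(t), np.sin(t)])
A = dmd_fit(X)
```

The eigenvalues of `A` are directly interpretable (oscillation frequency and decay rate), which is the parsimony argument made above.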
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.