Stochastic embeddings of dynamical phenomena through variational
autoencoders
- URL: http://arxiv.org/abs/2010.06265v1
- Date: Tue, 13 Oct 2020 10:10:24 GMT
- Title: Stochastic embeddings of dynamical phenomena through variational
autoencoders
- Authors: Constantino A. Garcia, Paulo Felix, Jesus M. Presedo, Abraham Otero
- Abstract summary: We use a recognition network to increase the observed space dimensionality during the reconstruction of the phase space.
Our validation shows that this approach not only recovers a state space that resembles the original one, but it is also able to synthesize new time series.
- Score: 1.7205106391379026
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: System identification in scenarios where the observed number of variables is
less than the degrees of freedom in the dynamics is an important challenge. In
this work we tackle this problem by using a recognition network to increase the
observed space dimensionality during the reconstruction of the phase space. The
phase space is forced to have approximately Markovian dynamics described by a
Stochastic Differential Equation (SDE), which is also to be discovered. To
enable robust learning from stochastic data we use the Bayesian paradigm and
place priors on the drift and diffusion terms. To handle the complexity of
learning the posteriors, a set of mean field variational approximations to the
true posteriors is introduced, enabling efficient statistical inference.
Finally, a decoder network is used to obtain plausible reconstructions of the
experimental data. The main advantage of this approach is that the resulting
model is interpretable within the paradigm of statistical physics. Our
validation shows that this approach not only recovers a state space that
resembles the original one, but it is also able to synthesize new time series
capturing the main properties of the experimental data.
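The pipeline the abstract describes, a recognition network that lifts the low-dimensional observations into a higher-dimensional phase space, a latent SDE whose drift and diffusion govern the dynamics, and a decoder that maps latent states back to plausible observations, can be sketched with toy components. Everything below (the linear-plus-tanh "networks", the dimensions, the stable linear drift, and the constant diffusion) is an illustrative placeholder, not the paper's actual architecture or learned posteriors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 1-D observations lifted into a 3-D phase space.
OBS_DIM, LATENT_DIM = 1, 3

# Toy "recognition network": an affine lift followed by tanh.
W_enc = rng.normal(size=(LATENT_DIM, OBS_DIM))
b_enc = np.zeros(LATENT_DIM)

def encode(y):
    """Lift a low-dimensional observation into the latent phase space."""
    return np.tanh(W_enc @ y + b_enc)

# Drift and diffusion of the latent SDE dz = f(z) dt + g(z) dW.
A = -0.5 * np.eye(LATENT_DIM)  # stable linear drift (stands in for a posterior mean)

def drift(z):
    return A @ z

def diffusion(z):
    return 0.1 * np.ones(LATENT_DIM)  # constant diagonal diffusion (placeholder)

def euler_maruyama(z0, dt, n_steps, rng):
    """Simulate the latent SDE with the Euler-Maruyama scheme."""
    zs = [z0]
    z = z0
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=LATENT_DIM)
        z = z + drift(z) * dt + diffusion(z) * dW
        zs.append(z)
    return np.stack(zs)

# Toy "decoder network": project a latent state back to observation space.
W_dec = rng.normal(size=(OBS_DIM, LATENT_DIM))

def decode(z):
    return W_dec @ z

# Encode one observation, evolve it under the latent SDE, decode the trajectory.
z0 = encode(np.array([0.3]))
traj = euler_maruyama(z0, dt=0.01, n_steps=100, rng=rng)
y_rec = np.array([decode(z) for z in traj])
print(traj.shape, y_rec.shape)  # (101, 3) (101, 1)
```

In the actual model the drift and diffusion carry Bayesian priors and are inferred variationally alongside the encoder and decoder; here they are fixed so the structure of the generative loop is visible.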
Related papers
- Dreaming Learning [41.94295877935867]
Introducing new information to a machine learning system can interfere with previously stored data.
We propose a training algorithm inspired by Stuart Kauffman's notion of the Adjacent Possible.
It predisposes the neural network to smoothly accept and integrate data sequences with different statistical characteristics than expected.
arXiv Detail & Related papers (2024-10-23T09:17:31Z) - Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z) - Dynamical Regimes of Diffusion Models [14.797301819675454]
We study generative diffusion models in the regime where the dimension of space and the number of data are large.
Our analysis reveals three distinct dynamical regimes during the backward generative diffusion process.
The dependence of the collapse time on the dimension and number of data provides a thorough characterization of the curse of dimensionality for diffusion models.
arXiv Detail & Related papers (2024-02-28T17:19:26Z) - Tipping Points of Evolving Epidemiological Networks: Machine
Learning-Assisted, Data-Driven Effective Modeling [0.0]
We study the tipping point collective dynamics of an adaptive susceptible-infected (SIS) epidemiological network in a data-driven, machine learning-assisted manner.
We identify a complex effective stochastic differential equation (eSDE) in terms of physically meaningful coarse mean-field variables.
We study the statistics of rare events both through repeated brute force simulations and by using established mathematical/computational tools.
arXiv Detail & Related papers (2023-11-01T19:33:03Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Posterior Collapse and Latent Variable Non-identifiability [54.842098835445]
We propose a class of latent-identifiable variational autoencoders, deep generative models which enforce identifiability without sacrificing flexibility.
Across synthetic and real datasets, latent-identifiable variational autoencoders outperform existing methods in mitigating posterior collapse and providing meaningful representations of the data.
arXiv Detail & Related papers (2023-01-02T06:16:56Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Fast Estimation of Bayesian State Space Models Using Amortized
Simulation-Based Inference [0.0]
This paper presents a fast algorithm for estimating hidden states of Bayesian state space models.
After pretraining, finding the posterior distribution for any dataset takes from hundredths to tenths of a second.
arXiv Detail & Related papers (2022-10-13T16:37:05Z) - A Causality-Based Learning Approach for Discovering the Underlying
Dynamics of Complex Systems from Partial Observations with Stochastic
Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.