Learning Stochastic Behaviour from Aggregate Data
- URL: http://arxiv.org/abs/2002.03513v7
- Date: Mon, 7 Jun 2021 13:41:35 GMT
- Title: Learning Stochastic Behaviour from Aggregate Data
- Authors: Shaojun Ma, Shu Liu, Hongyuan Zha, Haomin Zhou
- Abstract summary: Learning nonlinear dynamics from aggregate data is a challenging problem because the full trajectory of each individual is not available.
We propose a novel method that uses the weak form of the Fokker-Planck equation (FPE) to describe the density evolution of the data in a sampled form.
In this sample-based framework, we are able to learn the nonlinear dynamics from aggregate data without explicitly solving the FPE, a partial differential equation (PDE).
- Score: 52.012857267317784
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning nonlinear dynamics from aggregate data is a challenging problem because the full trajectory of each individual is not available: an individual observed at one time point may not be observed at the next, or the identity of an individual may be unavailable altogether. This is in sharp contrast to learning dynamics from full trajectory data, on which the majority of existing methods are based. We propose a novel method that uses the weak form of the Fokker-Planck equation (FPE) -- a partial differential equation -- to describe the density evolution of the data in a sampled form, which is then combined with a Wasserstein generative adversarial network (WGAN) in the training process. In this sample-based framework we are able to learn the nonlinear dynamics from aggregate data without explicitly solving the FPE itself. We demonstrate our approach on a series of synthetic and real-world data sets.
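As background for the weak formulation the abstract refers to: for a diffusion dX_t = f(X_t) dt + σ dW_t, the weak form of the FPE gives d/dt E[φ(X_t)] = E[∇φ(X_t)·f(X_t) + (σ²/2) Δφ(X_t)] for smooth test functions φ, and every term can be estimated directly from samples. The snippet below is a minimal illustrative sketch of such a sample-based residual under this assumed scalar-noise model; it is not the authors' code, the names weak_fpe_residual, f_theta, and phi are placeholders, and in the paper the role of the test function is played by a WGAN critic rather than a fixed φ.

```python
# Minimal sketch (not the paper's implementation): Monte Carlo estimate of the
# weak-form Fokker-Planck residual for an assumed model dX_t = f(X_t) dt + sigma dW_t,
# using only aggregate samples at two consecutive observation times.
import jax
import jax.numpy as jnp

def weak_fpe_residual(phi, f_theta, x_t, x_t1, dt, sigma):
    """phi: test function R^d -> R; f_theta: drift R^d -> R^d;
    x_t, x_t1: (n, d) aggregate samples at times t and t + dt."""
    grad_phi = jax.grad(phi)                              # gradient of the test function
    lap_phi = lambda x: jnp.trace(jax.hessian(phi)(x))    # Laplacian via Hessian trace

    # Left-hand side: finite difference of E[phi(X_t)] between observation times.
    lhs = (jax.vmap(phi)(x_t1).mean() - jax.vmap(phi)(x_t).mean()) / dt

    # Right-hand side: generator of the diffusion applied to phi, averaged over samples.
    gen_phi = lambda x: grad_phi(x) @ f_theta(x) + 0.5 * sigma**2 * lap_phi(x)
    rhs = jax.vmap(gen_phi)(x_t).mean()

    return lhs - rhs   # close to zero when f_theta matches the true drift

# Toy check with a quadratic test function and a hypothetical linear drift.
phi = lambda x: jnp.sum(x ** 2)
f_theta = lambda x: -x                        # stand-in for a learned drift network
dt, sigma = 0.01, 0.1
key, sub = jax.random.split(jax.random.PRNGKey(0))
x_t = jax.random.normal(key, (4096, 2))
x_t1 = x_t + dt * f_theta(x_t) + sigma * jnp.sqrt(dt) * jax.random.normal(sub, x_t.shape)
print(weak_fpe_residual(phi, f_theta, x_t, x_t1, dt, sigma))  # ~0 up to discretization and sampling error
```

In the actual training loop described by the abstract, a residual of this kind is driven to zero over learned test functions (the WGAN critic) while the drift network is fit, so the FPE never has to be solved on a grid.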
Related papers
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Dynamic Latent Separation for Deep Learning [67.62190501599176]
A core problem in machine learning is to learn expressive latent variables for model prediction on complex data.
Here, we develop an approach that improves expressiveness, provides partial interpretation, and is not restricted to specific applications.
arXiv Detail & Related papers (2022-10-07T17:56:53Z) - $Φ$-DVAE: Physics-Informed Dynamical Variational Autoencoders for Unstructured Data Assimilation [3.2873782624127843]
We develop a physics-informed dynamical variational autoencoder ($Φ$-DVAE) to embed diverse data streams into time-evolving physical systems.
Our approach combines a standard, possibly nonlinear, filter for the latent state-space model and a VAE, to assimilate the unstructured data into the latent dynamical system.
A variational Bayesian framework is used for the joint estimation of the encoding, latent states, and unknown system parameters.
arXiv Detail & Related papers (2022-09-30T17:34:48Z) - Discovering stochastic dynamical equations from biological time series data [0.0]
We present an equation discovery framework that takes time series data of state variables as input and outputs a stochastic differential equation.
We show that the correct equations, and hence the structure of their stability, can be recovered accurately from the analysis of time series data alone.
We demonstrate our method on two real-world datasets -- fish schooling and single-cell migration.
arXiv Detail & Related papers (2022-05-05T13:44:24Z) - Learn from Unpaired Data for Image Restoration: A Variational Bayes
Approach [18.007258270845107]
We propose LUD-VAE, a deep generative method to learn the joint probability density function from data sampled from marginal distributions.
We apply our method to real-world image denoising and super-resolution tasks and train the models using the synthetic data generated by the LUD-VAE.
arXiv Detail & Related papers (2022-04-21T13:27:17Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Imitating Deep Learning Dynamics via Locally Elastic Stochastic
Differential Equations [20.066631203802302]
We study the evolution of features during deep learning training using a set of stochastic differential equations (SDEs), each of which corresponds to a training sample.
Our results shed light on the decisive role of local elasticity in the training dynamics of neural networks.
arXiv Detail & Related papers (2021-10-11T17:17:20Z) - Discovery of Nonlinear Dynamical Systems using a Runge-Kutta Inspired
Dictionary-based Sparse Regression Approach [9.36739413306697]
We blend machine learning and dictionary-based learning with numerical analysis tools to discover governing differential equations.
We obtain interpretable and parsimonious models that tend to generalize better beyond the sampling regime.
We discuss the extension of the approach to governing equations containing rational nonlinearities, which typically appear in biological networks.
arXiv Detail & Related papers (2021-05-11T08:46:51Z) - Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.