Neural McKean-Vlasov Processes: Distributional Dependence in Diffusion Processes
- URL: http://arxiv.org/abs/2404.09402v1
- Date: Mon, 15 Apr 2024 01:28:16 GMT
- Title: Neural McKean-Vlasov Processes: Distributional Dependence in Diffusion Processes
- Authors: Haoming Yang, Ali Hasan, Yuting Ng, Vahid Tarokh
- Abstract summary: McKean-Vlasov differential equations (MV-SDEs) provide a mathematical description of the behavior of an infinite number of interacting particles.
We study the influence of explicitly including distributional information in the parameterization of the SDE.
- Score: 24.24785205800212
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: McKean-Vlasov stochastic differential equations (MV-SDEs) provide a mathematical description of the behavior of an infinite number of interacting particles by imposing a dependence on the particle density. As such, we study the influence of explicitly including distributional information in the parameterization of the SDE. We propose a series of semi-parametric methods for representing MV-SDEs, and corresponding estimators for inferring parameters from data based on the properties of the MV-SDE. We analyze the characteristics of the different architectures and estimators, and consider their applicability in relevant machine learning problems. We empirically compare the performance of the different architectures and estimators on real and synthetic datasets for time series and probabilistic modeling. The results suggest that explicitly including distributional dependence in the parameterization of the SDE is effective in modeling temporal data with interaction under an exchangeability assumption while maintaining strong performance for standard It\^o-SDEs due to the richer class of probability flows associated with MV-SDEs.
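The defining feature of an MV-SDE is that the drift depends on the law of the process itself, which in practice is approximated by an interacting particle system whose empirical measure stands in for the true distribution. A minimal Euler-Maruyama sketch of this idea (the function names and the mean-field interaction are illustrative examples, not the paper's architecture):

```python
import numpy as np

def simulate_mv_sde(drift, n_particles=1000, n_steps=200, dt=0.01,
                    sigma=1.0, seed=0):
    """Euler-Maruyama simulation of an interacting particle system
    approximating a McKean-Vlasov SDE: each particle's drift depends
    on a statistic of the empirical particle distribution."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # initial particles ~ N(0, 1)
    for _ in range(n_steps):
        # empirical-measure interaction, here summarized by the mean
        m = x.mean()
        x = x + drift(x, m) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_particles)
    return x

# Hypothetical mean-reverting interaction: each particle is pulled
# toward the population mean (a classic McKean-Vlasov example).
final = simulate_mv_sde(lambda x, m: -(x - m))
```

As the number of particles grows, the empirical mean converges to the mean of the process's law (propagation of chaos), which is the sense in which finite particle systems approximate the infinite-particle MV-SDE.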
Related papers
- Identifying Drift, Diffusion, and Causal Structure from Temporal Snapshots [10.018568337210876]
We present the first comprehensive approach for jointly estimating the drift and diffusion of an SDE from its temporal marginals.
We show that each of these steps is optimal with respect to the Kullback-Leibler divergence.
arXiv Detail & Related papers (2024-10-30T06:28:21Z) - Estimating the Distribution of Parameters in Differential Equations with Repeated Cross-Sectional Data [5.79648227233365]
In economics, politics, and biology, data points in a time series are often obtained independently.
Traditional methods for parameter estimation in differential equations have limitations in estimating the shape of parameter distributions.
We introduce a novel method, Estimation of Parameter Distribution (EPD), which provides accurate parameter distributions without loss of data information.
arXiv Detail & Related papers (2024-04-23T10:01:43Z) - Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved into one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Deep learning-based estimation of time-dependent parameters in Markov models with application to nonlinear regression and SDEs [0.0]
We present a novel deep learning method for estimating time-dependent parameters in Markov processes through discrete sampling.
Our work contributes to SDE-based model parameter estimation, offering a versatile tool for diverse fields.
arXiv Detail & Related papers (2023-12-13T20:13:38Z) - Learning Space-Time Continuous Neural PDEs from Partially Observed States [13.01244901400942]
We introduce a grid-independent model learning partial differential equations (PDEs) from noisy and partial observations on irregular grids.
We propose a space-time continuous latent neural PDE model with an efficient probabilistic framework and a novel design encoder for improved data efficiency and grid independence.
arXiv Detail & Related papers (2023-07-09T06:53:59Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Lower Bounds for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Pseudo-Spherical Contrastive Divergence [119.28384561517292]
We propose pseudo-spherical contrastive divergence (PS-CD) to generalize maximum likelihood learning of energy-based models.
PS-CD avoids the intractable partition function and provides a generalized family of learning objectives.
arXiv Detail & Related papers (2021-11-01T09:17:15Z) - Data-Driven Theory-guided Learning of Partial Differential Equations using SimultaNeous Basis Function Approximation and Parameter Estimation (SNAPE) [0.0]
We propose a technique for parameter estimation of partial differential equations (PDEs) that is robust against high levels of noise (up to 100%).
SNAPE demonstrates its applicability on various complex dynamic systems spanning wide scientific domains.
The method systematically combines the knowledge of well-established scientific theories and the concepts of data science to infer the properties of the process from the observed data.
arXiv Detail & Related papers (2021-09-14T22:54:30Z) - MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z) - Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.