Generator Identification for Linear SDEs with Additive and
Multiplicative Noise
- URL: http://arxiv.org/abs/2310.19491v2
- Date: Sun, 21 Jan 2024 22:35:34 GMT
- Title: Generator Identification for Linear SDEs with Additive and
Multiplicative Noise
- Authors: Yuanyuan Wang, Xi Geng, Wei Huang, Biwei Huang, Mingming Gong
- Abstract summary: identifiability conditions are crucial in causal inference using linear SDEs.
We derive a sufficient and necessary condition for identifying the generator of linear SDEs with additive noise.
We offer geometric interpretations of the derived identifiability conditions to enhance their understanding.
- Score: 48.437815378088466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present conditions for identifying the generator of a
linear stochastic differential equation (SDE) from the distribution of its
solution process with a given fixed initial state. These identifiability
conditions are crucial in causal inference using linear SDEs as they enable the
identification of the post-intervention distributions from its observational
distribution. Specifically, we derive a sufficient and necessary condition for
identifying the generator of linear SDEs with additive noise, as well as a
sufficient condition for identifying the generator of linear SDEs with
multiplicative noise. We show that the conditions derived for both types of
SDEs are generic. Moreover, we offer geometric interpretations of the derived
identifiability conditions to enhance their understanding. To validate our
theoretical results, we perform a series of simulations, which support and
substantiate the established findings.
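The abstract mentions validating the theory with simulations of linear SDEs. As an illustrative sketch only (not the authors' code, and with arbitrary example matrices), a linear SDE with additive noise, dX_t = A X_t dt + G dW_t, can be simulated from a fixed initial state with the Euler-Maruyama scheme:

```python
import numpy as np

# Illustrative sketch (not the paper's code): simulate a 2-D linear SDE
# with additive noise, dX_t = A X_t dt + G dW_t, via Euler-Maruyama.
# The matrices A and G below are arbitrary examples.
rng = np.random.default_rng(0)

A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])   # drift (generator) matrix
G = np.array([[0.3, 0.0],
              [0.0, 0.1]])    # diffusion matrix

def euler_maruyama(x0, A, G, T=1.0, n_steps=1000, rng=rng):
    """Return the discretized path of dX = A X dt + G dW from fixed x0."""
    dt = T / n_steps
    x = np.empty((n_steps + 1, len(x0)))
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=len(x0))
        x[k + 1] = x[k] + A @ x[k] * dt + G @ dW
    return x

path = euler_maruyama(np.array([1.0, 1.0]), A, G)
print(path.shape)  # (1001, 2)
```

Identifiability here asks when the distribution of such a solution process, for the given fixed x0, pins down A (and the diffusion) uniquely.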
Related papers
- Governing Equation Discovery from Data Based on Differential Invariants [52.2614860099811]
We propose a pipeline for governing equation discovery based on differential invariants.
Specifically, we compute the set of differential invariants corresponding to the infinitesimal generators of the symmetry group.
Taking DI-SINDy as an example, we demonstrate that its success rate and accuracy in PDE discovery surpass those of other symmetry-informed governing equation discovery methods.
arXiv Detail & Related papers (2025-05-24T17:19:02Z)
- Towards Identifiability of Interventional Stochastic Differential Equations [4.249842620609683]
Our results give the first provable bounds for unique recovery of SDE parameters given samples from their stationary distributions.
We experimentally validate the recovery of true parameters in synthetic data, and, motivated by our theoretical results, demonstrate the advantage of parameterizations with learnable activation functions.
arXiv Detail & Related papers (2025-05-21T20:10:54Z)
- Identifying Drift, Diffusion, and Causal Structure from Temporal Snapshots [10.018568337210876]
We present the first comprehensive approach for jointly estimating the drift and diffusion of an SDE from its temporal marginals.
We show that each of these steps is optimal with respect to the Kullback-Leibler divergence.
arXiv Detail & Related papers (2024-10-30T06:28:21Z)
- Adaptation of uncertainty-penalized Bayesian information criterion for parametric partial differential equation discovery [1.1049608786515839]
We introduce an extension of the uncertainty-penalized Bayesian information criterion (UBIC) to solve parametric PDE discovery problems efficiently.
UBIC uses quantified PDE uncertainty over different temporal or spatial points to prevent overfitting in model selection.
We show that our extended UBIC can identify the true number of terms and their varying coefficients accurately, even in the presence of noise.
arXiv Detail & Related papers (2024-08-15T12:10:50Z)
- Closure Discovery for Coarse-Grained Partial Differential Equations Using Grid-based Reinforcement Learning [2.9611509639584304]
We propose a systematic approach for identifying closures in under-resolved PDEs using grid-based Reinforcement Learning.
We demonstrate the capabilities and limitations of our framework through numerical solutions of the advection equation and the Burgers' equation.
arXiv Detail & Related papers (2024-02-01T19:41:04Z)
- Differentially Private Gradient Flow based on the Sliced Wasserstein Distance [59.1056830438845]
We introduce a novel differentially private generative modeling approach based on a gradient flow in the space of probability measures.
Experiments show that our proposed model can generate higher-fidelity data at a low privacy budget.
arXiv Detail & Related papers (2023-12-13T15:47:30Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and properties of statistical estimation, remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z)
- Conditional Hybrid GAN for Sequence Generation [56.67961004064029]
We propose a novel conditional hybrid GAN (C-Hybrid-GAN) to solve this issue.
We exploit the Gumbel-Softmax technique to approximate the distribution of discrete-valued sequences.
We demonstrate that the proposed C-Hybrid-GAN outperforms the existing methods in context-conditioned discrete-valued sequence generation.
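The Gumbel-Softmax trick mentioned above is a standard relaxation for sampling from discrete distributions. A generic sketch (not the C-Hybrid-GAN implementation) of a single relaxed categorical draw from unnormalized logits:

```python
import numpy as np

# Generic Gumbel-Softmax sketch (standard technique, not the paper's code):
# draws a differentiable, near-one-hot sample from unnormalized logits.
rng = np.random.default_rng(0)

def gumbel_softmax(logits, temperature=0.5, rng=rng):
    """Relaxed categorical sample: softmax((logits + Gumbel noise) / temperature)."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / temperature
    y -= y.max()              # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

sample = gumbel_softmax(np.array([2.0, 1.0, 0.1]))
print(sample)  # nonnegative weights summing to 1
```

Lower temperatures push the sample toward one-hot, which is what makes the relaxation useful for generating discrete-valued sequences with gradient-based training.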
arXiv Detail & Related papers (2020-09-18T03:52:55Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
- Smoothness and Stability in GANs [21.01604897837572]
Generative adversarial networks, or GANs, commonly display unstable behavior during training.
We develop a principled theoretical framework for understanding the stability of various types of GANs.
arXiv Detail & Related papers (2020-02-11T03:08:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.