Towards Identifiability of Interventional Stochastic Differential Equations
- URL: http://arxiv.org/abs/2505.15987v2
- Date: Tue, 27 May 2025 13:55:15 GMT
- Title: Towards Identifiability of Interventional Stochastic Differential Equations
- Authors: Aaron Zweig, Zaikang Lin, Elham Azizi, David Knowles
- Abstract summary: Our results give the first provable bounds for unique recovery of SDE parameters given samples from their stationary distributions. We experimentally validate the recovery of true parameters in synthetic data, and motivated by our theoretical results, demonstrate the advantage of parameterizations with learnable activation functions.
- Score: 4.249842620609683
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study identifiability of stochastic differential equation (SDE) models under multiple interventions. Our results give the first provable bounds for unique recovery of SDE parameters given samples from their stationary distributions. We give tight bounds on the number of necessary interventions for linear SDEs, and upper bounds for nonlinear SDEs in the small noise regime. We experimentally validate the recovery of true parameters in synthetic data, and motivated by our theoretical results, demonstrate the advantage of parameterizations with learnable activation functions.
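To make the identifiability question concrete, here is a minimal numerical sketch (an illustration of the standard Lyapunov-equation argument, not code from the paper): for a linear SDE dX = AX dt + dW with stationary covariance S, the drift satisfies AS + SA^T + Q = 0, and adding any antisymmetric matrix K yields a different drift (-Q/2 + K)S^{-1} with the same stationary covariance, so a single stationary distribution cannot pin down A.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
d = 3
Q = np.eye(d)                        # diffusion term sigma sigma^T
M = rng.standard_normal((d, d))
S = M @ M.T + d * np.eye(d)          # target stationary covariance (SPD)

def drift_with_stationary_cov(K):
    """Drift matrix whose stationary covariance is S, for antisymmetric K."""
    return (-0.5 * Q + K) @ np.linalg.inv(S)

K = rng.standard_normal((d, d))
K = K - K.T                          # make K antisymmetric
A0 = drift_with_stationary_cov(np.zeros((d, d)))
A1 = drift_with_stationary_cov(K)

# Both (distinct) drifts solve the same Lyapunov equation A S + S A^T = -Q,
# so both linear SDEs share the stationary covariance S.
S0 = solve_continuous_lyapunov(A0, -Q)
S1 = solve_continuous_lyapunov(A1, -Q)
```

This non-uniqueness is exactly what interventions can break: each intervention constrains A through a different stationary distribution.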
Related papers
- SING: SDE Inference via Natural Gradients [0.0]
We propose SDE Inference via Natural Gradients (SING) to efficiently exploit the underlying geometry of the model and variational posterior. SING enables fast and reliable inference in latent SDE models by approximating intractable integrals and parallelizing computations in time. We show that SING outperforms prior methods in state inference and drift estimation on a variety of datasets.
arXiv Detail & Related papers (2025-06-21T19:36:11Z) - Governing Equation Discovery from Data Based on Differential Invariants [52.2614860099811]
We propose a pipeline for governing equation discovery based on differential invariants. Specifically, we compute the set of differential invariants corresponding to the infinitesimal generators of the symmetry group. Taking DI-SINDy as an example, we demonstrate that its success rate and accuracy in PDE discovery surpass those of other symmetry-informed governing equation discovery methods.
arXiv Detail & Related papers (2025-05-24T17:19:02Z) - Foundation Inference Models for Stochastic Differential Equations: A Transformer-based Approach for Zero-shot Function Estimation [3.005912045854039]
We introduce FIM-SDE (Foundation Inference Model for SDEs), a transformer-based recognition model capable of performing accurate zero-shot estimation of the drift and diffusion functions of SDEs. We demonstrate that a single pretrained FIM-SDE achieves robust zero-shot function estimation across a wide range of synthetic and real-world processes.
arXiv Detail & Related papers (2025-02-26T11:04:02Z) - Principled model selection for stochastic dynamics [0.0]
PASTIS is a principled method combining likelihood-estimation statistics with extreme value theory to suppress superfluous parameters. It reliably identifies minimal models, even with low sampling rates or measurement error. It extends to partial differential equations, and applies to ecological networks and reaction-diffusion dynamics.
arXiv Detail & Related papers (2025-01-17T18:23:16Z) - Identifying Drift, Diffusion, and Causal Structure from Temporal Snapshots [10.018568337210876]
APPEX is an iterative algorithm designed to estimate the drift, diffusion, and causal graph of an additive noise SDE, solely from temporal marginals. We show that APPEX iteratively decreases Kullback-Leibler divergence to the true solution, and demonstrate its effectiveness on simulated data from linear additive noise SDEs.
arXiv Detail & Related papers (2024-10-30T06:28:21Z) - Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators. We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z) - Generator Identification for Linear SDEs with Additive and Multiplicative Noise [48.437815378088466]
Identifiability conditions are crucial in causal inference using linear SDEs.
We derive a necessary and sufficient condition for identifying the generator of linear SDEs with additive noise.
We offer geometric interpretations of the derived identifiability conditions to enhance their understanding.
arXiv Detail & Related papers (2023-10-30T12:28:53Z) - SA-Solver: Stochastic Adams Solver for Fast Sampling of Diffusion Models [63.49229402384349]
Diffusion Probabilistic Models (DPMs) have achieved considerable success in generation tasks. Since sampling from DPMs is equivalent to solving a diffusion SDE or ODE, which is time-consuming, numerous fast sampling methods built upon improved differential equation solvers have been proposed. We propose SA-Solver, an improved efficient method for solving the diffusion SDE to generate data with high quality.
arXiv Detail & Related papers (2023-09-10T12:44:54Z) - Latent SDEs on Homogeneous Spaces [9.361372513858043]
We consider the problem of variational Bayesian inference in a latent variable model where a (possibly complex) observed geometric process is governed by the solution of a latent stochastic differential equation (SDE).
Experiments demonstrate that a latent SDE of the proposed type can be learned efficiently by means of an existing one-step Euler-Maruyama scheme.
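The one-step Euler-Maruyama scheme the abstract refers to can be sketched as follows; the drift, diffusion, and Ornstein-Uhlenbeck parameters here are illustrative choices for a generic SDE dX = f(X) dt + g(X) dW, not the paper's latent model.

```python
import numpy as np

def euler_maruyama(f, g, x0, dt, n_steps, rng):
    """Simulate one path of dX = f(X) dt + g(X) dW on a fixed time grid."""
    x = np.asarray(x0, dtype=float)
    path = [x]
    for _ in range(n_steps):
        dw = rng.standard_normal(x.shape) * np.sqrt(dt)  # Brownian increment
        x = x + f(x) * dt + g(x) * dw                    # one explicit step
        path.append(x)
    return np.stack(path)

rng = np.random.default_rng(1)
# Example: Ornstein-Uhlenbeck process, mean-reverting toward 0 with unit noise.
path = euler_maruyama(lambda x: -x, lambda x: np.ones_like(x),
                      x0=np.array([2.0]), dt=0.01, n_steps=1000, rng=rng)
```

The scheme is only first-order accurate in the strong sense at order 0.5, but its simplicity is what makes it attractive as the decoder inside a latent SDE model.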
arXiv Detail & Related papers (2023-06-28T14:18:52Z) - A Neural RDE-based model for solving path-dependent PDEs [5.6293920097580665]
The concept of the path-dependent partial differential equation (PPDE) was first introduced in the context of path-dependent derivatives in financial markets.
Compared to the classical PDE, the solution of a PPDE involves an infinite-dimensional spatial variable.
We propose a rough neural differential equation (NRDE)-based model to learn PPDEs, which effectively encodes the path information through the log-signature feature.
arXiv Detail & Related papers (2023-06-01T20:19:41Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.