Nonlocal Kramers-Moyal formulas and data-driven discovery of stochastic dynamical systems with multiplicative Lévy noise
- URL: http://arxiv.org/abs/2601.19223v1
- Date: Tue, 27 Jan 2026 05:44:50 GMT
- Title: Nonlocal Kramers-Moyal formulas and data-driven discovery of stochastic dynamical systems with multiplicative Lévy noise
- Authors: Yang Li, Jinqiao Duan
- Abstract summary: We develop novel data-driven algorithms capable of simultaneously identifying all governing components from data. This work provides a principled and practical toolbox for discovering interpretable SDE models governing complex systems.
- Score: 2.3951444869691594
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditional data-driven methods, effective for deterministic systems or stochastic differential equations (SDEs) with Gaussian noise, fail to handle the discontinuous sample paths and heavy-tailed fluctuations characteristic of Lévy processes, particularly when the noise is state-dependent. To bridge this gap, we establish nonlocal Kramers-Moyal formulas, rigorously generalizing the classical Kramers-Moyal relations to SDEs with multiplicative Lévy noise. These formulas provide a direct link between short-time transition probability densities (or sample path statistics) and the underlying SDE coefficients: the drift vector, diffusion matrix, Lévy jump measure kernel, and Lévy noise intensity functions. Leveraging these theoretical foundations, we develop novel data-driven algorithms capable of simultaneously identifying all governing components from data and establish convergence results and error analysis for the algorithms. We validate the framework through extensive numerical experiments on prototypical systems. This work provides a principled and practical toolbox for discovering interpretable SDE models governing complex systems influenced by discontinuous, heavy-tailed, state-dependent fluctuations, with broad applicability in climate science, neuroscience, epidemiology, finance, and biological physics.
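The nonlocal formulas in this paper generalize the classical Kramers-Moyal relations, which recover the drift and diffusion of a Gaussian-noise SDE from short-time conditional moments of the increments. As a rough illustration of that classical baseline (not the authors' algorithm), the following sketch estimates drift and diffusion from a simulated Ornstein-Uhlenbeck path; the bin layout and sample sizes are illustrative choices:

```python
import numpy as np

# Simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW and
# recover its coefficients via the classical Kramers-Moyal formulas:
#   drift:     b(x) ~ E[X_{t+dt} - X_t | X_t = x] / dt
#   diffusion: a(x) ~ E[(X_{t+dt} - X_t)^2 | X_t = x] / dt
rng = np.random.default_rng(0)
theta, sigma, dt, n = 1.0, 0.5, 1e-3, 2_000_000

x = np.empty(n)
x[0] = 0.0
noise = rng.standard_normal(n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * noise[i]

# Bin the state space and average increments conditioned on X_t in each bin.
edges = np.linspace(-1.0, 1.0, 21)
centers = 0.5 * (edges[:-1] + edges[1:])
dx = np.diff(x)
idx = np.digitize(x[:-1], edges) - 1
drift_est = np.full(len(centers), np.nan)
diff_est = np.full(len(centers), np.nan)
for k in range(len(centers)):
    mask = idx == k
    if mask.sum() > 100:
        drift_est[k] = dx[mask].mean() / dt
        diff_est[k] = (dx[mask] ** 2).mean() / dt

# The estimates should approximate b(x) = -theta*x and a(x) = sigma^2.
```

For Lévy-driven SDEs these conditional moments diverge or fail to capture the jump measure, which is exactly the gap the nonlocal Kramers-Moyal formulas address.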
Related papers
- Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
arXiv Detail & Related papers (2026-01-03T00:59:25Z) - A joint optimization approach to identifying sparse dynamics using least squares kernel collocation [70.13783231186183]
We develop an all-at-once modeling framework for learning systems of ordinary differential equations (ODE) from scarce, partial, and noisy observations of the states. The proposed methodology combines sparse recovery strategies for the ODE over a function library with techniques from reproducing kernel Hilbert space (RKHS) theory for estimating the state and discretizing the ODE.
arXiv Detail & Related papers (2025-11-23T18:04:15Z) - Discovering Governing Equations in the Presence of Uncertainty [11.752763800308276]
In this work, we theorize that accounting for system variability together with measurement noise is the key to consistently discovering the governing equations underlying dynamical systems. We show that SIP identifies the correct equations more consistently, by an average of 82%, relative to the Sparse Identification of Nonlinear Dynamics (SINDy) approach and its variant.
arXiv Detail & Related papers (2025-07-13T18:31:25Z) - A Data-Driven Framework for Discovering Fractional Differential Equations in Complex Systems [8.206685537936078]
This study introduces a stepwise data-driven framework for discovering fractional differential equations (FDEs) directly from data. Our framework applies deep neural networks as surrogate models for denoising and reconstructing sparse and noisy observations. We validate the framework across various datasets, including synthetic anomalous diffusion data and experimental data on the creep behavior of frozen soils.
arXiv Detail & Related papers (2024-12-05T08:38:30Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Signature Kernel Conditional Independence Tests in Causal Discovery for Stochastic Processes [7.103713918313219]
We develop conditional independence (CI) constraints on coordinate processes over selected intervals. We provide a sound and complete causal discovery algorithm, capable of handling both fully and partially observed data. We also propose a flexible, consistent signature kernel-based CI test to infer these constraints from data.
arXiv Detail & Related papers (2024-02-28T16:58:31Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling. This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space. We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Extracting Governing Laws from Sample Path Data of Non-Gaussian Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric Lévy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
arXiv Detail & Related papers (2021-07-21T14:50:36Z) - A Data-Driven Approach for Discovering Stochastic Dynamical Systems with Non-Gaussian Lévy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient and jump measure in terms of sample path statistics.
We then design a numerical algorithm to compute the drift, diffusion coefficient and jump measure, and thus extract a governing equation with Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences arising from its use.