An end-to-end deep learning approach for extracting stochastic dynamical
systems with $\alpha$-stable L\'evy noise
- URL: http://arxiv.org/abs/2201.13114v1
- Date: Mon, 31 Jan 2022 10:51:25 GMT
- Title: An end-to-end deep learning approach for extracting stochastic dynamical
systems with $\alpha$-stable L\'evy noise
- Authors: Cheng Fang, Yubin Lu, Ting Gao, Jinqiao Duan
- Abstract summary: In this work, we identify dynamical systems driven by $\alpha$-stable L\'evy noise from only random pairwise data.
Our innovations include: (1) designing a deep learning approach to learn both drift and diffusion terms for L\'evy induced noise with $\alpha$ across all values, (2) learning complex multiplicative noise without restrictions on small noise intensity, and (3) proposing an end-to-end complete framework for systems identification.
- Score: 5.815325960286111
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, extracting data-driven governing laws of dynamical systems through
deep learning frameworks has gained a lot of attention in various fields.
Moreover, a growing body of work extends these techniques from deterministic
dynamical systems to stochastic ones, especially those driven by
non-Gaussian multiplicative noise. However, many log-likelihood based
algorithms that work well in the Gaussian case cannot be directly extended to
non-Gaussian settings, where they suffer from high error and poor convergence.
In this work, we overcome some of these challenges and identify stochastic
dynamical systems driven by $\alpha$-stable L\'evy noise from only random
pairwise data. Our innovations include: (1) designing a deep learning approach
to learn both drift and diffusion terms for L\'evy induced noise with $\alpha$
across all values, (2) learning complex multiplicative noise without
restrictions on small noise intensity, (3) proposing an end-to-end complete
framework for stochastic system identification under a general input data
assumption, namely that the data are driven by an $\alpha$-stable random variable. Finally, numerical
experiments and comparisons with the nonlocal Kramers-Moyal formulas combined with the
moment generating function confirm the effectiveness of our method.
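For orientation, and not quoted from the paper, systems of the kind described above are usually written as stochastic differential equations of the form

$$ \mathrm{d}X_t = b(X_t)\,\mathrm{d}t + \sigma(X_t)\,\mathrm{d}L_t^{\alpha}, \qquad \alpha \in (0,2), $$

where $b$ is the drift, $\sigma$ the (possibly multiplicative) noise intensity, and $L_t^{\alpha}$ a symmetric $\alpha$-stable L\'evy process. By self-similarity of $L_t^{\alpha}$, a single Euler-Maruyama step of size $h$ reads

$$ X_{t+h} \approx X_t + b(X_t)\,h + \sigma(X_t)\,h^{1/\alpha}\,\xi, \qquad \xi \sim S_{\alpha}(1,0,0), $$

so the random pairwise data mentioned in the abstract are pairs $(X_t, X_{t+h})$ whose conditional law carries information about both coefficients.
The following is a minimal sketch of how such pairwise data can be simulated, assuming SciPy's levy_stable sampler; the one-dimensional drift, noise intensity, and all numerical parameters below are hypothetical illustrations, not values from the paper.

```python
# Minimal sketch (illustrative, not from the paper): generate pairwise
# observations (x_0, x_h) of a 1D SDE driven by symmetric alpha-stable noise
# using one Euler-Maruyama step.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

alpha = 1.5        # stability index in (0, 2); alpha = 2 would be Gaussian
h = 0.01           # time gap between the two observations in each pair
n_pairs = 10_000   # number of (x_0, x_h) pairs

def drift(x):
    # hypothetical drift: gradient of a double-well potential
    return x - x**3

def noise_intensity(x):
    # hypothetical multiplicative noise intensity
    return 1.0 + 0.5 * np.abs(x)

# initial states drawn from some distribution of interest
x0 = rng.uniform(-2.0, 2.0, size=n_pairs)

# standard symmetric alpha-stable increments; h**(1/alpha) is the
# self-similar scaling of the Levy process over a step of length h
xi = levy_stable.rvs(alpha, 0.0, size=n_pairs, random_state=rng)
x_h = x0 + drift(x0) * h + noise_intensity(x0) * h**(1.0 / alpha) * xi

pairs = np.stack([x0, x_h], axis=1)  # the "random pairwise data"
print(pairs.shape)                   # (10000, 2)
```

An end-to-end method in the spirit of the abstract would fit neural networks for $b$ and $\sigma$ to such pairs; the paper's actual network architecture and training objective are not reproduced here.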
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Weak Collocation Regression for Inferring Stochastic Dynamics with
L\'{e}vy Noise [8.15076267771005]
We propose a weak form of the Fokker-Planck (FP) equation for extracting dynamics with L\'evy noise.
Our approach can simultaneously distinguish mixed noise types, even in multi-dimensional problems.
arXiv Detail & Related papers (2024-03-13T06:54:38Z) - Optimistic Active Exploration of Dynamical Systems [52.91573056896633]
We develop an algorithm for active exploration called OPAX.
We show how OPAX can be reduced to an optimal control problem that can be solved at each episode.
Our experiments show that OPAX is not only theoretically sound but also performs well for zero-shot planning on novel downstream tasks.
arXiv Detail & Related papers (2023-06-21T16:26:59Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear
Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z) - Extracting stochastic dynamical systems with $\alpha$-stable L\'evy
noise from data [14.230182518492311]
We propose a data-driven method to extract systems with $\alpha$-stable L\'evy noise from short burst data.
More specifically, we first estimate the L\'evy jump measure and noise intensity.
Then we approximate the drift coefficient by combining nonlocal Kramers-Moyal formulas with normalizing flows (a schematic form of these formulas is sketched after this list).
arXiv Detail & Related papers (2021-09-30T06:57:42Z) - Extracting Governing Laws from Sample Path Data of Non-Gaussian
Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian L\'evy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric L\'evy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
arXiv Detail & Related papers (2021-07-21T14:50:36Z) - High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds with dependence on confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
arXiv Detail & Related papers (2021-06-10T17:54:21Z) - Learning based signal detection for MIMO systems with unknown noise
statistics [84.02122699723536]
This paper aims to devise a generalized maximum likelihood (ML) estimator to robustly detect signals with unknown noise statistics.
In practice, there is little or even no statistical knowledge on the system noise, which in many cases is non-Gaussian, impulsive and not analyzable.
Our framework is driven by an unsupervised learning approach, where only the noise samples are required.
arXiv Detail & Related papers (2021-01-21T04:48:15Z) - Automatic Differentiation to Simultaneously Identify Nonlinear Dynamics
and Extract Noise Probability Distributions from Data [4.996878640124385]
SINDy is a framework for the discovery of parsimonious dynamic models and equations from time-series data.
We develop a variant of the SINDy algorithm that integrates automatic differentiation with recent time-stepping constraints proposed by Rudy et al.
We show the method can identify a diversity of probability distributions including Gaussian, uniform, Gamma, and Rayleigh.
arXiv Detail & Related papers (2020-09-12T23:52:25Z) - A Data-Driven Approach for Discovering Stochastic Dynamical Systems with
Non-Gaussian L\'evy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient and jump measure.
We then design a numerical algorithm to compute the drift, diffusion coefficient and jump measure, and thus extract a governing equation with Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z)
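Several entries above, as well as the comparison baseline in the main abstract, rely on nonlocal Kramers-Moyal formulas. Stated schematically only (for symmetric noise, and omitting the normalization constants and correction terms that the cited papers make precise), these formulas recover the unknown coefficients from short-time conditional statistics truncated to a ball of radius $m$:

$$ b(x) = \lim_{h \to 0} \frac{1}{h}\, \mathbb{E}\!\left[ (X_{t+h} - X_t)\, \mathbf{1}_{\{|X_{t+h} - X_t| < m\}} \,\middle|\, X_t = x \right], $$

$$ \lim_{h \to 0} \frac{1}{h}\, \mathbb{P}\!\left( |X_{t+h} - X_t| \ge m \,\middle|\, X_t = x \right) = \int_{|z| \ge m} \nu(x, \mathrm{d}z), $$

where $\nu(x,\cdot)$ denotes the (possibly state-dependent) L\'evy jump measure; an analogous truncated second-moment limit yields the Gaussian diffusion plus an $m$-dependent jump contribution. The works listed above estimate these conditional quantities from data, for example with normalizing flows or regression.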