Automatic Differentiation to Simultaneously Identify Nonlinear Dynamics
and Extract Noise Probability Distributions from Data
- URL: http://arxiv.org/abs/2009.08810v2
- Date: Tue, 29 Sep 2020 23:17:17 GMT
- Authors: Kadierdan Kaheman, Steven L. Brunton, J. Nathan Kutz
- Abstract summary: SINDy is a framework for the discovery of parsimonious dynamic models and equations from time-series data.
We develop a variant of the SINDy algorithm that integrates automatic differentiation and recent time-stepping constraints motivated by Rudy et al.
We show the method can identify a diversity of probability distributions including Gaussian, uniform, Gamma, and Rayleigh.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The sparse identification of nonlinear dynamics (SINDy) is a regression
framework for the discovery of parsimonious dynamic models and governing
equations from time-series data. As with all system identification methods,
noisy measurements compromise the accuracy and robustness of the model
discovery procedure. In this work, we develop a variant of the SINDy algorithm
that integrates automatic differentiation and recent time-stepping constraints
motivated by Rudy et al. for simultaneously (i) denoising the data, (ii)
learning and parametrizing the noise probability distribution, and (iii)
identifying the underlying parsimonious dynamical system responsible for
generating the time-series data. Thus within an integrated optimization
framework, noise can be separated from signal, resulting in an architecture
that is approximately twice as robust to noise as state-of-the-art methods,
handling as much as 40% noise on a given time-series signal and explicitly
parametrizing the noise probability distribution. We demonstrate this approach
on several numerical examples, from Lotka-Volterra models to the
spatio-temporal Lorenz 96 model. Further, we show the method can identify a
diversity of probability distributions including Gaussian, uniform, Gamma, and
Rayleigh.
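The regression at the heart of SINDy can be illustrated with sequentially thresholded least squares over a library of candidate terms. The toy system and names below are an illustrative sketch, not code from the paper:

```python
import numpy as np

def sindy(Theta, dXdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares, the core SINDy solver.

    Theta: (n_samples, n_features) library of candidate terms
    dXdt:  (n_samples, n_states)   estimated time derivatives
    """
    Xi, *_ = np.linalg.lstsq(Theta, dXdt, rcond=None)
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold     # prune weak coefficients
        Xi[small] = 0.0
        for k in range(dXdt.shape[1]):     # refit the surviving terms
            big = ~small[:, k]
            if big.any():
                Xi[big, k], *_ = np.linalg.lstsq(Theta[:, big], dXdt[:, k], rcond=None)
    return Xi

# toy example: recover dx/dt = -2x from noise-free data
t = np.linspace(0.0, 1.0, 200)
x = np.exp(-2.0 * t).reshape(-1, 1)
dx = -2.0 * x
Theta = np.hstack([np.ones_like(x), x, x ** 2])   # candidate terms [1, x, x^2]
Xi = sindy(Theta, dx)
```

Each pass zeroes weak coefficients and refits the survivors, which is what drives the parsimony of the recovered model.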
Related papers
- Automating the Discovery of Partial Differential Equations in Dynamical Systems [0.0]
We present an extension to the ARGOS framework, ARGOS-RAL, which leverages sparse regression with the recurrent adaptive lasso to identify PDEs automatically.
We rigorously evaluate the performance of ARGOS-RAL in identifying canonical PDEs under various noise levels and sample sizes.
Our results show that ARGOS-RAL effectively and reliably identifies the underlying PDEs from data, outperforming the sequential threshold ridge regression method in most cases.
arXiv Detail & Related papers (2024-04-25T09:23:03Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
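The mean-shift side of that relationship is easy to make concrete: mean-shift repeatedly moves a point to the kernel-weighted average of the data, ascending the kernel density estimate toward a mode. A minimal one-dimensional sketch on hypothetical toy data, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal(3000)   # toy 1-D samples; the KDE mode sits near 0

def mean_shift(x, data, bandwidth=1.0, n_iter=50):
    """Move x to the Gaussian-kernel-weighted mean of the data until it
    settles at a mode of the kernel density estimate."""
    for _ in range(n_iter):
        w = np.exp(-0.5 * ((data - x) / bandwidth) ** 2)
        x = np.sum(w * data) / np.sum(w)
    return float(x)

mode = mean_shift(2.5, data)   # started far from the bulk, climbs to the mode
```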
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Automatically identifying ordinary differential equations from data [0.0]
We propose a methodology to identify dynamical laws by integrating denoising techniques to smooth the signal.
We evaluate our method on well-known ordinary differential equations with an ensemble of random initial conditions.
arXiv Detail & Related papers (2023-04-21T18:00:03Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
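The recover-parameters-by-gradient-descent idea behind that framework can be sketched with a single-parameter differentiable model. The decay-rate example below is hypothetical, and the derivative is written out by hand where an automatic-differentiation framework would supply it:

```python
import numpy as np

# toy "experiment": a decaying signal generated with an unknown rate k_true
t = np.linspace(0.0, 2.0, 100)
k_true = 1.5
y = np.exp(-k_true * t)

def loss_grad(k):
    """Loss and hand-written gradient for the model exp(-k*t).
    An autodiff framework would produce this derivative automatically."""
    r = np.exp(-k * t) - y
    loss = np.mean(r ** 2)
    grad = np.mean(2.0 * r * (-t) * np.exp(-k * t))
    return loss, grad

k = 0.5                      # deliberately wrong initial guess
for _ in range(2000):        # plain gradient descent on the fit error
    _, g = loss_grad(k)
    k -= 1.0 * g
```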
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- An end-to-end deep learning approach for extracting stochastic dynamical systems with $\alpha$-stable Lévy noise [5.815325960286111]
In this work, we identify dynamical systems driven by $\alpha$-stable Lévy noise from only random pairwise data.
Our innovations include: (1) designing a deep learning approach to learn both drift and diffusion terms for Lévy-induced noise with $\alpha$ across all values, (2) learning complex multiplicative noise without restrictions on small noise intensity, and (3) proposing an end-to-end complete framework for systems identification.
arXiv Detail & Related papers (2022-01-31T10:51:25Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
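The local-versus-global distinction can be sketched numerically: a moving average smooths each point from a short neighbouring window, while a least-squares polynomial fit uses the entire record. The toy comparison below is illustrative only and does not reproduce the paper's benchmarks:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 400)
clean = np.sin(t)
noisy = clean + 0.2 * rng.standard_normal(t.size)

# local method: moving average over a short window around each point
window = 11
local = np.convolve(noisy, np.ones(window) / window, mode="same")

# global method: least-squares polynomial fit to the entire record
global_fit = np.polyval(np.polyfit(t, noisy, deg=7), t)

def rmse(y):
    return float(np.sqrt(np.mean((y - clean) ** 2)))

rmse_noisy, rmse_local, rmse_global = rmse(noisy), rmse(local), rmse(global_fit)
```

Both denoisers should beat the raw signal on this smooth toy example; the papers' point is that which one wins depends on noise level and sampling.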
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- A toolkit for data-driven discovery of governing equations in high-noise regimes [5.025654873456756]
We consider the data-driven discovery of governing equations from time-series data in the limit of high noise.
We offer two primary contributions, both focused on noisy data acquired from a system $x' = f(x)$.
arXiv Detail & Related papers (2021-11-08T23:32:11Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Score-Based Generative Modeling through Stochastic Differential Equations [114.39209003111723]
We present a differential equation that transforms a complex data distribution to a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
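The forward/reverse SDE pair can be demonstrated in closed form when the data distribution is Gaussian, because the score of the noised marginal is then known exactly (in practice a neural network estimates it). A sketch of a reverse-time sampler for a variance-exploding schedule, with illustrative constants:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, s = 2.0, 0.5                      # toy data distribution N(mu, s^2)

def score(x, sigma):
    # exact score of the noised marginal N(mu, s^2 + sigma^2)
    return (mu - x) / (s ** 2 + sigma ** 2)

# variance-exploding noise schedule, annealed from large to small
sigmas = np.geomspace(10.0, 0.01, 1000)
x = sigmas[0] * rng.standard_normal(5000)   # start from the wide prior
for i in range(len(sigmas) - 1):
    step = sigmas[i] ** 2 - sigmas[i + 1] ** 2   # variance removed this step
    x = x + step * score(x, sigmas[i]) + np.sqrt(step) * rng.standard_normal(x.size)
# x is now approximately distributed as the data
```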
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
- A Data-Driven Approach for Discovering Stochastic Dynamical Systems with Non-Gaussian Lévy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient, and jump measure.
We then design a numerical algorithm to compute the drift coefficient, diffusion coefficient, and jump measure, and thus extract a governing equation with both Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z)
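For the Gaussian part of such dynamics, the drift and diffusion coefficients can already be estimated with classical moment regressions on the increments. A hypothetical Ornstein-Uhlenbeck toy example, not the authors' method (which also recovers the jump measure):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 50_000

# simulate an Ornstein-Uhlenbeck path dx = -theta*x dt + sigma dW (Euler-Maruyama)
x = np.empty(n)
x[0] = 0.0
z = rng.standard_normal(n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * z[i]

dx = np.diff(x)
# drift: regress increments on the state, since dx ~ -theta*x dt + noise
theta_hat = -float(np.sum(x[:-1] * dx) / (dt * np.sum(x[:-1] ** 2)))
# diffusion: quadratic variation of the de-drifted increments
resid = dx + theta_hat * x[:-1] * dt
sigma_hat = float(np.sqrt(np.mean(resid ** 2) / dt))
```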
This list is automatically generated from the titles and abstracts of the papers in this site.