SINDy-PI: A Robust Algorithm for Parallel Implicit Sparse Identification
of Nonlinear Dynamics
- URL: http://arxiv.org/abs/2004.02322v2
- Date: Tue, 29 Sep 2020 23:33:14 GMT
- Title: SINDy-PI: A Robust Algorithm for Parallel Implicit Sparse Identification
of Nonlinear Dynamics
- Authors: Kadierdan Kaheman, J. Nathan Kutz, Steven L. Brunton
- Abstract summary: We develop a robust variant of the SINDy algorithm to identify implicit dynamics and rational nonlinearities.
We show that the proposed approach is several orders of magnitude more noise robust than previous approaches.
- Score: 4.996878640124385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurately modeling the nonlinear dynamics of a system from measurement data
is a challenging yet vital topic. The sparse identification of nonlinear
dynamics (SINDy) algorithm is one approach to discover dynamical systems models
from data. Although extensions have been developed to identify implicit
dynamics, or dynamics described by rational functions, these extensions are
extremely sensitive to noise. In this work, we develop SINDy-PI (parallel,
implicit), a robust variant of the SINDy algorithm to identify implicit
dynamics and rational nonlinearities. The SINDy-PI framework includes multiple
optimization algorithms and a principled approach to model selection. We
demonstrate the ability of this algorithm to learn implicit ordinary and
partial differential equations and conservation laws from limited and noisy
data. In particular, we show that the proposed approach is several orders of
magnitude more noise robust than previous approaches, and may be used to
identify a class of complex ODE and PDE dynamics that were previously
unattainable with SINDy, including the double pendulum dynamics and the
Belousov-Zhabotinsky (BZ) reaction.
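As a rough illustration of the parallel, implicit formulation, the sketch below regresses each candidate library term against the remaining terms and returns one candidate implicit model per term; model selection (e.g., validation error versus sparsity) then picks the final equation. The data, threshold, and choice of sequentially thresholded least squares are placeholders rather than the constrained and robust optimizers developed in the paper; a maintained SINDy-PI implementation is available in the PySINDy library.

```python
import numpy as np

def stlsq(A, b, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares, a common SINDy sparse solver."""
    xi = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(A[:, big], b, rcond=None)[0]
    return xi

def sindy_pi(Theta, threshold=0.1):
    """Schematic SINDy-PI loop.

    Theta: (n_samples, n_terms) library evaluated on both state and derivative
    data, so implicit / rational dynamics f(x, x_dot) = 0 can be represented.
    Each column is treated in turn as the left-hand side and regressed on the
    remaining columns; the loop is embarrassingly parallel over columns.
    """
    n_terms = Theta.shape[1]
    candidates = []
    for j in range(n_terms):
        mask = np.arange(n_terms) != j
        xi = stlsq(Theta[:, mask], Theta[:, j], threshold)
        residual = np.linalg.norm(Theta[:, mask] @ xi - Theta[:, j])
        candidates.append((j, xi, residual))
    return candidates
```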
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds in the $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
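For intuition about the estimation problem addressed here (not the paper's actual estimator), the snippet below recovers the drift and squared diffusion of a toy one-dimensional SDE from conditional increment moments, the classical Kramers-Moyal construction; the drift and diffusion functions are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 1e-3, 100_000
b = lambda x: -x                          # assumed drift
sigma = lambda x: 0.5 + 0.3 * np.abs(x)   # assumed non-uniform diffusion

# Simulate dX = b(X) dt + sigma(X) dW with Euler-Maruyama.
x = np.empty(n)
x[0] = 0.0
for k in range(n - 1):
    x[k + 1] = x[k] + b(x[k]) * dt + sigma(x[k]) * np.sqrt(dt) * rng.standard_normal()

# Bin the states and estimate drift and squared diffusion per bin.
dx = np.diff(x)
bins = np.linspace(x.min(), x.max(), 30)
idx = np.digitize(x[:-1], bins)
drift_hat, diff2_hat = {}, {}
for i in np.unique(idx):
    sel = idx == i
    if sel.sum() < 500:
        continue
    centre = x[:-1][sel].mean()
    drift_hat[centre] = dx[sel].mean() / dt           # ~ b(x) in this bin
    diff2_hat[centre] = (dx[sel] ** 2).mean() / dt    # ~ sigma(x)^2 in this bin
```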
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- ADAM-SINDy: An Efficient Optimization Framework for Parameterized Nonlinear Dynamical System Identification [0.0]
This paper introduces a novel method within the SINDy framework, termed ADAM-SINDy.
ADAM-SINDy synthesizes the strengths of established approaches by employing the ADAM optimization algorithm.
Results demonstrate significant improvements in identifying parameterized dynamical systems.
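A minimal sketch of the gradient-based idea, assuming PyTorch and a made-up library with one trainable frequency parameter `omega`: the coefficient matrix and the nonlinear parameter are optimized jointly with Adam under an L1 penalty. This illustrates the general approach, not the ADAM-SINDy implementation itself.

```python
import torch

X = torch.randn(500, 2)        # state measurements (placeholder data)
Xdot = torch.randn(500, 2)     # estimated time derivatives (placeholder data)

omega = torch.tensor(1.0, requires_grad=True)   # trainable library parameter
Xi = torch.zeros(4, 2, requires_grad=True)      # sparse coefficient matrix

opt = torch.optim.Adam([Xi, omega], lr=1e-2)
for _ in range(2000):
    # Parameterized library: the nonlinear parameter enters the candidate terms.
    Theta = torch.cat([X, torch.sin(omega * X)], dim=1)
    loss = ((Theta @ Xi - Xdot) ** 2).mean() + 1e-3 * Xi.abs().sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```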
arXiv Detail & Related papers (2024-10-21T21:36:17Z)
- Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems [3.1484174280822845]
We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems.
Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs.
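As a crude stand-in for the idea of time-varying sparse coefficients (dynamic SINDy itself uses variational inference rather than windowing), the sketch below simply refits thresholded least-squares coefficients over sliding windows of the data; all shapes and thresholds are assumptions.

```python
import numpy as np

def windowed_sindy(Theta, Xdot, window=200, step=50, threshold=0.1):
    """Return sparse coefficients refit on sliding windows, giving a rough
    picture of how the governing terms drift over time."""
    coeffs = []
    for start in range(0, Theta.shape[0] - window, step):
        sl = slice(start, start + window)
        xi = np.linalg.lstsq(Theta[sl], Xdot[sl], rcond=None)[0]
        xi[np.abs(xi) < threshold] = 0.0
        coeffs.append(xi)
    # Shape: (n_windows, n_terms, n_states) -- coefficients as a function of time.
    return np.stack(coeffs)
```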
arXiv Detail & Related papers (2024-10-02T23:00:00Z)
- Probabilistic Decomposed Linear Dynamical Systems for Robust Discovery of Latent Neural Dynamics [5.841659874892801]
Time-varying linear state-space models are powerful tools for obtaining mathematically interpretable representations of neural signals.
Existing methods for latent variable estimation are not robust to dynamical noise and system nonlinearity.
We propose a probabilistic approach to latent variable estimation in decomposed models that improves robustness against dynamical noise.
arXiv Detail & Related papers (2024-08-29T18:58:39Z)
- Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS).
Our method builds on previous work modeling the latent state evolution via a stochastic differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs).
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
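A minimal sketch of the recover-parameters-by-autodiff step, assuming PyTorch and a made-up closed-form `surrogate` standing in for the trained network: unknown physical parameters are fitted to measured data by gradient descent through the differentiable model.

```python
import torch

def surrogate(params, q):
    """Placeholder differentiable model: maps physical parameters to a signal."""
    J, D = params                                       # assumed model parameters
    return J * torch.sin(q) ** 2 + D * torch.cos(q)

q = torch.linspace(0.0, 3.14, 100)
# Synthetic "experimental" data generated from known parameters plus noise.
measured = surrogate(torch.tensor([1.3, 0.4]), q) + 0.01 * torch.randn(100)

params = torch.tensor([0.5, 0.0], requires_grad=True)
opt = torch.optim.Adam([params], lr=0.05)
for _ in range(500):
    loss = ((surrogate(params, q) - measured) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```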
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Bayesian Spline Learning for Equation Discovery of Nonlinear Dynamics with Quantified Uncertainty [8.815974147041048]
We develop a novel framework to identify parsimonious governing equations of nonlinear (spatiotemporal) dynamics from sparse, noisy data with quantified uncertainty.
The proposed algorithm is evaluated on multiple nonlinear dynamical systems governed by canonical ordinary and partial differential equations.
arXiv Detail & Related papers (2022-10-14T20:37:36Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
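To make the local-versus-global distinction concrete (these are generic choices, not necessarily the exact methods compared in the paper), the snippet below denoises the same noisy signal with a local Savitzky-Golay filter and a global smoothing spline.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.interpolate import UnivariateSpline

t = np.linspace(0, 10, 1000)
x_noisy = np.sin(t) + 0.1 * np.random.randn(t.size)

# Local method: polynomial fit within a sliding window.
x_local = savgol_filter(x_noisy, window_length=51, polyorder=3)

# Global method: one smoothing spline fit to the entire record
# (smoothing factor set from an assumed noise level of 0.1).
x_global = UnivariateSpline(t, x_noisy, s=t.size * 0.1 ** 2)(t)
```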
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- Active Learning for Nonlinear System Identification with Guarantees [102.43355665393067]
We study a class of nonlinear dynamical systems whose state transitions depend linearly on a known feature embedding of state-action pairs.
We propose an active learning approach that achieves this by repeating three steps: trajectory planning, trajectory tracking, and re-estimation of the system from all available data.
We show that our method estimates nonlinear dynamical systems at a parametric rate, similar to the statistical rate of standard linear regression.
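A schematic of the three-step loop for a system of the form x_{t+1} = A phi(x_t, u_t), with planning and tracking replaced by random excitation purely for illustration; the feature map, dimensions, and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = lambda x, u: np.concatenate([x, u, np.sin(x)])   # known feature embedding
A_true = 0.3 * rng.normal(size=(2, 5))                 # unknown true dynamics
data_phi, data_next = [], []

for episode in range(20):
    x = np.zeros(2)
    for t in range(50):
        u = rng.normal(size=1)               # stand-in for planned excitation
        f = phi(x, u)
        x_next = A_true @ f + 0.01 * rng.normal(size=2)
        data_phi.append(f)
        data_next.append(x_next)
        x = x_next
    # Re-estimate the dynamics from ALL data collected so far (least squares).
    Phi, Y = np.stack(data_phi), np.stack(data_next)
    A_hat = np.linalg.lstsq(Phi, Y, rcond=None)[0].T
```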
arXiv Detail & Related papers (2020-06-18T04:54:11Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that heavy-tailed behaviour commonly arises in discrete-time stochastic optimization due to multiplicative noise.
A detailed analysis describes how key factors, including step size and data, shape this behaviour, with similar results observed on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)