Benchmarking sparse system identification with low-dimensional chaos
- URL: http://arxiv.org/abs/2302.10787v1
- Date: Sat, 4 Feb 2023 18:49:52 GMT
- Title: Benchmarking sparse system identification with low-dimensional chaos
- Authors: Alan A. Kaptanoglu and Lanyue Zhang and Zachary G. Nicolaou and Urban
Fasel and Steven L. Brunton
- Abstract summary: We systematically benchmark sparse regression variants by utilizing the dysts standardized database of chaotic systems.
We demonstrate how this open-source tool can be used to quantitatively compare different methods of system identification.
In all cases, we use ensembling to improve the noise robustness of SINDy and provide statistical comparisons.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sparse system identification is the data-driven process of obtaining
parsimonious differential equations that describe the evolution of a dynamical
system, balancing model complexity and accuracy. There has been rapid
innovation in system identification across scientific domains, but there
remains a gap in the literature for large-scale methodological comparisons that
are evaluated on a variety of dynamical systems. In this work, we
systematically benchmark sparse regression variants by utilizing the dysts
standardized database of chaotic systems. In particular, we demonstrate how
this open-source tool can be used to quantitatively compare different methods
of system identification. To illustrate how this benchmark can be utilized, we
perform a large comparison of four algorithms for solving the sparse
identification of nonlinear dynamics (SINDy) optimization problem, finding
strong performance of the original algorithm and a recent mixed-integer
discrete algorithm. In all cases, we use ensembling to improve the noise
robustness of SINDy and provide statistical comparisons. In addition, we show
very compelling evidence that the weak SINDy formulation provides significant
improvements over the traditional method, even on clean data. Lastly, we
investigate how Pareto-optimal models generated from SINDy algorithms depend on
the properties of the equations, finding that the performance shows no
significant dependence on a set of dynamical properties that quantify the
amount of chaos, scale separation, degree of nonlinearity, and the syntactic
complexity.
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for L^2, L^infinity, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Automatically identifying ordinary differential equations from data [0.0]
We propose a methodology to identify dynamical laws by integrating denoising techniques to smooth the signal.
We evaluate our method on well-known ordinary differential equations with an ensemble of random initial conditions.
arXiv Detail & Related papers (2023-04-21T18:00:03Z) - On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z) - Bayesian Spline Learning for Equation Discovery of Nonlinear Dynamics
with Quantified Uncertainty [8.815974147041048]
We develop a novel framework to identify parsimonious governing equations of nonlinear (spatiotemporal) dynamics from sparse, noisy data with quantified uncertainty.
The proposed algorithm is evaluated on multiple nonlinear dynamical systems governed by canonical ordinary and partial differential equations.
arXiv Detail & Related papers (2022-10-14T20:37:36Z) - A Causality-Based Learning Approach for Discovering the Underlying
Dynamics of Complex Systems from Partial Observations with Stochastic
Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear
Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z) - Learning to Assimilate in Chaotic Dynamical Systems [0.0]
We introduce amortized assimilation, a framework for learning to assimilate in dynamical systems from sequences of noisy observations.
We motivate the framework by extending powerful results from self-supervised denoising to the dynamical systems setting through the use of differentiable simulation.
arXiv Detail & Related papers (2021-11-01T16:07:34Z) - Fractal Structure and Generalization Properties of Stochastic
Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the 'complexity' of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the role of stochasticity in its success is still unclear.
We show that heavy tails commonly arise in the parameters of discrete-time stochastic optimization due to multiplicative noise.
A detailed analysis is conducted of key factors, including step size and data, with consistent results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.