Rapid Bayesian identification of sparse nonlinear dynamics from scarce and noisy data
- URL: http://arxiv.org/abs/2402.15357v2
- Date: Mon, 23 Sep 2024 17:51:21 GMT
- Title: Rapid Bayesian identification of sparse nonlinear dynamics from scarce and noisy data
- Authors: Lloyd Fung, Urban Fasel, Matthew P. Juniper
- Abstract summary: We recast the SINDy method within a Bayesian framework and use Gaussian approximations for the prior and likelihood to speed up computation.
The resulting method, Bayesian-SINDy, not only quantifies uncertainty in the estimated parameters but is also more robust when learning the correct model from limited and noisy data.
- Score: 2.3018169548556977
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a fast probabilistic framework for identifying differential equations governing the dynamics of observed data. We recast the SINDy method within a Bayesian framework and use Gaussian approximations for the prior and likelihood to speed up computation. The resulting method, Bayesian-SINDy, not only quantifies uncertainty in the parameters estimated but also is more robust when learning the correct model from limited and noisy data. Using both synthetic and real-life examples such as Lynx-Hare population dynamics, we demonstrate the effectiveness of the new framework in learning correct model equations and compare its computational and data efficiency with existing methods. Because Bayesian-SINDy can quickly assimilate data and is robust against noise, it is particularly suitable for biological data and real-time system identification in control. Its probabilistic framework also enables the calculation of information entropy, laying the foundation for an active learning strategy.
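To make the core idea concrete, the sketch below scores candidate sparse models by their Gaussian log evidence and keeps the term subset with the highest marginal likelihood. This is a minimal NumPy illustration, not the authors' implementation: the toy library, the exhaustive subset search, and the fixed noise variance sigma2 and prior precision alpha are assumptions chosen for the example.

```python
import itertools
import numpy as np

def log_evidence(Theta, y, sigma2=0.05**2, alpha=1e-2):
    """Gaussian log marginal likelihood of y = Theta @ w + noise,
    with prior w ~ N(0, alpha^-1 I) and noise variance sigma2."""
    n = len(y)
    C = sigma2 * np.eye(n) + (1.0 / alpha) * Theta @ Theta.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

def select_terms(Theta, y, max_terms=3):
    """Score every small subset of library columns by its evidence."""
    best_cols, best_score = None, -np.inf
    for k in range(1, max_terms + 1):
        for cols in itertools.combinations(range(Theta.shape[1]), k):
            score = log_evidence(Theta[:, list(cols)], y)
            if score > best_score:
                best_cols, best_score = cols, score
    return best_cols

# Toy problem: dx/dt = -2 x + 0.5 x^3, observed through noisy derivatives.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 60)
dx = -2.0 * x + 0.5 * x**3 + 0.05 * rng.standard_normal(60)
library = np.column_stack([np.ones_like(x), x, x**2, x**3])  # terms: 1, x, x^2, x^3
print("selected columns:", select_terms(library, dx))  # x and x^3 expected
```

Bayesian-SINDy itself uses a more efficient sparsification strategy; the point of the sketch is only that term selection can be driven by the evidence rather than by a hard coefficient threshold.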
Related papers
- Discovery and inversion of the viscoelastic wave equation in inhomogeneous media [3.6864706261549127]
Current sparse regression methods may identify inaccurate equations on sparse and noisy datasets.
We propose a hybrid framework that combines two alternating direction optimization phases: discovery and embedding.
The proposed method exhibits excellent robustness and accuracy, even when faced with high levels of noise.
arXiv Detail & Related papers (2024-09-27T01:05:45Z)
- Neural Likelihood Approximation for Integer Valued Time Series Data [0.0]
We construct a neural likelihood approximation that can be trained using unconditional simulation of the underlying model.
We demonstrate our method by performing inference on a number of ecological and epidemiological models.
arXiv Detail & Related papers (2023-10-19T07:51:39Z)
- Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z)
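As a rough illustration of the kernel-regression ingredient in the entry above (not KBASS itself, which adds spike-and-slab priors and EP-EM inference), a kernel ridge regressor with an assumed RBF kernel can reconstruct a smooth function from sparse, noisy samples:

```python
import numpy as np

def rbf_kernel(a, b, ell=0.6):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def kernel_ridge(t, y, lam=1e-3):
    """Fit noisy samples y(t); return a callable smooth estimate."""
    weights = np.linalg.solve(rbf_kernel(t, t) + lam * np.eye(t.size), y)
    return lambda s: rbf_kernel(np.atleast_1d(s), t) @ weights

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 2.0 * np.pi, 25))      # sparse sample times
y = np.sin(t) + 0.1 * rng.standard_normal(t.size)   # noisy observations
f = kernel_ridge(t, y)
print(float(f(np.pi / 2)))  # should land near sin(pi/2) = 1
```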
- Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss of classifying real data against artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z)
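A minimal sketch of the NCE objective mentioned above: a logistic loss whose logit is the unnormalized log model density minus the known noise log density, with the unknown log-normalizer absorbed into a learnable constant c. The toy model, noise distribution, and parameterization are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
x_data = rng.normal(1.0, 0.7, 2000)   # draws from the unknown data distribution
x_noise = rng.normal(0.0, 2.0, 2000)  # draws from a known, easy-to-sample noise density
log_noise = lambda x: -0.5 * (x / 2.0)**2 - np.log(2.0) - 0.5 * np.log(2.0 * np.pi)

def log_model(x, theta):
    """Unnormalized Gaussian log-density; c absorbs the unknown log-normalizer."""
    mu, log_s2, c = theta
    return -0.5 * (x - mu)**2 / np.exp(log_s2) + c

def nce_loss(theta):
    """Logistic loss with logit = log f(x; theta) - log p_noise(x)."""
    ld = log_model(x_data, theta) - log_noise(x_data)    # label 1: real data
    ln = log_model(x_noise, theta) - log_noise(x_noise)  # label 0: noise
    return np.mean(np.logaddexp(0.0, -ld)) + np.mean(np.logaddexp(0.0, ln))

theta_true = (1.0, np.log(0.49), -0.5 * np.log(2.0 * np.pi * 0.49))
theta_off = (0.0, 0.0, 0.0)
print(nce_loss(theta_true), "should be below", nce_loss(theta_off))
```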
- Bayesian Spline Learning for Equation Discovery of Nonlinear Dynamics with Quantified Uncertainty [8.815974147041048]
We develop a novel framework to identify parsimonious governing equations of nonlinear (spatiotemporal) dynamics from sparse, noisy data with quantified uncertainty.
The proposed algorithm is evaluated on multiple nonlinear dynamical systems governed by canonical ordinary and partial differential equations.
arXiv Detail & Related papers (2022-10-14T20:37:36Z)
- A Causality-Based Learning Approach for Discovering the Underlying Dynamics of Complex Systems from Partial Observations with Stochastic Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
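A small illustration of the local-versus-global distinction above, assuming SciPy is available: Savitzky-Golay filtering (local, sliding window) versus a single smoothing spline fitted to the entire record (global). The signal, noise level, and tuning parameters are arbitrary choices, not those of the study.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 400)
x_true = np.sin(t) * np.exp(-0.1 * t)
x_noisy = x_true + 0.1 * rng.standard_normal(t.size)

# Local smoothing: low-order polynomial fit in a sliding window.
x_local = savgol_filter(x_noisy, window_length=21, polyorder=3)

# Global smoothing: a single smoothing spline fitted to the whole record.
x_global = UnivariateSpline(t, x_noisy, s=t.size * 0.1**2)(t)

for name, est in [("local (Savitzky-Golay)", x_local), ("global (spline)", x_global)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - x_true)**2)))
```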
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
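To make the ratio-estimation idea above concrete, the sketch below trains a logistic-regression classifier to separate joint pairs (theta, x) from shuffled pairs; its logit then approximates log p(x|theta) - log p(x). The toy simulator, quadratic features, and plain gradient descent are assumptions chosen to keep the example self-contained, not the MINIMALIST method itself.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, 4000)      # parameters drawn from the prior
x = theta + rng.standard_normal(4000)   # toy simulator: x ~ N(theta, 1)
x_shuffled = rng.permutation(x)         # breaks the (theta, x) pairing

def feats(th, xx):
    # Quadratic features; enough to represent the exact log-ratio in this Gaussian toy.
    return np.column_stack([np.ones_like(th), th, xx, th * xx, th**2, xx**2])

F = np.vstack([feats(theta, x), feats(theta, x_shuffled)])
labels = np.concatenate([np.ones(4000), np.zeros(4000)])  # 1 = joint, 0 = marginal

# Logistic regression by gradient descent; the learned logit approximates
# log r(x, theta) = log p(x | theta) - log p(x).
w = np.zeros(F.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.05 * F.T @ (p - labels) / labels.size

logit = feats(np.array([1.0]), np.array([1.0])) @ w
print("log-ratio at (theta=1, x=1):", float(logit))  # analytic value here is about 0.6
```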
- Probabilistic robust linear quadratic regulators with Gaussian processes [73.0364959221845]
Probabilistic models such as Gaussian processes (GPs) are powerful tools to learn unknown dynamical systems from data for subsequent use in control design.
We present a novel controller synthesis for linearized GP dynamics that yields robust controllers with respect to a probabilistic stability margin.
arXiv Detail & Related papers (2021-05-17T08:36:18Z)
- DEALIO: Data-Efficient Adversarial Learning for Imitation from Observation [57.358212277226315]
In imitation learning from observation (IfO), a learning agent seeks to imitate a demonstrating agent using only observations of the demonstrated behavior, without access to the control signals generated by the demonstrator.
Recent methods based on adversarial imitation learning have led to state-of-the-art performance on IfO problems, but they typically suffer from high sample complexity due to a reliance on data-inefficient, model-free reinforcement learning algorithms.
This issue makes them impractical to deploy in real-world settings, where gathering samples can incur high costs in terms of time, energy, and risk.
We propose a more data-efficient IfO algorithm.
arXiv Detail & Related papers (2021-03-31T23:46:32Z)
- Automatic Differentiation to Simultaneously Identify Nonlinear Dynamics and Extract Noise Probability Distributions from Data [4.996878640124385]
SINDy is a framework for the discovery of parsimonious dynamic models and equations from time-series data.
We develop a variant of the SINDy algorithm that integrates automatic differentiation and recent time-stepping constraints motivated by Rudy et al.
We show the method can identify a diversity of probability distributions including Gaussian, uniform, Gamma, and Rayleigh.
arXiv Detail & Related papers (2020-09-12T23:52:25Z)
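For reference, here is a minimal sketch of the basic SINDy regression that the papers above build on: a library of candidate terms combined with sequentially thresholded least squares (STLSQ). The toy oscillator, library, and threshold are illustrative assumptions; the noise-extraction variant described in the entry above is not reproduced here.

```python
import numpy as np

def stlsq(Theta, dX, threshold=0.2, n_iter=10):
    """Sequentially thresholded least squares, the core SINDy regression."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(n_iter):
        Xi[np.abs(Xi) < threshold] = 0.0
        for k in range(dX.shape[1]):             # refit the surviving terms per equation
            keep = np.abs(Xi[:, k]) >= threshold
            if keep.any():
                Xi[keep, k] = np.linalg.lstsq(Theta[:, keep], dX[:, k], rcond=None)[0]
    return Xi

# Toy data: damped oscillator dx/dt = -0.5 x + 2 y, dy/dt = -2 x - 0.5 y.
t = np.linspace(0.0, 10.0, 2000)
X = np.column_stack([np.exp(-0.5 * t) * np.cos(2.0 * t),
                     -np.exp(-0.5 * t) * np.sin(2.0 * t)])
dX = np.gradient(X, t, axis=0)                   # simple numerical derivatives
Theta = np.column_stack([np.ones(t.size), X, X**2, X[:, :1] * X[:, 1:]])  # 1, x, y, x^2, y^2, xy
print(np.round(stlsq(Theta, dX), 2))             # nonzero rows should be the x and y terms only
```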
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.