Guaranteed Stable Quadratic Models and their applications in SINDy and
Operator Inference
- URL: http://arxiv.org/abs/2308.13819v2
- Date: Sun, 7 Jan 2024 11:53:37 GMT
- Title: Guaranteed Stable Quadratic Models and their applications in SINDy and
Operator Inference
- Authors: Pawan Goyal and Igor Pontes Duff and Peter Benner
- Abstract summary: We focus on an operator inference methodology that builds dynamical models.
For inference, we aim to learn the operators of a model by setting up an appropriate optimization problem.
We present several numerical examples illustrating the preservation of stability and comparing the proposed methods with the existing state-of-the-art approach to operator inference.
- Score: 9.599029891108229
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scientific machine learning for inferring dynamical systems combines
data-driven modeling, physics-based modeling, and empirical knowledge. It plays
an essential role in engineering design and digital twinning. In this work, we
primarily focus on an operator inference methodology that builds dynamical
models, preferably of low dimension, with a prior hypothesis on the model
structure, often determined by known physics or given by experts. Then, for
inference, we aim to learn the operators of a model by setting up an
appropriate optimization problem. One of the critical properties of dynamical
systems is stability. However, this property is not guaranteed by the inferred
models. In this work, we propose inference formulations to learn quadratic
models, which are stable by design. Precisely, we discuss the parameterization
of quadratic systems that are locally and globally stable. Moreover, for
quadratic systems that have no stable equilibrium yet exhibit bounded trajectories
(e.g., the chaotic Lorenz model), we discuss how to parameterize such bounded behaviors in the learning
process. Using those parameterizations, we set up inference problems, which are
then solved using a gradient-based optimization method. Furthermore, to avoid
numerical derivatives and still learn continuous systems, we make use of an
integral form of differential equations. We present several numerical examples,
illustrating the preservation of stability and comparing the proposed methods with
the existing state-of-the-art operator inference approach. By means of
numerical examples, we also demonstrate how the proposed methods are employed
to discover governing equations and energy-preserving models.
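As an illustration of the kind of stability-by-design parameterization the abstract describes (a minimal sketch, not the paper's exact formulation), a linear operator of the form A = (J - R)Q, with J skew-symmetric and R, Q symmetric positive definite, is guaranteed to be Hurwitz: V(x) = xᵀQx is a Lyapunov function, since AᵀQ + QA = -2QRQ ≺ 0. The matrices S, L, K below are hypothetical unconstrained parameters introduced only for this sketch.

```python
import numpy as np

# Sketch: build a guaranteed-stable linear operator A = (J - R) @ Q from
# unconstrained parameters. J is skew-symmetric, R and Q are symmetric
# positive definite, so V(x) = x^T Q x is a Lyapunov function:
#   A^T Q + Q A = Q (J - R)^T Q + Q (J - R) Q = -2 Q R Q,
# which is negative definite; hence every eigenvalue of A has a negative
# real part for ANY choice of the free parameters S, L, K.
rng = np.random.default_rng(0)
n, eps = 5, 1e-2

S = rng.standard_normal((n, n))   # unconstrained parameter
L = rng.standard_normal((n, n))   # unconstrained parameter
K = rng.standard_normal((n, n))   # unconstrained parameter

J = S - S.T                       # skew-symmetric by construction
R = L @ L.T + eps * np.eye(n)     # symmetric positive definite
Q = K @ K.T + eps * np.eye(n)     # symmetric positive definite

A = (J - R) @ Q
max_real_part = np.linalg.eigvals(A).real.max()
print(max_real_part)              # strictly negative for any S, L, K
```

In a learning setting, the unconstrained parameters are optimized directly (e.g., by gradient descent on a fitting loss), so stability of the inferred model holds by construction rather than being verified after the fact.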
Related papers
- No Equations Needed: Learning System Dynamics Without Relying on Closed-Form ODEs [56.78271181959529]
This paper proposes a conceptual shift to modeling low-dimensional dynamical systems by departing from the traditional two-step modeling process.
Instead of first discovering a closed-form equation and then analyzing it, our approach, direct semantic modeling, predicts the semantic representation of the dynamical system.
Our approach not only simplifies the modeling pipeline but also enhances the transparency and flexibility of the resulting models.
arXiv Detail & Related papers (2025-01-30T18:36:48Z)
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- Parametric model reduction of mean-field and stochastic systems via higher-order action matching [1.1509084774278489]
We learn models of population dynamics of physical systems that feature gradient and mean-field effects.
We show that our approach accurately predicts population dynamics over a wide range of parameters and outperforms state-of-the-art diffusion-based and flow-based modeling.
arXiv Detail & Related papers (2024-10-15T19:05:28Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
arXiv Detail & Related papers (2022-10-14T15:20:17Z)
- Differentiable physics-enabled closure modeling for Burgers' turbulence [0.0]
We discuss an approach using the differentiable physics paradigm that combines known physics with machine learning to develop closure models for turbulence problems.
We train a series of models that incorporate varying degrees of physical assumptions on an a posteriori loss function to test the efficacy of models.
We find that constraining models with inductive biases in the form of partial differential equations that contain known physics or existing closure approaches produces highly data-efficient, accurate, and generalizable models.
arXiv Detail & Related papers (2022-09-23T14:38:01Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Using scientific machine learning for experimental bifurcation analysis of dynamic systems [2.204918347869259]
This study focuses on training universal differential equation (UDE) models for physical nonlinear dynamical systems with limit cycles.
We consider examples where training data is generated by numerical simulations, whereas we also employ the proposed modelling concept to physical experiments.
We use both neural networks and Gaussian processes as universal approximators alongside the mechanistic models to give a critical assessment of the accuracy and robustness of the UDE modelling approach.
arXiv Detail & Related papers (2021-10-22T15:43:03Z)
- Approximate Latent Force Model Inference [1.3927943269211591]
Latent force models offer an interpretable alternative to purely data-driven tools for inference in dynamical systems.
We show that a neural operator approach can scale our model to thousands of instances, enabling fast, distributed computation.
arXiv Detail & Related papers (2021-09-24T09:55:00Z)
- Physics-informed regularization and structure preservation for learning stable reduced models from data with operator inference [0.0]
Operator inference learns low-dimensional dynamical-system models with nonlinear terms from trajectories of high-dimensional physical systems.
A regularizer for operator inference that induces a stability bias onto quadratic models is proposed.
A formulation of operator inference is proposed that enforces model constraints for preserving structure.
arXiv Detail & Related papers (2021-07-06T13:15:54Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.