Reduced order modeling of parametrized systems through autoencoders and
SINDy approach: continuation of periodic solutions
- URL: http://arxiv.org/abs/2211.06786v2
- Date: Mon, 17 Apr 2023 21:58:59 GMT
- Title: Reduced order modeling of parametrized systems through autoencoders and
SINDy approach: continuation of periodic solutions
- Authors: Paolo Conti, Giorgio Gobat, Stefania Fresca, Andrea Manzoni, Attilio
Frangi
- Abstract summary: This work presents a data-driven, non-intrusive framework which combines ROM construction with reduced dynamics identification.
The proposed approach leverages autoencoder neural networks with parametric sparse identification of nonlinear dynamics (SINDy) to construct a low-dimensional dynamical model.
The identified model can be fed to continuation algorithms, which track the evolution of periodic steady-state responses as functions of system parameters, avoiding the computation of the transient phase and enabling the detection of instabilities and bifurcations.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Highly accurate simulations of complex phenomena governed by partial
differential equations (PDEs) typically require intrusive methods and entail
expensive computational costs, which might become prohibitive when
approximating steady-state solutions of PDEs for multiple combinations of
control parameters and initial conditions. Therefore, constructing efficient
reduced order models (ROMs) that enable accurate but fast predictions, while
retaining the dynamical characteristics of the physical phenomenon as
parameters vary, is of paramount importance. In this work, a data-driven,
non-intrusive framework which combines ROM construction with reduced dynamics
identification, is presented. Starting from a limited amount of full order
solutions, the proposed approach leverages autoencoder neural networks with
parametric sparse identification of nonlinear dynamics (SINDy) to construct a
low-dimensional dynamical model. This model can be queried to efficiently
compute full-time solutions at new parameter instances, as well as directly fed
to continuation algorithms. These aim at tracking the evolution of periodic
steady-state responses as functions of system parameters, avoiding the
computation of the transient phase, and enabling the detection of instabilities
and bifurcations. Featuring an explicit and parametrized modeling of the reduced
dynamics, the proposed data-driven framework presents remarkable capabilities
to generalize with respect to both time and parameters. Applications to
structural mechanics and fluid dynamics problems illustrate the effectiveness
and accuracy of the proposed method.
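As a reading aid, here is a minimal, self-contained sketch of the kind of pipeline the abstract describes: compress snapshots into a low-dimensional latent space, identify a sparse parametric model of the latent dynamics, then query it at new parameter values. A truncated SVD stands in for the trained autoencoder and synthetic oscillatory snapshots replace full-order PDE solutions; the candidate library, threshold, and sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
P = rng.standard_normal((2, 50))              # lifts a 2-D latent "truth" to 50 DOFs
mus = [0.6, 0.8, 1.0]                         # training values of the control parameter

# Synthetic full-order training snapshots (one array of shape (n_steps, n_dofs) per mu)
U_list = [np.stack([np.cos(m * t), np.sin(m * t)], axis=1) @ P for m in mus]

# "Encoder": truncated SVD of the pooled snapshots, standing in for the autoencoder
U_all = np.concatenate(U_list)
Ubar = U_all.mean(axis=0)
V = np.linalg.svd(U_all - Ubar, full_matrices=False)[2][:2].T   # latent dimension r = 2
encode = lambda u: (u - Ubar) @ V
decode = lambda z: z @ V.T + Ubar

# Parametric SINDy: candidate library Theta(z, mu) and sparse regression for dz/dt
def theta(z, m):
    z1, z2 = z[..., 0], z[..., 1]
    one = np.ones_like(z1)
    return np.stack([one, z1, z2, m * one, m * z1, m * z2, z1 * z2], axis=-1)

Theta = np.concatenate([theta(encode(U), m) for U, m in zip(U_list, mus)])
dZ = np.concatenate([np.gradient(encode(U), t, axis=0) for U in U_list])
Xi = np.linalg.lstsq(Theta, dZ, rcond=None)[0]
for _ in range(10):                           # sequentially thresholded least squares
    Xi[np.abs(Xi) < 0.05] = 0.0
    for j in range(Xi.shape[1]):
        keep = np.abs(Xi[:, j]) >= 0.05
        if keep.any():
            Xi[keep, j] = np.linalg.lstsq(Theta[:, keep], dZ[:, j], rcond=None)[0]

# Query the identified latent model at an unseen parameter value and decode
rom_rhs = lambda _t, z, m: theta(z, m) @ Xi
mu_new = 0.9
z0 = encode(np.stack([np.cos(mu_new * t[:1]), np.sin(mu_new * t[:1])], axis=1) @ P)[0]
sol = solve_ivp(rom_rhs, (t[0], t[-1]), z0, t_eval=t, args=(mu_new,))
U_pred = decode(sol.y.T)                      # approximate full-order response at mu_new
```

Because the latent right-hand side is explicit in both the state and the parameter, the same function can also be handed to a continuation package to track periodic orbits and bifurcations directly as mu varies, which is the use case the abstract emphasizes.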
Related papers
- ADAM-SINDy: An Efficient Optimization Framework for Parameterized Nonlinear Dynamical System Identification [0.0]
This paper introduces a novel method within the SINDy framework, termed ADAM-SINDy.
ADAM-SINDy synthesizes the strengths of established approaches by optimizing the sparse coefficients and the nonlinear parameters of the candidate library simultaneously with the ADAM optimization algorithm.
Results demonstrate significant improvements in identifying parameterized dynamical systems.
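A hedged sketch of the underlying idea (not the paper's code): instead of fixing every candidate function and solving a linear sparse regression, the coefficients and a nonlinear parameter inside the library (here a frequency) are optimized jointly with Adam under an L1 sparsity penalty. The data, library, and hyperparameters below are illustrative assumptions.

```python
import torch

torch.manual_seed(0)
t = torch.linspace(0.0, 10.0, 2000)
x = torch.sin(1.5 * t)                              # observed state, so dx/dt = 1.5*cos(1.5*t)
dx = torch.gradient(x, spacing=float(t[1] - t[0]))[0]

xi = torch.full((3,), 0.1, requires_grad=True)      # coefficients of [1, x, cos(omega*t)]
omega = torch.tensor(1.0, requires_grad=True)       # nonlinear parameter inside the library

opt = torch.optim.Adam([xi, omega], lr=1e-2)
for _ in range(5000):
    library = torch.stack([torch.ones_like(t), x, torch.cos(omega * t)], dim=1)
    loss = (library @ xi - dx).pow(2).mean() + 1e-3 * xi.abs().sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Ideally xi approaches [0, 0, ~1.5] and omega approaches ~1.5; as with any
# nonconvex fit, the outcome depends on initialization and hyperparameters.
print(xi.detach(), omega.detach())
```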
arXiv Detail & Related papers (2024-10-21T21:36:17Z) - A parametric framework for kernel-based dynamic mode decomposition using deep learning [0.0]
The proposed framework consists of two stages, offline and online.
The online stage leverages the LANDO models built in the offline stage to generate new data at a desired time instant.
A dimensionality reduction technique is applied to high-dimensional dynamical systems to reduce the computational cost of training.
arXiv Detail & Related papers (2024-09-25T11:13:50Z) - On latent dynamics learning in nonlinear reduced order modeling [0.6249768559720122]
We present the novel mathematical framework of latent dynamics models (LDMs) for reduced order modeling of parameterized nonlinear time-dependent PDEs.
A time-continuous setting is employed to derive error and stability estimates for the LDM approximation of the full order model (FOM) solution.
Deep neural networks approximate the discrete LDM components, while providing a bounded approximation error with respect to the FOM.
arXiv Detail & Related papers (2024-08-27T16:35:06Z) - VENI, VINDy, VICI: a variational reduced-order modeling framework with uncertainty quantification [4.804365706049767]
We present a data-driven, non-intrusive framework for building reduced-order models (ROMs)
In detail, the method relies on a variational formulation of SINDy (VINDy) to identify the distribution of the reduced coordinates and of their dynamics.
Once trained offline, the identified model can be queried for new parameter instances and new initial conditions to compute the corresponding full-time solutions.
arXiv Detail & Related papers (2024-05-31T15:16:48Z) - Learning minimal representations of stochastic processes with
variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing stochastic processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
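To make the two-step idea concrete, here is a hedged, generic sketch (not the paper's Hamiltonian setting): a small network is trained once to mimic a forward model, then frozen, and the unknown parameters are recovered from data by gradient descent through the differentiable surrogate. The forward model, architecture, and parameter ranges are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
q = torch.linspace(0.0, 3.0, 64)                       # measurement grid

def forward_model(theta):                              # stand-in for the expensive simulation
    return torch.exp(-theta[0] * q) * torch.cos(theta[1] * q)

surrogate = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(2000):                                  # train the surrogate once on simulated data
    theta = torch.rand(2) * torch.tensor([1.0, 5.0])
    loss = (surrogate(theta) - forward_model(theta)).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Experimental" data from unseen parameters; recover them through the frozen surrogate
for p in surrogate.parameters():
    p.requires_grad_(False)
theta_true = torch.tensor([0.4, 3.2])
data = forward_model(theta_true) + 0.01 * torch.randn(q.shape)
theta_hat = torch.tensor([0.5, 2.5], requires_grad=True)
opt = torch.optim.Adam([theta_hat], lr=5e-2)
for _ in range(500):
    loss = (surrogate(theta_hat) - data).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
print(theta_hat.detach())   # should move toward theta_true if the surrogate is accurate enough
```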
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z) - A Causality-Based Learning Approach for Discovering the Underlying
Dynamics of Complex Systems from Partial Observations with Stochastic
Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Extension of Dynamic Mode Decomposition for dynamic systems with
incomplete information based on t-model of optimal prediction [69.81996031777717]
The Dynamic Mode Decomposition has proved to be a very efficient technique to study dynamic data.
The application of this approach becomes problematic if the available data is incomplete because some dimensions of smaller scale are either missing or unmeasured.
We consider a first-order approximation of the Mori-Zwanzig decomposition, state the corresponding optimization problem and solve it with the gradient-based optimization method.
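For context, a plain exact-DMD sketch follows (it does not include the Mori-Zwanzig t-model correction developed in the paper): a best-fit linear operator advancing one snapshot to the next is identified from data, and modes and continuous-time eigenvalues are read off its truncated eigendecomposition. The snapshot data and truncation rank are arbitrary illustrations.

```python
import numpy as np

n, m = 64, 200
t = np.linspace(0.0, 4.0 * np.pi, m)
x = np.linspace(-5.0, 5.0, n)
spatial = np.stack([np.exp(-(x - c) ** 2) for c in (-3.0, -1.0, 1.0, 3.0)], axis=1)
temporal = np.stack([np.cos(2.3 * t), np.sin(2.3 * t), np.cos(1.1 * t), np.sin(1.1 * t)])
X = spatial @ temporal                           # snapshot matrix, one column per time step

X1, X2 = X[:, :-1], X[:, 1:]
U, s, Vh = np.linalg.svd(X1, full_matrices=False)
r = 4                                            # truncation rank
Ur, sr, Vr = U[:, :r], s[:r], Vh[:r].conj().T
Atilde = Ur.conj().T @ X2 @ Vr / sr              # reduced one-step operator
eigvals, W = np.linalg.eig(Atilde)
Phi = X2 @ Vr / sr @ W                           # DMD modes
freqs = np.log(eigvals) / (t[1] - t[0])          # continuous-time eigenvalues (~ +/-2.3i, +/-1.1i)
```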
arXiv Detail & Related papers (2022-02-23T11:23:59Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear
Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
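A small illustration of the local-versus-global distinction (window sizes and smoothing factors are arbitrary choices, not the paper's settings): a Savitzky-Golay filter smooths each point from a neighboring window, while a smoothing spline is fit to the entire series; both are then differentiated, as one would before a sparse-identification regression.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 500)
x_clean = np.sin(t) + 0.5 * np.sin(3.0 * t)
x_noisy = x_clean + 0.05 * rng.standard_normal(t.size)

x_local = savgol_filter(x_noisy, window_length=31, polyorder=3)     # local: sliding window
spline = UnivariateSpline(t, x_noisy, s=t.size * 0.05 ** 2)         # global: whole series at once

dx_true = np.cos(t) + 1.5 * np.cos(3.0 * t)
dx_local = np.gradient(x_local, t)
dx_global = spline.derivative()(t)
for name, dx in [("local", dx_local), ("global", dx_global)]:
    print(name, "derivative RMSE:", np.sqrt(np.mean((dx - dx_true) ** 2)))
```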
arXiv Detail & Related papers (2022-01-29T23:31:25Z)