Latent-space time evolution of non-intrusive reduced-order models using
Gaussian process emulation
- URL: http://arxiv.org/abs/2007.12167v2
- Date: Thu, 15 Oct 2020 08:58:59 GMT
- Title: Latent-space time evolution of non-intrusive reduced-order models using
Gaussian process emulation
- Authors: Romit Maulik, Themistoklis Botsas, Nesar Ramachandra, Lachlan Robert
Mason and Indranil Pan
- Abstract summary: Non-intrusive reduced-order models (ROMs) provide a low-dimensional framework for systems that may be intrinsically high-dimensional.
In bypassing the utilization of an equation-based evolution, it is often seen that the interpretability of the ROM framework suffers.
In this article, we propose the use of a novel latent-space interpolation algorithm based on Gaussian process regression.
- Score: 0.6850683267295248
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-intrusive reduced-order models (ROMs) have recently generated
considerable interest for constructing computationally efficient counterparts
of nonlinear dynamical systems emerging from various domain sciences. They
provide a low-dimensional emulation framework for systems that may be
intrinsically high-dimensional. This is accomplished by utilizing a
construction algorithm that is purely data-driven. It is no surprise,
therefore, that the algorithmic advances of machine learning have led to
non-intrusive ROMs with greater accuracy and computational gains. However, in
bypassing the utilization of an equation-based evolution, it is often seen that
the interpretability of the ROM framework suffers. This becomes more
problematic when black-box deep learning methods are used which are notorious
for lacking robustness outside the physical regime of the observed data. In
this article, we propose the use of a novel latent-space interpolation
algorithm based on Gaussian process regression. Notably, this reduced-order
evolution of the system is parameterized by control parameters to allow for
interpolation in parameter space. This procedure also yields a continuous
interpretation of time, which enables temporal interpolation. The latter
aspect provides information, with quantified uncertainty, about full-state
evolution at a finer resolution than that utilized for training the ROMs. We
assess the viability of this algorithm for an advection-dominated system given
by the inviscid shallow water equations.
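To make the described evolution concrete, the following is a minimal sketch (not the authors' code) of GP-based latent-space time evolution for a non-intrusive ROM: snapshots of a synthetic one-dimensional traveling wave are compressed with POD via truncated SVD, a scikit-learn Gaussian process regressor maps time to the latent POD coefficients, and the emulator is queried at a finer temporal resolution than the training data, returning predictive standard deviations as the uncertainty estimate. The variable names, the RBF kernel, the synthetic data, and the time-only GP input are illustrative assumptions; the paper additionally conditions the evolution on control parameters.

```python
# Minimal illustrative sketch (assumptions noted above), not the authors' implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Synthetic stand-in for full-order snapshots: a traveling Gaussian pulse on a 1-D grid.
x = np.linspace(0.0, 1.0, 256)
t_train = np.linspace(0.0, 1.0, 40)
snapshots = np.stack(
    [np.exp(-100.0 * (x - 0.2 - 0.5 * t) ** 2) for t in t_train], axis=1
)  # shape (n_space, n_time)

# 1) Compress: POD (truncated SVD) defines the low-dimensional latent space.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 8                               # number of retained POD modes (assumed)
basis = U[:, :r]                    # (n_space, r)
latent_train = basis.T @ snapshots  # (r, n_time) latent coefficients

# 2) Emulate: fit a GP mapping time -> latent coefficients.
#    (The paper's GP input also includes control parameters; time only here.)
kernel = ConstantKernel(1.0) * RBF(length_scale=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_train[:, None], latent_train.T)

# 3) Query at a finer temporal resolution than the training snapshots,
#    with predictive standard deviations as quantified uncertainty.
t_fine = np.linspace(0.0, 1.0, 200)
latent_mean, latent_std = gp.predict(t_fine[:, None], return_std=True)

# 4) Decode the posterior mean back to the full state on the fine time grid.
reconstruction = basis @ latent_mean.T  # (n_space, 200)
print(reconstruction.shape, latent_std.shape)
```

Replacing the analytic snapshots with stored solver output and appending a control parameter to the GP input would bring this sketch closer to the setup described in the abstract.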
Related papers
- Self-STORM: Deep Unrolled Self-Supervised Learning for Super-Resolution Microscopy [55.2480439325792]
We introduce deep unrolled self-supervised learning, which alleviates the need for ground-truth training data by training a sequence-specific, model-based autoencoder.
Our proposed method exceeds the performance of its supervised counterparts.
arXiv Detail & Related papers (2024-03-25T17:40:32Z) - Smooth and Sparse Latent Dynamics in Operator Learning with Jerk
Regularization [1.621267003497711]
This paper introduces a continuous operator learning framework that incorporates jerk regularization into the learning of the compressed latent space.
The framework allows for inference at any desired spatial or temporal resolution.
The effectiveness of this framework is demonstrated through a two-dimensional unsteady flow problem governed by the Navier-Stokes equations.
arXiv Detail & Related papers (2024-02-23T22:38:45Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - A Causality-Based Learning Approach for Discovering the Underlying
Dynamics of Complex Systems from Partial Observations with Stochastic
Parameterization [1.2882319878552302]
This paper develops a new iterative learning algorithm for complex turbulent systems with partial observations.
It alternates between identifying model structures, recovering unobserved variables, and estimating parameters.
Numerical experiments show that the new algorithm succeeds in identifying the model structure and providing suitable parameterizations for many complex nonlinear systems.
arXiv Detail & Related papers (2022-08-19T00:35:03Z) - Generalised Latent Assimilation in Heterogeneous Reduced Spaces with
Machine Learning Surrogate Models [10.410970649045943]
We develop a system which combines reduced-order surrogate models with a novel data assimilation technique.
Generalised Latent Assimilation can benefit both the efficiency provided by the reduced-order modelling and the accuracy of data assimilation.
arXiv Detail & Related papers (2022-04-07T15:13:12Z) - Non-linear manifold ROM with Convolutional Autoencoders and Reduced
Over-Collocation method [0.0]
Non-affine parametric dependencies, nonlinearities and advection-dominated regimes of the model of interest can result in a slow Kolmogorov n-width decay.
We implement the non-linear manifold method introduced by Carlberg et al [37] with hyper-reduction achieved through reduced over-collocation and teacher-student training of a reduced decoder.
We test the methodology on a 2D non-linear conservation law and a 2D shallow water model, and compare the results obtained with a purely data-driven method for which the dynamics is evolved in time with a long short-term memory network.
arXiv Detail & Related papers (2022-03-01T11:16:50Z) - Deep-HyROMnet: A deep learning-based operator approximation for
hyper-reduction of nonlinear parametrized PDEs [0.0]
We propose a strategy for learning nonlinear ROM operators using deep neural networks (DNNs).
The resulting hyper-reduced order model enhanced by DNNs is referred to as Deep-HyROMnet.
Numerical results show that Deep-HyROMnets are orders of magnitude faster than POD-Galerkin-DEIM ROMs, keeping the same level of accuracy.
arXiv Detail & Related papers (2022-02-05T23:45:25Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear
Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z) - Convolutional generative adversarial imputation networks for
spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAIN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.