Convolutional Autoencoders, Clustering and POD for Low-dimensional
Parametrization of Navier-Stokes Equations
- URL: http://arxiv.org/abs/2302.01278v2
- Date: Fri, 3 Feb 2023 13:54:04 GMT
- Title: Convolutional Autoencoders, Clustering and POD for Low-dimensional
Parametrization of Navier-Stokes Equations
- Authors: Yongho Kim, Jan Heiland
- Abstract summary: We propose a convolutional autoencoder (CAE) consisting of a nonlinear encoder and an affine linear decoder.
The proposed set of methods is compared to the standard POD approach in two cylinder-wake scenarios modeled by the incompressible Navier-Stokes equations.
- Score: 1.160208922584163
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simulations of large-scale dynamical systems require expensive computations.
Low-dimensional parametrization of high-dimensional states such as Proper
Orthogonal Decomposition (POD) can be a solution to lessen the burdens by
providing a certain compromise between accuracy and model complexity. However,
for really low-dimensional parametrizations (for example for controller design)
linear methods like the POD come to their natural limits so that nonlinear
approaches will be the methods of choice. In this work we propose a
convolutional autoencoder (CAE) consisting of a nonlinear encoder and an affine
linear decoder and consider combinations with k-means clustering for improved
encoding performance. The proposed set of methods is compared to the standard
POD approach in two cylinder-wake scenarios modeled by the incompressible
Navier-Stokes equations.
Related papers
- Deep polytopic autoencoders for low-dimensional linear parameter-varying approximations and nonlinear feedback design [0.9187159782788578]
We develop a polytopic autoencoder for control applications.
We show how it outperforms standard linear approaches in view of LPV approximations of nonlinear systems.
arXiv Detail & Related papers (2024-03-26T18:57:56Z) - Improving Pseudo-Time Stepping Convergence for CFD Simulations With
Neural Networks [44.99833362998488]
The Navier-Stokes equations may exhibit highly nonlinear behavior.
The system of nonlinear equations resulting from the discretization of the Navier-Stokes equations can be solved using nonlinear iteration methods, such as Newton's method.
In this paper, pseudo-transient continuation is employed in order to improve nonlinear convergence.
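The Newton iteration mentioned above, in a minimal form (a generic 2x2 nonlinear system, not the CFD discretization from the paper):

```python
import numpy as np

# Plain Newton iteration x_{k+1} = x_k - J(x_k)^{-1} f(x_k) on a toy
# 2x2 nonlinear system. Pseudo-transient continuation would additionally
# add a (1/dt) * I term to the Jacobian to stabilize early iterations.
def f(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0,   # unit circle
                     x[0] - x[1]])              # diagonal

def jac(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.5])            # initial guess
for _ in range(20):
    x = x - np.linalg.solve(jac(x), f(x))
    if np.linalg.norm(f(x)) < 1e-12:
        break
```

The iterate converges quadratically to the intersection point (1/sqrt(2), 1/sqrt(2)) in a handful of steps.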
arXiv Detail & Related papers (2023-10-10T15:45:19Z) - Constrained Optimization via Exact Augmented Lagrangian and Randomized
Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z) - A graph convolutional autoencoder approach to model order reduction for
parametrized PDEs [0.8192907805418583]
The present work proposes a framework for nonlinear model order reduction based on a Graph Convolutional Autoencoder (GCA-ROM).
We develop a non-intrusive and data-driven nonlinear reduction approach, exploiting GNNs to encode the reduced manifold and enable fast evaluations of parametrized PDEs.
arXiv Detail & Related papers (2023-05-15T12:01:22Z) - Non-linear Independent Dual System (NIDS) for Discretization-independent
Surrogate Modeling over Complex Geometries [0.0]
Non-linear independent dual system (NIDS) is a deep learning surrogate model for discretization-independent, continuous representation of PDE solutions.
NIDS can be used for prediction over domains with complex, variable geometries and mesh topologies.
Test cases include a vehicle problem with complex geometry and data scarcity, enabled by a training method.
arXiv Detail & Related papers (2021-09-14T23:38:41Z) - Non-intrusive surrogate modeling for parametrized time-dependent PDEs
using convolutional autoencoders [0.0]
We present a non-intrusive surrogate modeling scheme based on machine learning for predictive modeling of complex systems described by parametrized, time-dependent PDEs.
We use a convolutional autoencoder in conjunction with a feedforward neural network to establish a low-cost and accurate mapping from the problem's parametric space to its solution space.
arXiv Detail & Related papers (2021-01-14T11:34:58Z) - StarNet: Gradient-free Training of Deep Generative Models using
Determined System of Linear Equations [47.72653430712088]
We present an approach for training deep generative models based on solving determined systems of linear equations.
A network that uses this approach, called a StarNet, has several desirable properties.
arXiv Detail & Related papers (2021-01-03T08:06:42Z) - Pushing the Envelope of Rotation Averaging for Visual SLAM [69.7375052440794]
We propose a novel optimization backbone for visual SLAM systems.
We leverage averaging to improve the accuracy, efficiency and robustness of conventional monocular SLAM systems.
Our approach can be up to 10x faster, with comparable accuracy, than the state of the art on public benchmarks.
arXiv Detail & Related papers (2020-11-02T18:02:26Z) - An autoencoder-based reduced-order model for eigenvalue problems with
application to neutron diffusion [0.0]
Using an autoencoder for dimensionality reduction, this paper presents a novel projection-based reduced-order model for eigenvalue problems.
Reduced-order modelling relies on finding suitable basis functions which define a low-dimensional space in which a high-dimensional system is approximated.
arXiv Detail & Related papers (2020-08-15T16:52:26Z) - Effective Dimension Adaptive Sketching Methods for Faster Regularized
Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT)
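A minimal illustration of the sketch-and-solve idea with a Gaussian embedding (the paper's adaptive algorithm and the SRHT are more involved; the names and problem sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, s = 2000, 20, 200          # rows, features, sketch size
lam = 0.1                        # L2 regularization strength

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def ridge(A, b, lam):
    """Solve the L2-regularized normal equations (A^T A + lam I) x = A^T b."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

# Gaussian embedding: S has i.i.d. N(0, 1/s) entries, so E[S^T S] = I.
S = rng.standard_normal((s, n)) / np.sqrt(s)

x_exact = ridge(A, b, lam)           # full n x d problem
x_sketch = ridge(S @ A, S @ b, lam)  # much smaller s x d problem
rel = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
```

The sketched problem has s rows instead of n, so it is much cheaper to solve, while the solution stays close to the exact one when s is a modest multiple of d.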
arXiv Detail & Related papers (2020-06-10T15:00:09Z) - On the Encoder-Decoder Incompatibility in Variational Text Modeling and
Beyond [82.18770740564642]
Variational autoencoders (VAEs) combine latent variables with amortized variational inference.
We observe the encoder-decoder incompatibility that leads to poor parameterizations of the data manifold.
We propose Coupled-VAE, which couples a VAE model with a deterministic autoencoder with the same structure.
arXiv Detail & Related papers (2020-04-20T10:34:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.