Convolutional Autoencoders, Clustering and POD for Low-dimensional
Parametrization of Navier-Stokes Equations
- URL: http://arxiv.org/abs/2302.01278v2
- Date: Fri, 3 Feb 2023 13:54:04 GMT
- Title: Convolutional Autoencoders, Clustering and POD for Low-dimensional
Parametrization of Navier-Stokes Equations
- Authors: Yongho Kim, Jan Heiland
- Abstract summary: We propose a convolutional autoencoder (CAE) consisting of a nonlinear encoder and an affine linear decoder.
The proposed set of methods is compared to the standard POD approach in two cylinder-wake scenarios modeled by the incompressible Navier-Stokes equations.
- Score: 1.160208922584163
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simulations of large-scale dynamical systems require expensive computations.
Low-dimensional parametrization of high-dimensional states such as Proper
Orthogonal Decomposition (POD) can be a solution to lessen the burdens by
providing a certain compromise between accuracy and model complexity. However,
for really low-dimensional parametrizations (for example for controller design)
linear methods like the POD come to their natural limits so that nonlinear
approaches will be the methods of choice. In this work we propose a
convolutional autoencoder (CAE) consisting of a nonlinear encoder and an affine
linear decoder and consider combinations with k-means clustering for improved
encoding performance. The proposed set of methods is compared to the standard
POD approach in two cylinder-wake scenarios modeled by the incompressible
Navier-Stokes equations.
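As a rough illustration of the POD baseline the paper compares against, a low-dimensional parametrization can be computed from snapshot data via an SVD. The following NumPy sketch is illustrative only; the dimensions and variable names are made up and not taken from the paper:

```python
import numpy as np

# Snapshot matrix: each column is one (flattened) high-dimensional state.
rng = np.random.default_rng(0)
n, m, r = 200, 50, 5              # state dim, number of snapshots, POD dim
X = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))  # rank-r data

# POD modes = leading left singular vectors of the snapshot matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                      # POD basis

rho = V.T @ X                     # low-dimensional parametrization (encode)
X_hat = V @ rho                   # linear reconstruction (decode)

err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(err)                        # near machine precision: the data has rank r
```

A CAE replaces the linear encoder `V.T @ X` with a nonlinear convolutional network while, per the abstract, keeping the decoder affine linear.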
Related papers
- Joint Transmit and Pinching Beamforming for PASS: Optimization-Based or Learning-Based? [89.05848771674773]
A novel pinching-antenna system (PASS)-enabled downlink multi-user multiple-input single-output (MISO) framework is proposed.
It consists of multiple waveguides, which are equipped with numerous low-cost antennas, named pinching antennas (PAs).
The positions of the PAs can be reconfigured, spanning both the large-scale path and the space.
arXiv Detail & Related papers (2025-02-12T18:54:10Z)
- Go With the Flow: Fast Diffusion for Gaussian Mixture Models [13.03355083378673]
Schrödinger Bridges (SB) are diffusion processes that steer, in finite time, a given initial distribution to another final one while minimizing a suitable cost functional.
We propose a latent parametrization of a set of SB policies for steering a system from one distribution to another.
We showcase the potential of this approach in low-dimensional problems such as image-to-image translation in the latent space of an autoencoder.
arXiv Detail & Related papers (2024-12-12T08:40:22Z)
- Deep polytopic autoencoders for low-dimensional linear parameter-varying approximations and nonlinear feedback design [0.9187159782788578]
Polytopic autoencoders provide low-dimensional parametrizations of states in a polytope.
For nonlinear PDEs, this is readily applied to low-dimensional linear parameter-varying (LPV) approximations.
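The idea of a polytopic parametrization can be sketched as a convex combination of polytope vertices, which then induces a linear parameter-varying (LPV) model. A toy NumPy illustration in which all matrices and weights are hypothetical:

```python
import numpy as np

n, k = 4, 3                        # state dim, number of polytope vertices
rng = np.random.default_rng(1)
V = rng.standard_normal((n, k))    # hypothetical vertex matrix (columns = vertices)

# Simplex weights: nonnegative and summing to one.
rho = np.array([0.2, 0.5, 0.3])
assert rho.min() >= 0 and np.isclose(rho.sum(), 1.0)

x = V @ rho                        # state as a convex combination of vertices

# An LPV approximation then blends local system matrices with the same weights:
# A(rho) = sum_i rho_i * A_i
A = [rng.standard_normal((n, n)) for _ in range(k)]
A_rho = sum(w * Ai for w, Ai in zip(rho, A))
print(A_rho.shape)
```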
arXiv Detail & Related papers (2024-03-26T18:57:56Z)
- Constrained Optimization via Exact Augmented Lagrangian and Randomized
Iterative Sketching [55.28394191394675]
We develop an adaptive inexact Newton method for equality-constrained nonlinear, nonconvex optimization problems.
We demonstrate the superior performance of our method on benchmark nonlinear problems, constrained logistic regression with data from LIBSVM, and a PDE-constrained problem.
arXiv Detail & Related papers (2023-05-28T06:33:37Z)
- A graph convolutional autoencoder approach to model order reduction for
parametrized PDEs [0.8192907805418583]
The present work proposes a framework for nonlinear model order reduction based on a Graph Convolutional Autoencoder (GCA-ROM).
We develop a non-intrusive and data-driven nonlinear reduction approach, exploiting GNNs to encode the reduced manifold and enable fast evaluations of parametrized PDEs.
arXiv Detail & Related papers (2023-05-15T12:01:22Z)
- Non-linear Independent Dual System (NIDS) for Discretization-independent
Surrogate Modeling over Complex Geometries [0.0]
Non-linear independent dual system (NIDS) is a deep learning surrogate model for discretization-independent, continuous representation of PDE solutions.
NIDS can be used for prediction over domains with complex, variable geometries and mesh topologies.
Test cases include a vehicle problem with complex geometry and data scarcity, enabled by a training method.
arXiv Detail & Related papers (2021-09-14T23:38:41Z)
- Non-intrusive surrogate modeling for parametrized time-dependent PDEs
using convolutional autoencoders [0.0]
We present a non-intrusive surrogate modeling scheme based on machine learning for predictive modeling of complex systems described by parametrized time-dependent PDEs.
We use a convolutional autoencoder in conjunction with a feed-forward neural network to establish a low-cost and accurate mapping from the problem's parametric space to its solution space.
arXiv Detail & Related papers (2021-01-14T11:34:58Z)
- StarNet: Gradient-free Training of Deep Generative Models using
Determined System of Linear Equations [47.72653430712088]
We present an approach for training deep generative models based on solving determined systems of linear equations.
A network that uses this approach, called a StarNet, has the following desirable properties.
arXiv Detail & Related papers (2021-01-03T08:06:42Z)
- Pushing the Envelope of Rotation Averaging for Visual SLAM [69.7375052440794]
We propose a novel optimization backbone for visual SLAM systems.
We leverage rotation averaging to improve the accuracy, efficiency and robustness of conventional monocular SLAM systems.
Our approach can be up to 10x faster with comparable accuracy against the state of the art on public benchmarks.
arXiv Detail & Related papers (2020-11-02T18:02:26Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized
Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
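The sketch-and-solve flavor of such methods can be illustrated with a plain Gaussian embedding. This NumPy snippet is a generic illustration with made-up dimensions, not the paper's adaptive algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, s, lam = 2000, 50, 300, 0.1    # rows, cols, sketch size, L2 penalty

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian embedding: S @ A has far fewer rows but roughly preserves geometry.
S = rng.standard_normal((s, n)) / np.sqrt(s)
SA, Sb = S @ A, S @ b

# Sketch-and-solve for min ||Ax - b||^2 + lam*||x||^2 on the sketched data.
x_sk = np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
x_exact = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

rel = np.linalg.norm(x_sk - x_exact) / np.linalg.norm(x_exact)
print(rel)                            # small: the sketch approximates the full solve
```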
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- On the Encoder-Decoder Incompatibility in Variational Text Modeling and
Beyond [82.18770740564642]
Variational autoencoders (VAEs) combine latent variables with amortized variational inference.
We observe the encoder-decoder incompatibility that leads to poor parameterizations of the data manifold.
We propose Coupled-VAE, which couples a VAE model with a deterministic autoencoder with the same structure.
arXiv Detail & Related papers (2020-04-20T10:34:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.