An autoencoder-based reduced-order model for eigenvalue problems with
application to neutron diffusion
- URL: http://arxiv.org/abs/2008.10532v1
- Date: Sat, 15 Aug 2020 16:52:26 GMT
- Title: An autoencoder-based reduced-order model for eigenvalue problems with
application to neutron diffusion
- Authors: Toby Phillips, Claire E. Heaney, Paul N. Smith, Christopher C. Pain
- Abstract summary: Using an autoencoder for dimensionality reduction, this paper presents a novel projection-based reduced-order model for eigenvalue problems.
Reduced-order modelling relies on finding suitable basis functions which define a low-dimensional space in which a high-dimensional system is approximated.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Using an autoencoder for dimensionality reduction, this paper presents a
novel projection-based reduced-order model for eigenvalue problems.
Reduced-order modelling relies on finding suitable basis functions which define
a low-dimensional space in which a high-dimensional system is approximated.
Proper orthogonal decomposition (POD) and singular value decomposition (SVD)
are often used for this purpose and yield an optimal linear subspace.
Autoencoders provide a nonlinear alternative to POD/SVD that may capture features
or patterns in the high-fidelity model results more efficiently.
Reduced-order models based on an autoencoder and a novel hybrid
SVD-autoencoder are developed. These methods are compared with the standard
POD-Galerkin approach and are applied to two test cases taken from the field of
nuclear reactor physics.
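As a minimal illustration of the POD/SVD baseline the paper compares against (hypothetical snapshot data, not taken from the paper), the following sketch builds a POD basis from the left singular vectors of a snapshot matrix and projects the snapshots onto the resulting low-dimensional subspace.

```python
import numpy as np

# Hypothetical snapshot matrix: each column is one high-fidelity solution.
# Built with rank 5 so a rank-5 basis reconstructs it (nearly) exactly.
rng = np.random.default_rng(0)
n_dof, n_snap = 200, 30
snapshots = rng.standard_normal((n_dof, 5)) @ rng.standard_normal((5, n_snap))

# POD basis via SVD: the leading left singular vectors span the
# optimal (in the least-squares sense) linear subspace of rank r.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5
basis = U[:, :r]

# Project to reduced coordinates, then reconstruct in the full space.
coords = basis.T @ snapshots            # shape (r, n_snap)
reconstruction = basis @ coords         # shape (n_dof, n_snap)
error = np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots)
```

An autoencoder-based reduced-order model replaces the linear map `basis.T @ snapshots` with a trained nonlinear encoder, and the reconstruction with the corresponding decoder.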
Related papers
- Bridging Autoencoders and Dynamic Mode Decomposition for Reduced-order Modeling and Control of PDEs [12.204795159651589]
This paper explores a deep autoencoding learning method for reduced-order modeling and control of dynamical systems governed by PDEs.
We first show that an objective for learning a linear autoencoding reduced-order model can be formulated to yield a solution closely resembling the result obtained through the dynamic mode decomposition with control algorithm.
We then extend this linear autoencoding architecture to a deep autoencoding framework, enabling the development of a nonlinear reduced-order model.
arXiv Detail & Related papers (2024-09-09T22:56:40Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Polytopic Autoencoders with Smooth Clustering for Reduced-order
Modelling of Flows [1.1279808969568252]
We propose a polytopic autoencoder architecture that includes a lightweight nonlinear encoder, a convex combination decoder, and a smooth clustering network.
We conduct simulations involving two flow scenarios with the incompressible Navier-Stokes equation.
Numerical results demonstrate the guaranteed properties of the model, low reconstruction errors compared to POD, and the improvement in error using a clustering network.
arXiv Detail & Related papers (2024-01-19T10:52:57Z) - Complexity Matters: Rethinking the Latent Space for Generative Modeling [65.64763873078114]
In generative modeling, numerous successful approaches leverage a low-dimensional latent space, e.g., Stable Diffusion.
In this study, we aim to shed light on this under-explored topic by rethinking the latent space from the perspective of model complexity.
arXiv Detail & Related papers (2023-07-17T07:12:29Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Convolutional Autoencoders, Clustering and POD for Low-dimensional
Parametrization of Navier-Stokes Equations [1.160208922584163]
We propose a convolutional autoencoder (CAE) consisting of a nonlinear encoder and an affine linear decoder.
The proposed set of methods is compared to the standard POD approach in two cylinder-wake scenarios modeled by the incompressible Navier-Stokes equations.
arXiv Detail & Related papers (2023-02-02T18:12:08Z) - Numerical Optimizations for Weighted Low-rank Estimation on Language
Model [73.12941276331316]
Singular value decomposition (SVD) is one of the most popular compression methods that approximates a target matrix with smaller matrices.
Standard SVD treats the parameters within the matrix with equal importance, which is a simple but unrealistic assumption.
We show that our method can perform better than current SOTA methods in neural-based language models.
arXiv Detail & Related papers (2022-11-02T00:58:02Z) - Self-Supervised Training with Autoencoders for Visual Anomaly Detection [61.62861063776813]
We focus on a specific use case in anomaly detection where the distribution of normal samples is supported by a lower-dimensional manifold.
We adapt a self-supervised learning regime that exploits discriminative information during training but focuses on the submanifold of normal examples.
We achieve a new state-of-the-art result on the MVTec AD dataset -- a challenging benchmark for visual anomaly detection in the manufacturing domain.
arXiv Detail & Related papers (2022-06-23T14:16:30Z) - Convolutional Autoencoders for Reduced-Order Modeling [0.0]
We create and train convolutional autoencoders that perform nonlinear dimension reduction for the wave and Kuramoto-Sivashinsky equations.
We present training methods independent of full-order model samples and use the manifold least-squares Petrov-Galerkin projection method to define a reduced-order model.
arXiv Detail & Related papers (2021-08-27T18:37:23Z) - Non-intrusive surrogate modeling for parametrized time-dependent PDEs
using convolutional autoencoders [0.0]
We present a non-intrusive surrogate modeling scheme based on machine learning for predictive modeling of complex systems described by parametrized time-dependent PDEs.
We use a convolutional autoencoder in conjunction with a feed-forward neural network to establish a low-cost and accurate mapping from the problem's parametric space to its solution space.
arXiv Detail & Related papers (2021-01-14T11:34:58Z) - Unsupervised Anomaly Detection with Adversarial Mirrored AutoEncoders [51.691585766702744]
We propose a variant of Adversarial Autoencoder which uses a mirrored Wasserstein loss in the discriminator to enforce better semantic-level reconstruction.
We put forward an alternative measure of anomaly score to replace the reconstruction-based metric.
Our method outperforms the current state-of-the-art methods for anomaly detection on several OOD detection benchmarks.
arXiv Detail & Related papers (2020-03-24T08:26:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.