Symplectic model reduction of Hamiltonian systems using data-driven
quadratic manifolds
- URL: http://arxiv.org/abs/2305.15490v2
- Date: Thu, 24 Aug 2023 15:08:50 GMT
- Title: Symplectic model reduction of Hamiltonian systems using data-driven
quadratic manifolds
- Authors: Harsh Sharma, Hongliang Mu, Patrick Buchfink, Rudy Geelen, Silke Glas,
Boris Kramer
- Abstract summary: We present two novel approaches for the symplectic model reduction of high-dimensional Hamiltonian systems.
The addition of quadratic terms to the state approximation, which sits at the heart of the proposed methodologies, enables us to better represent intrinsic low-dimensionality.
- Score: 0.559239450391449
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work presents two novel approaches for the symplectic model reduction of
high-dimensional Hamiltonian systems using data-driven quadratic manifolds.
Classical symplectic model reduction approaches employ linear symplectic
subspaces for representing the high-dimensional system states in a
reduced-dimensional coordinate system. While these approximations respect the
symplectic nature of Hamiltonian systems, linear basis approximations can
suffer from slowly decaying Kolmogorov $N$-width, especially in wave-type
problems, which then requires a large basis size. We propose two different
model reduction methods based on recently developed quadratic manifolds, each
presenting its own advantages and limitations. The addition of quadratic terms
to the state approximation, which sits at the heart of the proposed
methodologies, enables us to better represent intrinsic low-dimensionality in
the problem at hand. Both approaches are effective for issuing predictions in
settings well outside the range of their training data while providing more
accurate solutions than the linear symplectic reduced-order models.
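To make the state approximation concrete, below is a minimal NumPy sketch of a data-driven quadratic manifold: a POD basis V supplies the linear part and a matrix W, fitted by least squares on the snapshot residual, supplies the quadratic correction. The snapshot data and fitting procedure are illustrative stand-ins; in particular, the sketch omits the symplectic structure that the paper's methods enforce.

```python
# Illustrative sketch of a quadratic-manifold approximation x ~ V q + W (q kron q).
# Synthetic data and plain least squares are assumptions, not the authors' algorithm.
import numpy as np

def kron_cols(Q):
    """Column-wise Kronecker products q kron q for each column q of Q."""
    r, m = Q.shape
    return np.einsum("ik,jk->ijk", Q, Q).reshape(r * r, m)

rng = np.random.default_rng(0)
n, m, r = 200, 50, 6
X = rng.standard_normal((n, m))        # stand-in snapshot matrix (n states x m snapshots)

# Linear part: POD basis V from the leading left singular vectors.
U, _, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]

Q = V.T @ X                            # reduced coordinates, r x m
R = X - V @ Q                          # residual the quadratic term must explain

# Quadratic part: solve min_W ||R - W (Q kron Q)||_F by least squares.
K = kron_cols(Q)                       # r^2 x m
W = R @ np.linalg.pinv(K)              # n x r^2

def decode(q):
    """Quadratic-manifold reconstruction x ~ V q + W (q kron q)."""
    return V @ q + W @ np.kron(q, q)

# Compare linear vs. quadratic reconstruction of one snapshot.
x = X[:, 0]
q = V.T @ x
print(np.linalg.norm(x - V @ q), np.linalg.norm(x - decode(q)))
```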
Related papers
- Pushing the Limits of Large Language Model Quantization via the Linearity Theorem [71.3332971315821]
We present a "line theoremarity" establishing a direct relationship between the layer-wise $ell$ reconstruction error and the model perplexity increase due to quantization.
This insight enables two novel applications: (1) a simple data-free LLM quantization method using Hadamard rotations and MSE-optimal grids, dubbed HIGGS, and (2) an optimal solution to the problem of finding non-uniform per-layer quantization levels.
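The two ingredients named in the summary can be illustrated with a hedged sketch: a Hadamard rotation that makes the weight distribution more Gaussian, followed by nearest-neighbor quantization on a fixed grid. The grid below is a stand-in, not the paper's MSE-optimal construction, and HIGGS itself is not reproduced here.

```python
# Sketch: Hadamard rotation + grid quantization. Block size and grid are assumptions.
import numpy as np
from scipy.linalg import hadamard

d = 64                                    # block size (power of two)
H = hadamard(d) / np.sqrt(d)              # orthonormal Hadamard matrix

rng = np.random.default_rng(0)
w = rng.laplace(size=d)                   # a heavy-tailed weight block

w_rot = H @ w                             # rotated weights look more Gaussian

# Stand-in grid scaled to the block; a real method would optimize it for MSE.
levels = np.array([-1.5, -0.5, 0.5, 1.5]) * w_rot.std()
idx = np.abs(w_rot[:, None] - levels[None, :]).argmin(axis=1)
w_q = levels[idx]                         # nearest grid point per entry

w_hat = H.T @ w_q                         # undo the rotation (H is orthonormal)
print("quantization MSE:", np.mean((w - w_hat) ** 2))
```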
arXiv Detail & Related papers (2024-11-26T15:35:44Z) - FEM-based Neural Networks for Solving Incompressible Fluid Flows and Related Inverse Problems [41.94295877935867]
The numerical simulation and optimization of technical systems described by partial differential equations are expensive.
A comparatively new approach in this context is to combine the good approximation properties of neural networks with the classical finite element method.
In this paper, we extend this approach to saddle-point problems and non-linear fluid dynamics problems.
arXiv Detail & Related papers (2024-09-06T07:17:01Z) - Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
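For intuition, here is a minimal sketch of the deep-ensembling baseline mentioned above: several independently initialized surrogates are trained, and predictive uncertainty is read off the ensemble spread. The model and data are illustrative assumptions, and the spread captures only the surrogate (epistemic) part of the total uncertainty the paper quantifies.

```python
# Sketch of a deep ensemble for surrogate uncertainty; setup is illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * x[:, 0]) + 0.1 * rng.standard_normal(200)    # noisy training data

# Train several surrogates that differ only in their random initialization.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                 random_state=seed).fit(x, y)
    for seed in range(5)
]

x_test = np.linspace(-1.5, 1.5, 7).reshape(-1, 1)           # includes extrapolation
preds = np.stack([m.predict(x_test) for m in ensemble])     # 5 x 7 predictions
mean, std = preds.mean(axis=0), preds.std(axis=0)           # ensemble-based UQ
print(np.c_[x_test.ravel(), mean, std])
```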
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - The Convex Landscape of Neural Networks: Characterizing Global Optima
and Stationary Points via Lasso Models [75.33431791218302]
Deep Neural Network (DNN) models are used in a wide range of applications.
In this paper, we examine convex reformulations of neural network training.
We show that all stationary points of the non-convex training objective can be characterized as global optima of subsampled convex (Lasso) programs.
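The reformulation idea can be illustrated with a small, heavily simplified sketch: subsample ReLU activation patterns, build the corresponding masked design matrix, and solve a sparse convex (Lasso) program over it. The sampling scheme and the plain Lasso penalty are assumptions for illustration, not the paper's exact program.

```python
# Sketch: subsampled convex (Lasso) surrogate for ReLU network training.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, p = 100, 5, 40
X = rng.standard_normal((n, d))
y = np.maximum(X @ rng.standard_normal(d), 0.0)   # target from a single ReLU

# Subsample activation patterns D_i = diag(1[X g_i >= 0]) from random directions g_i.
G = rng.standard_normal((d, p))
masks = (X @ G >= 0).astype(float)                # n x p patterns

# Convex feature map: each candidate neuron contributes (D_i X) w_i;
# stack all masked copies of X into one wide design matrix.
Phi = np.concatenate([masks[:, [i]] * X for i in range(p)], axis=1)

lasso = Lasso(alpha=1e-3, max_iter=50_000).fit(Phi, y)
print("active neurons:",
      np.count_nonzero(lasso.coef_.reshape(p, d).any(axis=1)))
```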
arXiv Detail & Related papers (2023-12-19T23:04:56Z) - Non-linear manifold ROM with Convolutional Autoencoders and Reduced
Over-Collocation method [0.0]
Non-affine parametric dependencies, nonlinearities and advection-dominated regimes of the model of interest can result in a slow Kolmogorov n-width decay.
We implement the non-linear manifold method introduced by Carlberg et al. [37], with hyper-reduction achieved through reduced over-collocation and teacher-student training of a reduced decoder.
We test the methodology on a 2D non-linear conservation law and a 2D shallow water model, and compare the results with a purely data-driven method in which the dynamics are evolved in time with a long short-term memory network.
arXiv Detail & Related papers (2022-03-01T11:16:50Z) - Nonlinear proper orthogonal decomposition for convection-dominated flows [0.0]
We propose an end-to-end Galerkin-free model that combines autoencoders with long short-term memory networks to model the dynamics.
Our approach not only improves the accuracy, but also significantly reduces the computational cost of training and testing.
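A minimal PyTorch sketch of this Galerkin-free recipe is given below: an autoencoder compresses snapshots, and an LSTM advances the latent state. Network sizes, the synthetic data, and the joint loss are illustrative assumptions.

```python
# Sketch: autoencoder compression + LSTM latent dynamics; setup is illustrative.
import torch
import torch.nn as nn

n, r, T = 128, 4, 200
snapshots = torch.randn(T, n)                   # stand-in trajectory data

enc = nn.Sequential(nn.Linear(n, 32), nn.Tanh(), nn.Linear(32, r))
dec = nn.Sequential(nn.Linear(r, 32), nn.Tanh(), nn.Linear(32, n))
lstm = nn.LSTM(input_size=r, hidden_size=r, batch_first=True)

opt = torch.optim.Adam([*enc.parameters(), *dec.parameters(),
                        *lstm.parameters()], lr=1e-3)
for _ in range(200):
    z = enc(snapshots)                          # T x r latent trajectory
    z_pred, _ = lstm(z[:-1].unsqueeze(0))       # predict the next latent state
    loss = (nn.functional.mse_loss(dec(z), snapshots)      # reconstruction
            + nn.functional.mse_loss(z_pred[0], z[1:]))    # latent dynamics
    opt.zero_grad(); loss.backward(); opt.step()

# Rollout: encode an initial condition, step the LSTM forward, decode each state.
```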
arXiv Detail & Related papers (2021-10-15T18:05:34Z) - Convolutional Autoencoders for Reduced-Order Modeling [0.0]
We create and train convolutional autoencoders that perform nonlinear dimension reduction for the wave and Kuramoto-Sivashinsky equations.
We present training methods independent of full-order model samples and use the manifold least-squares Petrov-Galerkin projection method to define a reduced-order model.
arXiv Detail & Related papers (2021-08-27T18:37:23Z) - Manifold learning-based polynomial chaos expansions for high-dimensional
surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ) in systems describing complex spatiotemporal processes.
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z) - Joint Dimensionality Reduction for Separable Embedding Estimation [43.22422640265388]
Low-dimensional embeddings for data from disparate sources play critical roles in machine learning, multimedia information retrieval, and bioinformatics.
We propose a supervised dimensionality reduction method that learns linear embeddings jointly for two feature vectors representing data of different modalities or data from distinct types of entities.
Our approach compares favorably against other dimensionality reduction methods, and against a state-of-the-art method of bilinear regression for predicting gene-disease associations.
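A hedged sketch of joint linear embedding learning: fit U and V by alternating least squares so that the inner product of the two embedded feature vectors matches observed scores. The dimensions, synthetic data, and ALS solver below are illustrative assumptions, not the paper's method.

```python
# Sketch: jointly learned linear embeddings via alternating least squares.
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r, m = 20, 30, 5, 500
A = rng.standard_normal((m, d1))        # modality-1 feature vectors
B = rng.standard_normal((m, d2))        # modality-2 feature vectors
U_true = rng.standard_normal((r, d1))
V_true = rng.standard_normal((r, d2))
y = np.einsum("mi,ki,kj,mj->m", A, U_true, V_true, B)   # scores (U a)^T (V b)

U = rng.standard_normal((r, d1))
V = rng.standard_normal((r, d2))
for _ in range(10):
    # Fix U and solve for V by least squares (the model is linear in V), then swap.
    P = A @ U.T
    Z = np.einsum("mk,mj->mkj", P, B).reshape(m, r * d2)
    V = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(r, d2)
    Q = B @ V.T
    Z = np.einsum("mk,mi->mki", Q, A).reshape(m, r * d1)
    U = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(r, d1)

pred = ((A @ U.T) * (B @ V.T)).sum(axis=1)
print("fit MSE:", np.mean((pred - y) ** 2))
```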
arXiv Detail & Related papers (2021-01-14T08:48:37Z) - Efficient nonlinear manifold reduced order model [0.19116784879310023]
The nonlinear manifold ROM (NM-ROM) can better approximate high-fidelity model solutions with a smaller latent-space dimension than linear subspace ROMs (LS-ROMs).
Results show that neural networks can learn a more efficient latent space representation on advection-dominated data from 2D Burgers' equations with a high Reynolds number.
arXiv Detail & Related papers (2020-11-13T18:46:21Z) - Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that enjoys the same good properties as linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems.
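A minimal sketch of this model class: a non-negative function parameterized as f(x) = phi(x)^T A phi(x) with A = C^T C positive semidefinite, so non-negativity holds by construction. The random Fourier features and plain gradient descent below are stand-ins for the paper's kernel formulation and dual solver.

```python
# Sketch: non-negative function f(x) = ||C phi(x)||^2; features/optimizer assumed.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=200)
y = np.maximum(np.sin(x), 0.0)                  # non-negative target function

D = 30                                          # number of random features
w, b = rng.standard_normal(D), rng.uniform(0, 2 * np.pi, D)
phi = np.sqrt(2.0 / D) * np.cos(np.outer(x, w) + b)   # 200 x D feature matrix

C = 0.1 * rng.standard_normal((D, D))           # A = C^T C is PSD by construction
lr = 0.05
for _ in range(2000):
    G = phi @ C.T                               # rows are C phi(x)
    f = (G ** 2).sum(axis=1)                    # f(x) = ||C phi(x)||^2 >= 0 always
    resid = f - y
    C -= lr * 4 * (G * resid[:, None]).T @ phi / len(x)   # gradient step on C

print("train MSE:", np.mean(resid ** 2), "min f:", f.min())
```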
arXiv Detail & Related papers (2020-07-08T07:17:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.