Variational autoencoders with latent high-dimensional steady geometric flows for dynamics
- URL: http://arxiv.org/abs/2410.10137v2
- Date: Sun, 20 Oct 2024 08:46:52 GMT
- Title: Variational autoencoders with latent high-dimensional steady geometric flows for dynamics
- Authors: Andrew Gracyk
- Abstract summary: We develop approaches to variational autoencoders (VAEs) for PDE-type ambient data with regularizing geometric latent dynamics.
We redevelop the VAE framework such that manifold geometries, subject to our geometric flow, are learned in the intermediary latent space developed by encoders and decoders.
We demonstrate, on our datasets of interest, that our methods perform at least as well as the traditional VAE, and often better.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop Riemannian approaches to variational autoencoders (VAEs) for PDE-type ambient data with regularizing geometric latent dynamics, which we refer to as VAE-DLM, or VAEs with dynamical latent manifolds. We redevelop the VAE framework so that manifold geometries, subject to our geometric flow and embedded in Euclidean space, are learned in the intermediary latent space developed by the encoders and decoders. By tailoring the geometric flow under which the latent space evolves, we induce latent geometric properties of our choosing, which are reflected in empirical performance. We reformulate the traditional evidence lower bound (ELBO) loss with a considerate choice of prior. We develop a linear geometric flow with a steady-state regularizing term. This flow requires automatic differentiation of only one time derivative and can be solved in moderately high dimensions in a physics-informed approach, allowing more expressive latent representations. We discuss how this flow can be formulated as a gradient flow and how it maintains entropy away from metric singularity. This, along with an eigenvalue penalization condition, helps ensure the manifold is sufficiently large in measure, nondegenerate, and of canonical geometry, all of which contribute to a robust representation. Our methods focus on a modified multi-layer perceptron architecture with tanh activations for the manifold encoder-decoder. We demonstrate, on our datasets of interest, that our methods perform at least as well as the traditional VAE, and often better. Our methods can outperform both the traditional VAE and a VAE endowed with our proposed architecture, with up to a 25% reduction in out-of-distribution (OOD) error, and potentially greater. We highlight our method on ambient PDEs whose solutions maintain minimal variation at late times. We provide empirical justification for how VAEs can improve robust learning of external dynamics.
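The abstract combines a standard ELBO with a steady-state penalty on the latent dynamics. The following is a minimal numpy sketch of what such a combined loss could look like; the function names, the finite-difference surrogate for the time derivative, and the weighting `lam` are illustrative assumptions, not the paper's actual implementation (which uses automatic differentiation in a physics-informed setting).

```python
import numpy as np

def kl_std_normal(mu, logvar):
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior,
    # the KL term of a standard ELBO.
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def steady_state_penalty(z_traj, dt):
    # Finite-difference surrogate for ||dz/dt||^2 along a latent trajectory
    # z_traj of shape (T, d). Penalizing latent velocity drives the flow
    # toward a steady state, as the steady-state regularizing term intends.
    dz = np.diff(z_traj, axis=0) / dt
    return np.mean(np.sum(dz**2, axis=1))

def vae_dlm_loss(x, x_hat, mu, logvar, z_traj, dt, lam=1.0):
    # Hypothetical total loss: reconstruction + ELBO KL term
    # + weighted steady-state geometric-flow regularizer.
    recon = np.mean((x - x_hat)**2)
    kl = kl_std_normal(mu, logvar)
    flow = steady_state_penalty(z_traj, dt)
    return recon + kl + lam * flow
```

With a perfect reconstruction, a posterior matching the prior, and a constant latent trajectory, every term vanishes; any latent motion or posterior mismatch raises the loss, which is the qualitative behavior the regularizer is meant to induce.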
Related papers
- Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z) - AROMA: Preserving Spatial Structure for Latent PDE Modeling with Local Neural Fields [14.219495227765671]
We present AROMA, a framework designed to enhance the modeling of partial differential equations (PDEs) using local neural fields.
Our flexible encoder-decoder architecture can obtain smooth latent representations of spatial physical fields from a variety of data types.
By employing a diffusion-based formulation, we achieve greater stability and enable longer rollouts compared to conventional MSE training.
arXiv Detail & Related papers (2024-06-04T10:12:09Z) - Ricci flow-guided autoencoders in learning time-dependent dynamics [0.0]
We present a manifold-based autoencoder method for learning dynamics in time, notably partial differential equations (PDEs).
This can be accomplished by simulating Ricci flow in a physics-informed setting, and manifold quantities can be matched so that Ricci flow is empirically achieved.
We present our method on a range of experiments consisting of PDE data that encompasses desirable characteristics such as periodicity and randomness.
arXiv Detail & Related papers (2024-01-26T01:36:48Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z) - Data-driven reduced order modeling of environmental hydrodynamics using deep autoencoders and neural ODEs [3.4527210650730393]
We investigate employing deep autoencoders for discovering the reduced basis representation.
Test problems we consider include incompressible flow around a cylinder as well as a real-world application of shallow water hydrodynamics in an estuarine system.
arXiv Detail & Related papers (2021-07-06T17:45:37Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks [86.37110868126548]
In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on an Euler discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
arXiv Detail & Related papers (2021-02-16T04:07:13Z) - Quasi-symplectic Langevin Variational Autoencoder [7.443843354775884]
Variational autoencoder (VAE) is a very popular and well-investigated generative model in neural learning research.
A key difficulty is building low-variance evidence lower bounds (ELBOs).
arXiv Detail & Related papers (2020-09-02T12:13:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.