Neural Network Approximations for Calabi-Yau Metrics
- URL: http://arxiv.org/abs/2012.15821v2
- Date: Wed, 27 Jan 2021 18:57:02 GMT
- Title: Neural Network Approximations for Calabi-Yau Metrics
- Authors: Vishnu Jejjala, Damian Kaloni Mayorga Pena, Challenger Mishra
- Abstract summary: We employ techniques from machine learning to deduce numerical flat metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau manifold.
We show that measures that assess the Ricci flatness of the geometry decrease after training by three orders of magnitude.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Ricci flat metrics for Calabi-Yau threefolds are not known analytically. In
this work, we employ techniques from machine learning to deduce numerical flat
metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau
manifold. This investigation employs a single neural network architecture that
is capable of approximating Ricci flat Kähler metrics for several Calabi-Yau
manifolds of dimensions two and three. We show that measures that assess the
Ricci flatness of the geometry decrease after training by three orders of
magnitude. This is corroborated on the validation set, where the improvement is
more modest. Finally, we demonstrate that discrete symmetries of manifolds can
be learned in the process of learning the metric.
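The optimisation loop behind this kind of approach can be sketched in a few lines. The following is a minimal illustration, not the paper's code: it assumes a toy one-complex-dimensional analogue in which an MLP correction phi to a flat reference Kähler potential is trained so that the volume density 1 + (1/4) Laplacian(phi) matches a prescribed target density, a stand-in for the Monge-Ampère condition on the threefolds; the network sizes, target density, and learning rate are all illustrative.
```python
import torch

torch.manual_seed(0)

# Illustrative MLP for the correction phi(x, y) to a reference potential.
mlp = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def target_density(p):
    # Prescribed volume density (hypothetical stand-in for the density
    # built from the holomorphic form in the Calabi-Yau setting).
    x, y = p[:, 0], p[:, 1]
    return 1.0 + 0.1 * torch.sin(2 * torch.pi * x) * torch.sin(2 * torch.pi * y)

def volume_density(p):
    # In one complex dimension, det g = 1 + dd-bar phi = 1 + Laplacian(phi) / 4.
    p = p.requires_grad_(True)
    phi = mlp(p).sum()
    (grad,) = torch.autograd.grad(phi, p, create_graph=True)
    lap = 0.0
    for i in range(2):  # trace of the Hessian, one coordinate at a time
        (g2,) = torch.autograd.grad(grad[:, i].sum(), p, create_graph=True)
        lap = lap + g2[:, i]
    return 1.0 + 0.25 * lap

opt = torch.optim.Adam(mlp.parameters(), lr=1e-3)
for step in range(2000):
    pts = torch.rand(256, 2)  # Monte Carlo sample of the coordinate patch
    loss = ((volume_density(pts) - target_density(pts)) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final Monge-Ampere residual: {loss.item():.3e}")
```
On the actual quintic and Tian-Yau geometries, the points are sampled from the hypersurface itself and the target density involves the holomorphic 3-form; that sampling and patching machinery, which the paper handles, is omitted here.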
Related papers
- Symbolic Approximations to Ricci-flat Metrics Via Extrinsic Symmetries of Calabi-Yau Hypersurfaces
We analyse machine learning approximations to flat metrics of Fermat Calabi-Yau n-folds.
We show that extrinsic symmetries of these hypersurfaces uniquely determine the flat metric on certain loci.
We conclude by distilling the ML models to obtain, for the first time, closed-form expressions for Kähler metrics with near-zero scalar curvature.
arXiv Detail & Related papers (2024-12-27T18:19:26Z)
- Calabi-Yau metrics through Grassmannian learning and Donaldson's algorithm
We present a novel approach to obtaining Ricci-flat approximations to Kähler metrics.
We use gradient descent on the Grassmannian manifold to identify an efficient subspace of sections.
We implement our methods on the Dwork family of threefolds, commenting on the behaviour at different points in moduli space.
arXiv Detail & Related papers (2024-10-15T05:08:43Z)
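The Grassmannian gradient-descent step of the entry above can be made concrete with a standard Riemannian template: project the Euclidean gradient onto the horizontal space at the current point, then retract back to the manifold via a QR factorisation. A hedged sketch follows; the toy objective (maximising tr(X^T A X), i.e. recovering a top-k eigenspace) is a stand-in for the paper's section-selection objective, which is not reproduced here.
```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3
A = rng.standard_normal((n, n)); A = A + A.T       # symmetric test matrix

X, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal starting frame
for step in range(500):
    euc_grad = 2 * A @ X                            # Euclidean gradient of tr(X^T A X)
    rgrad = euc_grad - X @ (X.T @ euc_grad)         # project onto the horizontal space
    X, _ = np.linalg.qr(X + 0.01 * rgrad)           # ascent step + QR retraction

# compare with the exact top-k eigenspace
w, V = np.linalg.eigh(A)
print("objective:", np.trace(X.T @ A @ X), "optimum:", w[-k:].sum())
```
The same projection-plus-retraction pattern applies to any smooth objective on Gr(k, n); only the gradient line changes.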
- A singular Riemannian Geometry Approach to Deep Neural Networks III. Piecewise Differentiable Layers and Random Walks on $n$-dimensional Classes
Two recent works introduced a geometric framework to study neural networks.
We study the case of non-differentiable activation functions, such as ReLU.
We illustrate our findings with numerical experiments on image classification and on thermodynamic problems.
arXiv Detail & Related papers (2024-04-09T08:11:46Z)
- The Fisher-Rao geometry of CES distributions
The Fisher-Rao information geometry allows tools from differential geometry to be leveraged in the study of statistical models.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Third quantization of open quantum systems: new dissipative symmetries and connections to phase-space and Keldysh field theory formulations
We reformulate the technique of third quantization in a way that explicitly connects it to the phase-space and Keldysh field-theory formulations.
We first show that our formulation reveals a fundamental dissipative symmetry present in all quadratic bosonic or fermionic Lindbladians.
For bosons, we then show that the Wigner function and the characteristic function can be thought of as "wavefunctions" of the density matrix.
arXiv Detail & Related papers (2023-02-27T18:56:40Z)
- Machine Learned Calabi-Yau Metrics and Curvature
Finding Ricci-flat (Calabi-Yau) metrics is a long-standing problem in geometry with deep implications for string theory and phenomenology.
A new attack on this problem uses neural networks to engineer approximations to the Calabi-Yau metric within a given Kähler class.
arXiv Detail & Related papers (2022-11-17T18:59:03Z)
- Neural Bregman Divergences for Distance Learning
We propose a new approach to learning arbitrary Bregman divergences in a differentiable manner via input-convex neural networks.
We show that our method more faithfully learns divergences over a set of both new and previously studied tasks.
Our tests further extend to known asymmetric but non-Bregman tasks, where our method still performs competitively despite misspecification.
arXiv Detail & Related papers (2022-06-09T20:53:15Z)
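The construction behind the Bregman-divergence entry above is, in its standard form, D_f(x, y) = f(x) - f(y) - <grad f(y), x - y> with f an input-convex neural network (ICNN): hidden-to-hidden and output weights are kept non-negative and the activations are convex and non-decreasing, so f is convex in its input and D_f is a valid (non-negative) divergence. The sketch below uses hypothetical class and layer names and is not the paper's implementation.
```python
import torch
import torch.nn.functional as F

class ICNN(torch.nn.Module):
    # Input-convex network: non-negative hidden-to-hidden/output weights
    # plus convex, non-decreasing activations make f convex in x.
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.Wx0 = torch.nn.Linear(dim, hidden)            # unconstrained first layer
        self.Wz = torch.nn.Linear(hidden, hidden, bias=False)
        self.Wx1 = torch.nn.Linear(dim, hidden)            # skip connection from input
        self.out = torch.nn.Linear(hidden, 1, bias=False)

    def forward(self, x):
        z = torch.relu(self.Wx0(x))
        z = torch.relu(z @ F.softplus(self.Wz.weight).T + self.Wx1(x))
        return z @ F.softplus(self.out.weight).T           # softplus keeps weights >= 0

def bregman(f, x, y):
    # D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>, differentiable in f's parameters.
    y = y.requires_grad_(True)
    fy = f(y)
    (grad_y,) = torch.autograd.grad(fy.sum(), y, create_graph=True)
    return f(x) - fy - ((x - y) * grad_y).sum(-1, keepdim=True)

f = ICNN(dim=2)
x, y = torch.randn(8, 2), torch.randn(8, 2)
print((bregman(f, x, y) >= -1e-5).all().item())  # convexity of f guarantees D_f >= 0
```
Because the divergence is built from f by automatic differentiation, f can be trained end-to-end on pairwise supervision while D_f stays a proper Bregman divergence throughout.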
- Calabi-Yau Metrics, Energy Functionals and Machine-Learning
We show that machine learning is able to predict the Kähler potential of a Calabi-Yau metric having seen only a small sample of training data.
arXiv Detail & Related papers (2021-12-20T21:30:06Z)
- Bayesian Quadrature on Riemannian Data Manifolds
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data; however, the geometric operations this requires are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
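What makes BQ cheap for integrals over normal laws, as in the entry above, is that with a squared-exponential kernel the "kernel mean" z_i = E_p[k(x, x_i)] against a Gaussian is available in closed form, so the GP posterior mean integrates analytically. A one-dimensional sketch under these assumptions follows; the paper's Riemannian machinery is omitted, and the integrand and node choice are illustrative.
```python
import numpy as np

mu, sigma, ell = 0.0, 1.0, 0.5             # integration law N(mu, sigma^2), lengthscale
f = lambda x: np.sin(3 * x) + x**2          # test integrand

rng = np.random.default_rng(0)
X = rng.normal(mu, sigma, size=12)          # evaluation nodes (could be chosen actively)
K = np.exp(-0.5 * (X[:, None] - X[None, :])**2 / ell**2) + 1e-8 * np.eye(len(X))

# closed-form kernel mean of the squared-exponential kernel against N(mu, sigma^2)
s2 = ell**2 + sigma**2
z = np.sqrt(ell**2 / s2) * np.exp(-0.5 * (X - mu)**2 / s2)

bq_estimate = z @ np.linalg.solve(K, f(X))  # integral of the GP posterior mean

mc = f(rng.normal(mu, sigma, size=200_000)).mean()   # Monte Carlo reference
print(f"BQ with 12 evaluations: {bq_estimate:.4f}, MC reference: {mc:.4f}")
```
The 12-point BQ estimate is typically close to the Monte Carlo reference at a tiny fraction of the evaluation budget, which is the efficiency the abstract refers to.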
- Disentangling by Subspace Diffusion
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentanglement is possible to the question of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.