Machine Learned Calabi-Yau Metrics and Curvature
- URL: http://arxiv.org/abs/2211.09801v3
- Date: Tue, 6 Jun 2023 15:06:15 GMT
- Title: Machine Learned Calabi-Yau Metrics and Curvature
- Authors: Per Berglund, Giorgi Butbaia, Tristan H\"ubsch, Vishnu Jejjala,
Dami\'an Mayorga Pe\~na, Challenger Mishra, Justin Tan
- Abstract summary: Finding Ricci-flat (Calabi-Yau) metrics is a long-standing problem in geometry with deep implications for string theory and phenomenology.
A new attack on this problem uses neural networks to engineer approximations to the Calabi-Yau metric within a given K\"ahler class.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finding Ricci-flat (Calabi-Yau) metrics is a long-standing problem in
geometry with deep implications for string theory and phenomenology. A new
attack on this problem uses neural networks to engineer approximations to the
Calabi-Yau metric within a given K\"ahler class. In this paper we investigate
numerical Ricci-flat metrics over smooth and singular K3 surfaces and
Calabi-Yau threefolds. Using these Ricci-flat metric approximations for the
Cefal\'u family of quartic twofolds and the Dwork family of quintic threefolds,
we study characteristic forms on these geometries. We observe that the
numerical stability of the numerically computed topological characteristic is
heavily influenced by the choice of the neural network model; in particular, we
briefly discuss a different neural network model, namely Spectral networks,
which correctly approximate the topological characteristic of a Calabi-Yau.
Using persistent homology, we show that high-curvature regions of the manifolds
form clusters near the singular points. For our neural network approximations,
we observe a Bogomolov--Yau type inequality $3c_2 \geq c_1^2$, and note an
identity when our geometries have isolated $A_1$-type singularities. We sketch
a proof that $\chi(X \smallsetminus \mathrm{Sing}\,X) +
2\,|\mathrm{Sing}\,X| = 24$ also holds for our numerical approximations.
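The clustering claim can be illustrated with a minimal, self-contained sketch (not the paper's pipeline): the 0-dimensional persistent homology of a point cloud records the merge heights of single-linkage clustering, computable with a union-find over edges sorted by length. The sample cloud below is hypothetical, standing in for high-curvature sample points concentrated near two singularities.

```python
# Minimal sketch of 0-dim persistent homology for cluster detection.
# H_0 death times of a point cloud are exactly the edge lengths of its
# minimum spanning tree, found here by Kruskal's algorithm with union-find.
import itertools
import math

def h0_deaths(points):
    """Death times of 0-dimensional persistence classes of a point cloud."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # two components merge: one H_0 class dies
    return deaths

# Hypothetical "high-curvature" samples clumped near two singular points:
cloud = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
deaths = h0_deaths(cloud)
# Four short-lived classes (death 0.1) and one long-lived class (death ~7.0):
# the single large death signals two well-separated clusters.
```

A large gap between short-lived and long-lived $H_0$ classes is the persistence signature of clustering; the paper's analysis applies this idea to regions of high curvature.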
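The Euler-characteristic identity can be checked by elementary bookkeeping. A minimal sketch, assuming the standard facts that a smooth quartic K3 has $\chi = 24$, that each isolated $A_1$ node lowers $\chi$ of the quartic by its Milnor number 1, and that deleting a point subtracts 1 from $\chi$:

```python
# Bookkeeping check of chi(X \ Sing X) + 2 |Sing X| = 24 for a quartic K3
# with k isolated A_1 nodes (assumed facts stated in the lead-in above).

def chi_nodal_quartic(k: int) -> int:
    """Euler characteristic of a quartic K3 with k isolated A_1 nodes:
    each node (Milnor number 1) lowers chi of the smooth quartic (24) by 1."""
    return 24 - k

def chi_smooth_locus(k: int) -> int:
    """chi(X \\ Sing X): removing each of the k singular points subtracts 1."""
    return chi_nodal_quartic(k) - k

# chi(X \ Sing X) + 2 |Sing X| = (24 - 2k) + 2k = 24 for every k:
for k in range(9):
    assert chi_smooth_locus(k) + 2 * k == 24
```

Equivalently, blowing up each node replaces a point ($\chi = 1$) by an exceptional $\mathbb{P}^1$ ($\chi = 2$), recovering $\chi = 24$ of the resolved K3; the paper's contribution is that its numerical approximations reproduce this relation.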
Related papers
- Symbolic Approximations to Ricci-flat Metrics Via Extrinsic Symmetries of Calabi-Yau Hypersurfaces
We analyse machine learning approximations to flat metrics of Fermat Calabi-Yau n-folds.
We show that such symmetries uniquely determine the flat metric on certain loci.
We conclude by distilling the ML models to obtain, for the first time, closed-form expressions for K\"ahler metrics with near-zero scalar curvature.
arXiv Detail & Related papers (2024-12-27T18:19:26Z)
- Information-Theoretic Thresholds for Planted Dense Cycles
We study a random graph model for small-world networks, which are ubiquitous in social and biological sciences.
For both detection and recovery of the planted dense cycle, we characterize the information-theoretic thresholds in terms of $n$, $\tau$, and an edge-wise signal-to-noise ratio $\lambda$.
arXiv Detail & Related papers (2024-02-01T03:39:01Z)
- DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (e.g., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z)
- Machine learning detects terminal singularities
Q-Fano varieties are positively curved shapes that have Q-factorial terminal singularities.
Despite their importance, the classification of Q-Fano varieties remains unknown.
In this paper we demonstrate that machine learning can be used to understand this classification.
arXiv Detail & Related papers (2023-10-31T13:51:24Z)
- Krylov Complexity in Calabi-Yau Quantum Mechanics
We study Krylov complexity in quantum mechanical systems derived from some well-known local toric Calabi-Yau geometries.
We find that for the Calabi-Yau models, the Lanczos coefficients grow more slowly than linearly at small $n$, consistent with the behavior of integrable models.
arXiv Detail & Related papers (2022-12-06T12:32:04Z)
- Differential Geometry in Neural Implicits
We introduce a neural implicit framework that bridges discrete differential geometry of triangle meshes and continuous differential geometry of neural implicit surfaces.
It exploits the differentiable properties of neural networks and the discrete geometry of triangle meshes to approximate the meshes as the zero-level sets of neural implicit functions.
arXiv Detail & Related papers (2022-01-23T13:40:45Z)
- A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes
We construct, in the input space, the preimage of a point in the output manifold.
For simplicity, we focus on the case of neural network maps from $n$-dimensional real spaces to $(n-1)$-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- The Separation Capacity of Random Neural Networks
We show that a sufficiently large two-layer ReLU-network with standard Gaussian weights and uniformly distributed biases can solve this problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
- Length Learning for Planar Euclidean Curves
This work focuses on learning the length of planar sampled curves generated from a dataset of sine waves.
Robustness to additive noise and discretization errors was tested.
arXiv Detail & Related papers (2021-02-03T06:30:03Z)
- Neural Network Approximations for Calabi-Yau Metrics
We employ techniques from machine learning to deduce numerical flat metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau manifold.
We show that measures of the Ricci flatness of the geometry decrease by three orders of magnitude after training.
arXiv Detail & Related papers (2020-12-31T18:47:51Z)
- Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees
We consider the bipartite graph and formalize its representation learning problem as a statistical estimation problem of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient-descent-based method achieves a linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.