Neural and numerical methods for $\mathrm{G}_2$-structures on contact Calabi-Yau 7-manifolds
- URL: http://arxiv.org/abs/2602.12438v1
- Date: Thu, 12 Feb 2026 21:52:06 GMT
- Title: Neural and numerical methods for $\mathrm{G}_2$-structures on contact Calabi-Yau 7-manifolds
- Authors: Elli Heyes, Edward Hirst, Henrique N. Sá Earp, Tomás S. R. Silva
- Abstract summary: A framework for approximating $\mathrm{G}_2$-structure 3-forms on contact Calabi-Yau 7-manifolds is presented. First, existing neural network models are employed to compute an approximate Ricci-flat metric on a Calabi-Yau threefold. Second, using this metric and the explicit construction of a $\mathrm{G}_2$-structure on the associated Calabi-Yau link in the 9-sphere, numerical approximations of the 3-form are generated on a large set of sampled points.
- Score: 0.5444242834245083
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A numerical framework for approximating $\mathrm{G}_2$-structure 3-forms on contact Calabi-Yau manifolds is presented. The approach proceeds in three stages: first, existing neural network models are employed to compute an approximate Ricci-flat metric on a Calabi-Yau threefold. Second, using this metric and the explicit construction of a $\mathrm{G}_2$-structure on the associated 7-dimensional Calabi-Yau link in the 9-sphere, numerical approximations of the 3-form are generated on a large set of sampled points. Finally, a dedicated neural architecture is trained to learn the 3-form and its induced Riemannian metric directly from data, validating the learned structure and its torsion via a numerical implementation of the exterior derivative, which may be of independent interest.
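The validation stage hinges on a numerical exterior derivative. As a hedged illustration (not the authors' implementation, which works on the 7-manifold), a finite-difference exterior derivative on a periodic 2D grid can be checked against the identity $d(df) = 0$:

```python
import numpy as np

n = 64
h = 1.0 / n
x = np.arange(n) * h                                 # periodic grid on [0, 1)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y)    # a smooth periodic 0-form

def d0(f):
    """Exterior derivative of a 0-form: df = f_x dx + f_y dy (central differences)."""
    fx = (np.roll(f, -1, axis=0) - np.roll(f, 1, axis=0)) / (2 * h)
    fy = (np.roll(f, -1, axis=1) - np.roll(f, 1, axis=1)) / (2 * h)
    return fx, fy

def d1(ax, ay):
    """Exterior derivative of a 1-form a = ax dx + ay dy: the dx∧dy coefficient."""
    day_dx = (np.roll(ay, -1, axis=0) - np.roll(ay, 1, axis=0)) / (2 * h)
    dax_dy = (np.roll(ax, -1, axis=1) - np.roll(ax, 1, axis=1)) / (2 * h)
    return day_dx - dax_dy

fx, fy = d0(f)
ddf = d1(fx, fy)
print(np.max(np.abs(ddf)))  # effectively zero: the discrete operators commute
```

Because the roll-based difference operators in the two grid directions commute exactly, $d(df)$ vanishes up to floating-point roundoff, which makes this identity a useful sanity check for any numerical exterior derivative.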
Related papers
- Symbolic Approximations to Ricci-flat Metrics Via Extrinsic Symmetries of Calabi-Yau Hypersurfaces [0.0]
We analyse machine learning approximations to flat metrics of Fermat Calabi-Yau n-folds. We show that the flat metric admits a surprisingly compact representation for certain choices of complex structure moduli. We conclude by distilling the ML models to obtain, for the first time, closed-form expressions for Kähler metrics with near-zero scalar curvature.
arXiv Detail & Related papers (2024-12-27T18:19:26Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
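The indicator-approximation step in this construction can be made concrete. The sketch below is a hypothetical one-dimensional, one-hidden-layer simplification (the paper's construction is deeper and handles simplicial complexes in higher dimension): four ReLU units build an approximate indicator of an interval $[a, b]$ with transition width about $1/k$.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def bump(x, a, b, k=100.0):
    """ReLU approximation of the indicator of [a, b].

    Two sharp ramps (up at a, down at b) are each built from a pair of
    ReLU units; their difference is ~1 inside the interval and ~0 outside.
    """
    return (relu(k * (x - a)) - relu(k * (x - a) - 1.0)
            - relu(k * (x - b)) + relu(k * (x - b) - 1.0))

x = np.array([-1.0, 0.5, 2.0])
print(bump(x, 0.0, 1.0))  # → [0. 1. 0.]
```

Away from the two transition zones the output is exactly 0 or 1, so larger $k$ gives a sharper indicator.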
arXiv Detail & Related papers (2023-05-25T14:17:15Z) - Rethinking SO(3)-equivariance with Bilinear Tensor Networks [0.0]
We show that by judicious symmetry breaking, we can efficiently increase the expressiveness of a network operating only on vector and order-2 tensor representations of SO(2).
We demonstrate the method on an important problem from High Energy Physics known as $b$-tagging, where particle jets originating from $b$-meson decays must be discriminated from an overwhelming QCD background.
arXiv Detail & Related papers (2023-03-20T17:23:15Z) - Machine Learned Calabi-Yau Metrics and Curvature [0.0]
Finding Ricci-flat (Calabi-Yau) metrics is a long standing problem in geometry with deep implications for string theory and phenomenology.
A new attack on this problem uses neural networks to engineer approximations to the Calabi-Yau metric within a given Kähler class.
arXiv Detail & Related papers (2022-11-17T18:59:03Z) - Learning Smooth Neural Functions via Lipschitz Regularization [92.42667575719048]
We introduce a novel regularization designed to encourage smooth latent spaces in neural fields.
Compared with prior Lipschitz regularized networks, ours is computationally fast and can be implemented in four lines of code.
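One standard way to obtain such a regularizer is to penalize an upper bound on the network's Lipschitz constant given by the product of per-layer matrix norms. The snippet below is a hedged sketch of that idea only (the paper's actual scheme also reparametrizes the weights; the layer shapes here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 3-layer MLP weight matrices (output_dim x input_dim).
weights = [rng.normal(size=(64, 3)),
           rng.normal(size=(64, 64)),
           rng.normal(size=(1, 64))]

def lipschitz_penalty(weights):
    """Upper bound on the Lipschitz constant in the infinity norm:
    the product of per-layer maximum absolute row sums ||W||_inf."""
    bound = 1.0
    for W in weights:
        bound *= np.abs(W).sum(axis=1).max()
    return bound

penalty = lipschitz_penalty(weights)
# total_loss = task_loss + alpha * penalty   (alpha a small regularization weight)
```

Minimizing this product encourages every layer to contract, which in turn smooths the learned function.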
arXiv Detail & Related papers (2022-02-16T21:24:54Z) - Hyperbolic Lattice for Scalar Field Theory in AdS$_3$ [0.0]
We construct a tessellation of AdS$_3$, by extending the equilateral computation of the AdS$_2$ triangulation on the Poincaré disk based on the $(2,3,7)$ triangle group.
A Hamiltonian form conducive to the study of dynamics and quantum computing is presented.
arXiv Detail & Related papers (2022-02-07T19:08:02Z) - Neural Network Approximations for Calabi-Yau Metrics [0.0]
We employ techniques from machine learning to deduce numerical flat metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau manifold.
We show that measures that assess the Ricci flatness of the geometry decrease after training by three orders of magnitude.
arXiv Detail & Related papers (2020-12-31T18:47:51Z) - Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We propose a primal-dual framework for triangle meshes, drawn from the graph-neural-network literature.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights of our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z) - Measuring Model Complexity of Neural Networks with Curve Activation Functions [100.98319505253797]
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation function.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L_1$ and $L_2$ regularizations suppress the increase of model complexity.
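For reference, the two regularizers in question are simple penalties on the parameters, added to the task loss. A minimal sketch (hypothetical helper names, illustrating only the penalty terms the paper studies):

```python
import numpy as np

def l1_penalty(params):
    """Sum of absolute values over all parameter arrays (encourages sparsity)."""
    return sum(np.abs(p).sum() for p in params)

def l2_penalty(params):
    """Sum of squares over all parameter arrays (encourages small weights)."""
    return sum((p ** 2).sum() for p in params)

W = [np.array([[1.0, -2.0], [0.0, 3.0]])]
print(l1_penalty(W), l2_penalty(W))  # → 6.0 14.0
# total_loss = task_loss + lam1 * l1_penalty(W) + lam2 * l2_penalty(W)
```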
arXiv Detail & Related papers (2020-06-16T07:38:06Z) - PUGeo-Net: A Geometry-centric Network for 3D Point Cloud Upsampling [103.09504572409449]
We propose a novel deep neural network based method, called PUGeo-Net, to generate uniform dense point clouds.
Thanks to its geometry-centric nature, PUGeo-Net works well for both CAD models with sharp features and scanned models with rich geometric details.
arXiv Detail & Related papers (2020-02-24T14:13:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.