Calabi-Yau Metrics, Energy Functionals and Machine-Learning
- URL: http://arxiv.org/abs/2112.10872v1
- Date: Mon, 20 Dec 2021 21:30:06 GMT
- Title: Calabi-Yau Metrics, Energy Functionals and Machine-Learning
- Authors: Anthony Ashmore, Lucille Calmon, Yang-Hui He, Burt A. Ovrut
- Abstract summary: We show that machine learning is able to predict the Kähler potential of a Calabi-Yau metric having seen only a small sample of training data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We apply machine learning to the problem of finding numerical Calabi-Yau
metrics. We extend previous work on learning approximate Ricci-flat metrics
calculated using Donaldson's algorithm to the much more accurate "optimal"
metrics of Headrick and Nassar. We show that machine learning is able to
predict the Kähler potential of a Calabi-Yau metric having seen only a small
sample of training data.
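As a rough illustration of this supervised setup (not the authors' code), the sketch below regresses a scalar potential value at sampled points with a small network and checks generalization from a small training sample. All data, the point encoding, and the network shape are invented stand-ins; in the paper the targets come from Headrick-Nassar "optimal" metrics.

```python
# Hedged sketch (not the paper's implementation): supervised regression of
# a scalar "Kahler potential" value phi(p) at sampled points p.
# Everything below is synthetic placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Toy stand-in for points on a hypersurface: unit vectors in R^10 (~ C^5).
X = rng.normal(size=(2000, 10))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Toy stand-in for the potential values the network should learn.
y = np.log1p((X[:, :5] ** 2).sum(axis=1))

# Small training sample, echoing the paper's "small sample" experiments.
X_train, y_train = X[:200], y[:200]
X_test, y_test = X[200:], y[200:]

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out points:", r2_score(y_test, model.predict(X_test)))
```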
Related papers
- Calabi-Yau metrics through Grassmannian learning and Donaldson's algorithm [5.158605878911773]
We present a novel approach to obtaining Ricci-flat approximations to Kähler metrics.
We use gradient descent on the Grassmannian manifold to identify an efficient subspace of sections.
We implement our methods on the Dwork family of threefolds, commenting on the behaviour at different points in moduli space.
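For intuition on the optimization step named above, here is a minimal sketch of Riemannian gradient descent on a Grassmannian, applied to a toy quadratic cost rather than the paper's space of sections; the cost matrix, step size, and dimensions are all illustrative choices.

```python
# Hedged sketch: gradient descent on the Grassmannian Gr(k, n), here
# minimising the toy cost tr(Y^T A Y). A point on Gr(k, n) is represented
# by an orthonormal n x k matrix Y (a basis of the subspace).
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3
A = rng.normal(size=(n, n))
A = A + A.T  # symmetric toy cost matrix

Y, _ = np.linalg.qr(rng.normal(size=(n, k)))  # random starting subspace

for step in range(200):
    G = 2 * A @ Y                       # Euclidean gradient of tr(Y^T A Y)
    G_riem = G - Y @ (Y.T @ G)          # project onto the tangent space
    Y, _ = np.linalg.qr(Y - 0.05 * G_riem)  # retract back to the manifold

# Approaches the sum of the k smallest eigenvalues of A.
print("cost:", np.trace(Y.T @ A @ Y))
```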
arXiv Detail & Related papers (2024-10-15T05:08:43Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
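Since the entry names data re-uploading and Qiskit explicitly, a minimal single-qubit re-uploading circuit might look like the sketch below; the layer structure and parameter placement are illustrative, not the authors' exact ansatz.

```python
# Hedged sketch: a single-qubit data re-uploading circuit in Qiskit.
# The data feature x is encoded repeatedly, interleaved with trainable
# rotations; the specific gate pattern here is an assumption.
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter

n_layers = 3
x = Parameter("x")  # the (re-uploaded) data feature
thetas = [Parameter(f"theta_{i}") for i in range(2 * n_layers)]

qc = QuantumCircuit(1)
for layer in range(n_layers):
    qc.ry(x, 0)                      # re-upload the data in every layer
    qc.ry(thetas[2 * layer], 0)      # trainable rotation
    qc.rz(thetas[2 * layer + 1], 0)  # trainable rotation
qc.measure_all()
print(qc.draw())
```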
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Evaluating natural language processing models with generalization metrics that do not need access to any training or testing data [66.11139091362078]
We provide the first model selection results on large pretrained Transformers from Huggingface using generalization metrics.
Despite their niche status, we find that metrics derived from the heavy-tail (HT) perspective are particularly useful in NLP tasks.
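One common heavy-tail metric of the kind this line of work studies is a power-law exponent fitted to a weight matrix's eigenvalue spectrum; the sketch below uses a crude Hill-type fit on a synthetic matrix, whereas dedicated tools (e.g. the weightwatcher package) are far more careful.

```python
# Hedged sketch: a rough heavy-tail (HT) generalization metric -- the
# power-law exponent of the tail of a weight matrix's spectrum.
import numpy as np

rng = np.random.default_rng(2)
W = rng.standard_t(df=3, size=(512, 256))  # synthetic heavy-tailed "weights"

# Empirical spectral density of the correlation matrix W^T W / n.
eigs = np.linalg.eigvalsh(W.T @ W / W.shape[0])
tail = np.sort(eigs)[-50:]  # largest eigenvalues form the tail

# Hill estimator of the power-law exponent alpha over the tail.
alpha = 1.0 + len(tail) / np.sum(np.log(tail / tail.min()))
print("fitted tail exponent alpha:", alpha)
```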
arXiv Detail & Related papers (2022-02-06T20:07:35Z)
- Machine Learning Calabi-Yau Hypersurfaces [0.0]
We revisit the classic database of weighted-P4s which admit Calabi-Yau 3-fold hypersurfaces.
Unsupervised techniques identify an unanticipated almost linear dependence of the topological data on the weights.
Supervised techniques successfully predict the topological parameters of the hypersurface from its weights, achieving R^2 > 95%.
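A minimal sketch of such a weights-to-topology regression appears below; the weight tuples and target are synthetic stand-ins (the real inputs would be the weighted-P4 database entries), and the choice of a random forest is an assumption.

```python
# Hedged sketch: regress a topological quantity from the five weights of
# a weighted projective space. Data here is synthetic; the target mimics
# the almost-linear dependence on the weights the paper reports.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
weights = rng.integers(1, 50, size=(1000, 5)).astype(float)
target = weights @ np.array([1.0, 2.0, 3.0, 4.0, 5.0]) \
    + rng.normal(scale=2.0, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(weights, target, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
print("R^2:", r2_score(y_te, model.predict(X_te)))
```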
arXiv Detail & Related papers (2021-12-12T23:17:31Z)
- Learning Size and Shape of Calabi-Yau Spaces [0.0]
We present a new machine learning library for computing metrics of string compactification spaces.
We benchmark the performance on Monte-Carlo sampled integrals against previous numerical approximations.
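The kind of Monte-Carlo integral such benchmarks rest on can be illustrated in a few lines; the sketch samples uniformly on a cube with a made-up integrand, whereas a CY metrics library would sample the manifold with appropriate weights.

```python
# Hedged sketch: Monte-Carlo estimate of I = integral of f, with a
# standard-error estimate from the sample variance.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
pts = rng.uniform(size=(N, 3))  # uniform samples on [0, 1]^3

f = np.exp(-(pts ** 2).sum(axis=1))  # integrand at the sample points
estimate = f.mean()
std_err = f.std(ddof=1) / np.sqrt(N)
print(f"integral ~ {estimate:.5f} +/- {std_err:.5f}")
```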
arXiv Detail & Related papers (2021-11-02T08:48:53Z)
- ALT-MAS: A Data-Efficient Framework for Active Testing of Machine Learning Algorithms [58.684954492439424]
We propose a novel framework to efficiently test a machine learning model using only a small amount of labeled test data.
The idea is to estimate the metrics of interest for a model-under-test using a Bayesian neural network (BNN).
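The core idea can be sketched with a cheap stand-in: use a probabilistic surrogate's class probabilities in place of expensive true labels to estimate the model-under-test's accuracy. The paper uses a BNN; the calibrated sklearn classifier below is only an illustrative substitute.

```python
# Hedged sketch of the idea: estimate accuracy on unlabelled data from a
# surrogate's predictive distribution, rather than from true labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, random_state=0)
X_lab, y_lab = X[:100], y[:100]          # small labelled pool
X_unlab, y_unlab = X[100:], y[100:]      # labels held out for checking

model_under_test = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_lab, y_lab)
surrogate = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)  # BNN stand-in

preds = model_under_test.predict(X_unlab)
proba = surrogate.predict_proba(X_unlab)  # surrogate belief over labels
est_accuracy = proba[np.arange(len(preds)), preds].mean()

print("estimated accuracy:", est_accuracy)
print("true accuracy:     ", (preds == y_unlab).mean())
```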
arXiv Detail & Related papers (2021-04-11T12:14:04Z)
- Estimating informativeness of samples with Smooth Unique Information [108.25192785062367]
We measure how much a sample informs the final weights and how much it informs the function computed by the weights.
We give efficient approximations of these quantities using a linearized network.
We apply these measures to several problems, such as dataset summarization.
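The "linearized network" mentioned above is a first-order Taylor expansion of the output in the weights, f_lin(x; w) = f(x; w0) + grad_w f(x; w0) . (w - w0). The sketch below evaluates that expansion for a tiny invented network, taking the directional derivative by finite differences; it illustrates the approximation, not the paper's information measures.

```python
# Hedged sketch: linearize a toy network around weights w0 and compare
# f_lin(x; w) with the true f(x; w) at nearby weights w.
import numpy as np

def net(x, w):
    """Toy one-hidden-layer scalar network; w packs both layers (12 params)."""
    W1, w2 = w[:8].reshape(4, 2), w[8:]
    return np.tanh(W1 @ x) @ w2

rng = np.random.default_rng(5)
w0 = rng.normal(size=12)
w = w0 + 0.1 * rng.normal(size=12)  # nearby weights, e.g. after a training step
x = rng.normal(size=2)

# Directional derivative grad_w f . (w - w0) via central finite differences.
eps = 1e-6
direction = w - w0
deriv = (net(x, w0 + eps * direction) - net(x, w0 - eps * direction)) / (2 * eps)
f_lin = net(x, w0) + deriv

print("true f(x; w):     ", net(x, w))
print("linearized f_lin: ", f_lin)
```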
arXiv Detail & Related papers (2021-01-17T10:29:29Z)
- Neural Network Approximations for Calabi-Yau Metrics [0.0]
We employ techniques from machine learning to deduce numerical flat metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau manifold.
We show that measures assessing the Ricci flatness of the geometry decrease by three orders of magnitude after training.
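One standard flatness measure of this kind, often called sigma, compares the approximate metric's volume form with the exact holomorphic one, sigma = (1/Vol) * integral of |1 - det(g)/|Omega|^2|, estimated by Monte Carlo. The sketch below computes it on synthetic placeholder arrays; the specific measures used in the paper may differ in normalization.

```python
# Hedged sketch of a "sigma"-type Ricci-flatness measure, estimated by
# weighted Monte Carlo. All arrays are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(6)
n_pts = 10_000

det_g = 1.0 + 0.01 * rng.normal(size=n_pts)    # det of approx. metric at samples
omega_sq = np.ones(n_pts)                      # |Omega|^2 at the same samples
mc_weights = rng.uniform(0.5, 1.5, size=n_pts) # Monte-Carlo sampling weights

sigma = np.average(np.abs(1.0 - det_g / omega_sq), weights=mc_weights)
print("sigma measure:", sigma)  # smaller = closer to Ricci-flat
```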
arXiv Detail & Related papers (2020-12-31T18:47:51Z)
- Learning outside the Black-Box: The pursuit of interpretable models [78.32475359554395]
This paper proposes an algorithm that produces a continuous global interpretation of any given continuous black-box function.
Our interpretation represents a leap forward from the previous state of the art.
arXiv Detail & Related papers (2020-11-17T12:39:44Z)
- Provably Robust Metric Learning [98.50580215125142]
We show that existing metric learning algorithms can result in metrics that are less robust than the Euclidean distance.
We propose a novel metric learning algorithm to find a Mahalanobis distance that is robust against adversarial perturbations.
Experimental results show that the proposed metric learning algorithm improves both certified robust errors and empirical robust errors.
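The learned object here is a Mahalanobis distance, d_M(x, y)^2 = (x - y)^T M (x - y) with M = L^T L positive semi-definite. The sketch below only illustrates that distance and how a perturbation shifts it; the perturbation is random rather than adversarial, and learning a robust L is the paper's actual contribution.

```python
# Hedged sketch: a Mahalanobis distance parametrized by M = L^T L,
# evaluated before and after a small (random, not adversarial) perturbation.
import numpy as np

rng = np.random.default_rng(7)
L = rng.normal(size=(5, 5))
M = L.T @ L  # positive semi-definite by construction

def mahalanobis_sq(x, y, M):
    d = x - y
    return d @ M @ d

x, y = rng.normal(size=5), rng.normal(size=5)
delta = 0.1 * rng.normal(size=5)
print("d_M(x, y)^2:        ", mahalanobis_sq(x, y, M))
print("d_M(x + delta, y)^2:", mahalanobis_sq(x + delta, y, M))
```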
arXiv Detail & Related papers (2020-06-12T09:17:08Z)
- Supervised Categorical Metric Learning with Schatten p-Norms [10.995886294197412]
We propose a method, CPML (categorical projected metric learning), to address the problem of metric learning on categorical data.
We make use of the Value Distance Metric to represent our data and propose new distances based on this representation.
We then show how to efficiently learn new metrics.
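The Value Distance Metric representation mentioned above treats two values of a categorical feature as close when they induce similar class-conditional distributions; the toy sketch below computes it from counts, with invented data and an assumed exponent q.

```python
# Hedged sketch of a Value Distance Metric (VDM) between two values of a
# categorical feature: compare their class-conditional distributions.
import numpy as np

# Toy data: one categorical feature (values 0..2) and binary class labels.
feature = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 0])
labels = np.array([0, 1, 1, 1, 0, 0, 0, 1, 0, 0])

def class_probs(value):
    """Estimate P(class | feature == value) from counts."""
    mask = feature == value
    return np.bincount(labels[mask], minlength=2) / mask.sum()

def vdm(a, b, q=2):
    return np.sum(np.abs(class_probs(a) - class_probs(b)) ** q)

print("VDM(0, 1):", vdm(0, 1))
print("VDM(0, 2):", vdm(0, 2))
```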
arXiv Detail & Related papers (2020-02-26T01:17:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.