Neural Latent Geometry Search: Product Manifold Inference via
Gromov-Hausdorff-Informed Bayesian Optimization
- URL: http://arxiv.org/abs/2309.04810v3
- Date: Fri, 27 Oct 2023 14:05:02 GMT
- Title: Neural Latent Geometry Search: Product Manifold Inference via
Gromov-Hausdorff-Informed Bayesian Optimization
- Authors: Haitz Saez de Ocariz Borde, Alvaro Arroyo, Ismael Morales, Ingmar
Posner, Xiaowen Dong
- Abstract summary: We mathematically define this novel formulation and coin it as neural latent geometry search (NLGS).
We propose a novel notion of distance between candidate latent geometries based on the Gromov-Hausdorff distance from metric geometry.
We then design a graph search space based on the notion of smoothness between latent geometries and employ the calculated distances as an additional inductive bias.
- Score: 21.97865037637575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent research indicates that the performance of machine learning models can
be improved by aligning the geometry of the latent space with the underlying
data structure. Rather than relying solely on Euclidean space, researchers have
proposed using hyperbolic and spherical spaces with constant curvature, or
combinations thereof, to better model the latent space and enhance model
performance. However, little attention has been given to the problem of
automatically identifying the optimal latent geometry for the downstream task.
We mathematically define this novel formulation and coin it as neural latent
geometry search (NLGS). More specifically, we introduce an initial attempt to
search for a latent geometry composed of a product of constant curvature model
spaces with a small number of query evaluations, under some simplifying
assumptions. To accomplish this, we propose a novel notion of distance between
candidate latent geometries based on the Gromov-Hausdorff distance from metric
geometry. In order to compute the Gromov-Hausdorff distance, we introduce a
mapping function that enables the comparison of different manifolds by
embedding them in a common high-dimensional ambient space. We then design a
graph search space based on the notion of smoothness between latent geometries
and employ the calculated distances as an additional inductive bias. Finally,
we use Bayesian optimization to search for the optimal latent geometry in a
query-efficient manner. This is a general method which can be applied to search
for the optimal latent geometry for a variety of models and downstream tasks.
We perform experiments on synthetic and real-world datasets to identify the
optimal latent geometry for multiple machine learning problems.
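As a rough illustration of the core idea, comparing candidate geometries metrically after mapping them into a common ambient space, the sketch below computes the ordinary Hausdorff distance between finite samples of two constant-curvature model spaces zero-padded into a shared R^4. The sampling scheme, padding map, and sample sizes are illustrative assumptions, not the paper's actual mapping function or Gromov-Hausdorff estimator.

```python
import numpy as np

def sample_sphere(n, dim, seed):
    """Sample n points uniformly on the unit sphere S^dim in R^(dim+1)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n, dim + 1))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def embed(points, ambient_dim):
    """Zero-pad samples into a common high-dimensional ambient space."""
    out = np.zeros((points.shape[0], ambient_dim))
    out[:, :points.shape[1]] = points
    return out

def hausdorff(X, Y):
    """Hausdorff distance between two finite point clouds in the same
    ambient space -- a computable stand-in for the Gromov-Hausdorff
    distance once both manifolds live in a shared space."""
    D = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return max(D.min(axis=1).max(), D.min(axis=0).max())

# Compare samples of a circle (S^1) and a sphere (S^2), both padded into R^4.
A = embed(sample_sphere(300, dim=1, seed=0), ambient_dim=4)
B = embed(sample_sphere(300, dim=2, seed=1), ambient_dim=4)
print(hausdorff(A, B))  # positive: the two model spaces differ
```

In a Bayesian-optimization loop, such pairwise distances between candidate geometries could serve as the inductive bias the abstract describes, e.g. by entering a kernel over the search space.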
Related papers
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z)
- (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z)
- Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
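A minimal sketch of the random geometric graph model mentioned above, assuming Euclidean latent positions and a fixed connection radius: shortest-path hop counts scaled by the radius give a crude estimate of the underlying metric, which illustrates why the geometry is recoverable in principle (the paper's actual reconstruction algorithm is more involved).

```python
import numpy as np
from collections import deque

def random_geometric_graph(points, radius):
    """Adjacency matrix connecting every pair of points closer than radius."""
    D = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return (D < radius) & ~np.eye(len(points), dtype=bool)

def hop_distance(adj, src):
    """BFS hop counts from src; -1 marks unreachable vertices."""
    dist = np.full(adj.shape[0], -1)
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in np.flatnonzero(adj[u]):
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

rng = np.random.default_rng(0)
pts = rng.uniform(size=(400, 2))          # latent positions in the unit square
adj = random_geometric_graph(pts, 0.1)
hops = hop_distance(adj, 0)
reached = hops > 0
est = 0.1 * hops[reached]                 # radius * hops ~ metric distance
true = np.linalg.norm(pts[reached] - pts[0], axis=1)
print(np.corrcoef(est, true)[0, 1])       # high if the graph is well connected
```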
arXiv Detail & Related papers (2024-02-14T21:34:44Z)
- Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z)
- Gromov-Hausdorff Distances for Comparing Product Manifolds of Model Spaces [21.97865037637575]
We introduce a novel notion of distance between candidate latent geometries using the Gromov-Hausdorff distance from metric geometry.
We propose using a graph search space that uses the estimated Gromov-Hausdorff distances to search for the optimal latent geometry.
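One way to picture such a graph search space over product geometries, under the hypothetical encoding below (tuples of Euclidean, hyperbolic, and spherical factors) and the assumption that "smooth" neighbours differ in exactly one factor; the paper's actual construction may differ.

```python
from itertools import product

# Hypothetical encoding: a product geometry is a tuple of constant-curvature
# factors, each Euclidean (E), hyperbolic (H) or spherical (S).
FACTORS = ("E", "H", "S")

def neighbors(geometry):
    """Geometries adjacent in the search graph: those differing from
    `geometry` in exactly one factor (an illustrative smoothness rule)."""
    out = []
    for i, f in enumerate(geometry):
        for g in FACTORS:
            if g != f:
                out.append(geometry[:i] + (g,) + geometry[i + 1:])
    return out

def search_space(num_factors):
    """All candidate product geometries with a fixed number of factors."""
    return list(product(FACTORS, repeat=num_factors))

print(len(search_space(3)))   # 27 candidate three-factor product geometries
print(neighbors(("E", "H")))  # the four one-factor edits of E x H
```

Estimated Gromov-Hausdorff distances between adjacent geometries could then weight the edges of this graph to guide the search.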
arXiv Detail & Related papers (2023-09-09T11:17:06Z)
- Warped geometric information on the optimisation of Euclidean functions [43.43598316339732]
We consider optimisation of a real-valued function defined in a potentially high-dimensional Euclidean space.
We find the function's optimum along a manifold with a warped metric.
Our proposed algorithm, using 3rd-order approximation of geodesics, tends to outperform standard Euclidean gradient-based counterparts.
arXiv Detail & Related papers (2023-08-16T12:08:50Z)
- Exploring Data Geometry for Continual Learning [64.4358878435983]
We study continual learning from a novel perspective by exploring data geometry for the non-stationary stream of data.
Our method dynamically expands the geometry of the underlying space to match growing geometric structures induced by new data.
Experiments show that our method achieves better performance than baseline methods designed in Euclidean space.
arXiv Detail & Related papers (2023-04-08T06:35:25Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and their accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Finding Geometric Models by Clustering in the Consensus Space [61.65661010039768]
We propose a new algorithm for finding an unknown number of geometric models, e.g., homographies.
We present a number of applications where the use of multiple geometric models improves accuracy.
These include pose estimation from multiple generalized homographies; trajectory estimation of fast-moving objects.
arXiv Detail & Related papers (2021-03-25T14:35:07Z)
- High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds [0.0]
We propose to exploit the geometry of non-Euclidean search spaces, which often arise in a variety of domains, to learn structure-preserving mappings.
Our approach features geometry-aware Gaussian processes that jointly learn a nested-manifold embedding and a representation of the objective function in the latent space.
arXiv Detail & Related papers (2020-10-21T11:24:11Z)
- Mix Dimension in Poincaré Geometry for 3D Skeleton-based Action Recognition [57.98278794950759]
Graph Convolutional Networks (GCNs) have already demonstrated their powerful ability to model the irregular data.
We present a novel spatial-temporal GCN architecture which is defined via the Poincaré geometry.
We evaluate our method on two current largest scale 3D datasets.
arXiv Detail & Related papers (2020-07-30T18:23:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.