Identifying the latent space geometry of network models through analysis
of curvature
- URL: http://arxiv.org/abs/2012.10559v3
- Date: Tue, 6 Apr 2021 21:47:53 GMT
- Title: Identifying the latent space geometry of network models through analysis
of curvature
- Authors: Shane Lubold, Arun G. Chandrasekhar, Tyler H. McCormick
- Abstract summary: We present a method to consistently estimate the manifold type, dimension, and curvature from an empirically relevant class of latent spaces.
Our core insight comes by representing the graph as a noisy distance matrix based on the ties between cliques.
- Score: 7.644165047073435
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Statistically modeling networks, across numerous disciplines and contexts, is
fundamentally challenging because of (often high-order) dependence between
connections. A common approach assigns each person in the graph to a position
on a low-dimensional manifold. Distance between individuals in this (latent)
space is inversely proportional to the likelihood of forming a connection. The
choice of the latent geometry (the manifold class, dimension, and curvature)
has consequential impacts on the substantive conclusions of the model. More
positive curvature in the manifold, for example, encourages more and tighter
communities; negative curvature induces repulsion among nodes. Currently,
however, the choice of the latent geometry is an a priori modeling assumption
and there is limited guidance about how to make these choices in a data-driven
way. In this work, we present a method to consistently estimate the manifold
type, dimension, and curvature from an empirically relevant class of latent
spaces: simply connected, complete Riemannian manifolds of constant curvature.
Our core insight comes by representing the graph as a noisy distance matrix
based on the ties between cliques. Leveraging results from statistical
geometry, we develop hypothesis tests to determine whether the observed
distances could plausibly be embedded isometrically in each of the candidate
geometries. We explore the accuracy of our approach with simulations and then
apply our approach to data-sets from economics and sociology as well as
neuroscience.
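The embeddability tests rest on classical results from distance geometry. As a rough illustration only (not the authors' actual test statistic, which must account for noise in the clique-based distance estimates), a deterministic version of the check for each candidate constant-curvature geometry might look like this; the function name and tolerance are illustrative:

```python
import numpy as np

def embeds_isometrically(D, curvature=0.0, tol=1e-8):
    """Check whether a distance matrix D could embed isometrically in a
    simply connected, complete constant-curvature geometry.

    Classical distance-geometry criteria:
      curvature == 0 (Euclidean):  -1/2 J D^2 J must be PSD (Schoenberg),
      curvature  > 0 (sphere):     cos(sqrt(k) * D) must be PSD,
      curvature  < 0 (hyperbolic): cosh(sqrt(-k) * D) must have exactly
                                   one positive eigenvalue.
    """
    n = D.shape[0]
    if curvature > 0:
        eig = np.linalg.eigvalsh(np.cos(np.sqrt(curvature) * D))
        return bool(eig.min() >= -tol)
    if curvature < 0:
        eig = np.linalg.eigvalsh(np.cosh(np.sqrt(-curvature) * D))
        return int((eig > tol).sum()) == 1
    # Euclidean case: double-center the squared distances (classical MDS)
    J = np.eye(n) - np.ones((n, n)) / n
    eig = np.linalg.eigvalsh(-0.5 * J @ (D ** 2) @ J)
    return bool(eig.min() >= -tol)
```

In the paper's setting the entries of D are estimated from the data, so a negative eigenvalue must be judged against its sampling variability rather than a fixed tolerance; that is what the hypothesis tests formalize.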
Related papers
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z)
- Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
arXiv Detail & Related papers (2024-02-14T21:34:44Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that addresses this issue.
Using soft manifolds for graph embedding, we can provide continuous spaces that support any data-analysis task over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Contrastive Graph Clustering in Curvature Spaces [74.03252813800334]
We present a novel end-to-end contrastive graph clustering model named CONGREGATE.
To support geometric clustering, we construct a theoretically grounded Heterogeneous Curvature Space.
We then train the graph clusters by an augmentation-free reweighted contrastive approach.
arXiv Detail & Related papers (2023-05-05T14:04:52Z)
- Unveiling the Sampling Density in Non-Uniform Geometric Graphs [69.93864101024639]
We consider graphs as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
In a social network, communities can be modeled as densely sampled areas, and hubs as nodes with a larger neighborhood radius.
We develop methods to estimate the unknown sampling density in a self-supervised fashion.
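The geometric graph model described above is easy to simulate. A minimal sketch, assuming uniform sampling on the unit cube; the rule for node-specific radii ("hubs") shown here is one plausible convention, not necessarily the one used in the paper:

```python
import numpy as np

def random_geometric_graph(n, radius, dim=2, rng=None):
    """Sample n nodes uniformly in [0, 1]^dim and connect any pair
    whose Euclidean distance falls below the neighborhood radius.

    `radius` may be a scalar or a length-n array; with per-node radii,
    i and j are connected if their distance is below the larger of the
    two radii (a simple way to model hubs).
    """
    rng = np.random.default_rng(rng)
    pts = rng.random((n, dim))
    r = np.broadcast_to(np.asarray(radius, dtype=float), (n,))
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    adj = (dist < np.maximum(r[:, None], r[None, :])) & ~np.eye(n, dtype=bool)
    return pts, adj
```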
arXiv Detail & Related papers (2022-10-15T08:01:08Z)
- Curved Geometric Networks for Visual Anomaly Recognition [39.91252195360767]
Learning a latent embedding to understand the underlying nature of a data distribution is often formulated in Euclidean spaces with zero curvature.
In this work, we investigate the benefits of curved spaces for analyzing anomalies or out-of-distribution objects in data.
arXiv Detail & Related papers (2022-08-02T01:15:39Z)
- A Graph-based approach to derive the geodesic distance on Statistical manifolds: Application to Multimedia Information Retrieval [5.1388648724853825]
We leverage the properties of non-Euclidean geometry to define the geodesic distance.
We propose an approximation of the geodesic distance through a graph-based method.
Our main aim is to compare the graph-based approximation to state-of-the-art approximations.
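The graph-based idea, in the spirit of Isomap, is to link nearby samples and let shortest paths stand in for geodesics. A generic sketch of the technique, not the paper's specific construction (the function `geodesic_approx` and its parameters are illustrative):

```python
import numpy as np

def geodesic_approx(points, k=5):
    """Approximate geodesic distances on a sampled manifold:
    connect each point to its k nearest neighbors with Euclidean
    edge weights, then run Floyd-Warshall shortest paths.
    """
    n = len(points)
    D = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    W = np.full((n, n), np.inf)
    np.fill_diagonal(W, 0.0)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]   # skip self at index 0
        W[i, nbrs] = D[i, nbrs]
        W[nbrs, i] = D[i, nbrs]            # keep the graph symmetric
    for m in range(n):                     # Floyd-Warshall relaxation
        W = np.minimum(W, W[:, m:m + 1] + W[m:m + 1, :])
    return W
```

On points sampled densely from a circle, for example, the path distance between antipodal points approaches the arc length pi rather than the Euclidean chord length 2.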
arXiv Detail & Related papers (2021-06-26T16:39:54Z)
- GELATO: Geometrically Enriched Latent Model for Offline Reinforcement Learning [54.291331971813364]
Offline reinforcement learning approaches can be divided into proximal and uncertainty-aware methods.
In this work, we demonstrate the benefit of combining the two in a latent variational model.
Our proposed metrics measure both the quality of out-of-distribution samples and the discrepancy of examples in the data.
arXiv Detail & Related papers (2021-02-22T19:42:40Z)
- Geometry of Similarity Comparisons [51.552779977889045]
We show that the ordinal capacity of a space form is related to its dimension and the sign of its curvature.
More importantly, we show that the statistical behavior of the ordinal spread random variables defined on a similarity graph can be used to identify its underlying space form.
arXiv Detail & Related papers (2020-06-17T13:37:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.