Representations, Metrics and Statistics For Shape Analysis of Elastic Graphs
- URL: http://arxiv.org/abs/2003.00287v2
- Date: Fri, 15 May 2020 00:34:06 GMT
- Title: Representations, Metrics and Statistics For Shape Analysis of Elastic Graphs
- Authors: Xiaoyang Guo, Anuj Srivastava
- Abstract summary: This paper introduces a far-reaching geometric approach for analyzing shapes of graphical objects, such as road networks, blood vessels, brain fiber tracts, etc.
It represents such objects, exhibiting differences in both geometries and topologies, as graphs made of curves with arbitrary shapes (edges) connected at arbitrary junctions (nodes).
- Score: 21.597624908203805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Past approaches for statistical shape analysis of objects have focused mainly
on objects within the same topological classes, e.g., scalar functions,
Euclidean curves, or surfaces, etc. For objects that differ in more complex
ways, the current literature offers only topological methods. This paper
introduces a far-reaching geometric approach for analyzing shapes of graphical
objects, such as road networks, blood vessels, brain fiber tracts, etc. It
represents such objects, exhibiting differences in both geometries and
topologies, as graphs made of curves with arbitrary shapes (edges) and
connected at arbitrary junctions (nodes). To perform statistical analyses, one
needs mathematical representations, metrics and other geometrical tools, such
as geodesics, means, and covariances. This paper utilizes a quotient structure
to develop efficient algorithms for computing these quantities, leading to
useful statistical tools, including principal component analysis and analytical
statistical testing and modeling of graphical shapes. The efficacy of this
framework is demonstrated using various simulated data as well as real data from
neurons and brain arterial networks.
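As a rough illustration of the elastic ideas behind this framework, the following Python sketch compares two toy graphs whose edges are discretized 3D curves: each edge is mapped to its square-root velocity function (SRVF) and edge distances are summed under a fixed edge correspondence. The full method additionally optimizes over curve reparameterizations, rotations, and graph registration via the quotient structure, none of which is attempted here; the function names and sampling scheme are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of an elastic, SRVF-based edge comparison for graphs whose edges are
# curves. Assumes uniform sampling on [0, 1] and a fixed edge matching; the paper's
# quotient-space optimization (reparameterization, rotation, registration) is omitted.
import numpy as np

def srvf(curve):
    """Square-root velocity function q(t) = c'(t) / sqrt(|c'(t)|) of a discretized curve."""
    dt = 1.0 / (len(curve) - 1)                      # assume uniform sampling on [0, 1]
    vel = np.gradient(curve, dt, axis=0)             # finite-difference velocity c'(t)
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    return vel / np.sqrt(np.maximum(speed, 1e-12))   # guard against zero speed

def edge_distance(curve_a, curve_b):
    """Simplified elastic distance: L2 norm between SRVFs, no reparameterization search."""
    qa, qb = srvf(curve_a), srvf(curve_b)
    dt = 1.0 / (len(qa) - 1)
    return np.sqrt(np.sum((qa - qb) ** 2) * dt)

def naive_graph_shape_distance(edges_a, edges_b):
    """Sum of edge distances assuming a fixed one-to-one matching between edges."""
    return sum(edge_distance(ea, eb) for ea, eb in zip(edges_a, edges_b))

# Toy example: two graphs, each given as a list of two edges sampled as 3D curves.
t = np.linspace(0.0, 1.0, 50)[:, None]
graph_1 = [np.hstack([t, t ** 2, 0 * t]), np.hstack([t, -t, 0 * t])]
graph_2 = [np.hstack([t, np.sin(np.pi * t), 0 * t]), np.hstack([t, -t, 0.1 * t])]
print(naive_graph_shape_distance(graph_1, graph_2))
```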
Related papers
- Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z) - Geometry of the Space of Partitioned Networks: A Unified Theoretical and Computational Framework [3.69102525133732]
"Space of networks" has a complex structure that cannot be adequately described using conventional statistical tools.
We introduce a measure-theoretic formalism for modeling generalized network structures such as graphs, hypergraphs, or graphs whose nodes come with a partition into categorical classes.
We show that the resulting metric space is an Alexandrov space of non-negative curvature, and leverage this structure to define gradients for certain functionals commonly arising in geometric data analysis tasks.
arXiv Detail & Related papers (2024-09-10T07:58:37Z) - A Survey of Geometric Graph Neural Networks: Data Structures, Models and
Applications [67.33002207179923]
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
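As a hedged illustration of the geometric message-passing viewpoint that these two surveys describe, the sketch below updates node features of a 3D point set using only pairwise distances, so the update is invariant to rotations and translations of the coordinates. It is not any specific architecture from these papers; the function name, feature sizes, and weight shapes are assumptions made for the example.

```python
# Minimal sketch of E(3)-invariant message passing: messages depend on node features and
# the pairwise distance only, so rotating or translating the coordinates leaves the
# updated features unchanged. All names and dimensions are illustrative assumptions.
import numpy as np

def invariant_message_passing(h, x, edges, w_msg, w_upd):
    """One round of distance-based message passing.

    h: (N, F) node features, x: (N, 3) coordinates, edges: list of (i, j) pairs,
    w_msg: (2F + 1, F) message weights, w_upd: (2F, F) update weights.
    """
    messages = np.zeros_like(h)
    for i, j in edges:
        dist = np.linalg.norm(x[i] - x[j], keepdims=True)          # invariant edge feature
        m_in = np.concatenate([h[i], h[j], dist])                   # sender, receiver, distance
        messages[i] += np.tanh(m_in @ w_msg)                        # aggregate by summation
    return np.tanh(np.concatenate([h, messages], axis=1) @ w_upd)   # node update

# Toy usage: 3 atoms, 4-dimensional features, two undirected bonds (both directions listed).
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4)); x = rng.normal(size=(3, 3))
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
h_new = invariant_message_passing(h, x, edges, rng.normal(size=(9, 4)), rng.normal(size=(8, 4)))
print(h_new.shape)  # (3, 4)
```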
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Topological Parallax: A Geometric Specification for Deep Perception
Models [0.778001492222129]
We introduce topological parallax as a theoretical and computational tool that compares a trained model to a reference dataset.
Our examples show that this geometric similarity between dataset and model is essential to trustworthy interpolation and perturbation.
This new concept will add value to the current debate regarding the unclear relationship between overfitting and generalization in applications of deep learning.
arXiv Detail & Related papers (2023-06-20T18:45:24Z) - Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
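The graph mean here is defined through embeddings in the space of smooth graph-signal distributions, whose natural metric is the Bures-Wasserstein distance between covariance matrices. The sketch below computes only that pairwise distance for two small positive semi-definite matrices (pseudo-inverses of toy graph Laplacians); it is a minimal sketch under those assumptions, not the paper's barycenter algorithm.

```python
# Minimal sketch of the Bures-Wasserstein distance between two symmetric PSD matrices:
# d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}).
# The Laplacian pseudo-inverse "covariances" below are an illustrative assumption.
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(a, b):
    """Bures-Wasserstein distance between PSD matrices a and b."""
    root_a = sqrtm(a)
    cross = sqrtm(root_a @ b @ root_a)
    sq = np.trace(a) + np.trace(b) - 2.0 * np.real(np.trace(cross))
    return np.sqrt(max(sq, 0.0))          # clip tiny negative values from numerical error

# Toy usage: covariances of smooth signals on two small graphs (Laplacian pseudo-inverses).
L1 = np.array([[1., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])    # path-graph Laplacian
L2 = np.array([[2., -1., -1.], [-1., 2., -1.], [-1., -1., 2.]])  # triangle-graph Laplacian
print(bures_wasserstein(np.linalg.pinv(L1), np.linalg.pinv(L2)))
```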
arXiv Detail & Related papers (2023-05-31T11:04:53Z) - From axioms over graphs to vectors, and back again: evaluating the
properties of graph-based ontology embeddings [78.217418197549]
One approach to generating such embeddings is to introduce a set of nodes and edges representing named entities and the structure of logical axioms.
Methods that embed ontologies into graphs (graph projections) have different properties related to the type of axioms they utilize.
arXiv Detail & Related papers (2023-03-29T08:21:49Z) - Towards a mathematical understanding of learning from few examples with
nonlinear feature maps [68.8204255655161]
We consider the problem of data classification where the training set consists of just a few data points.
We reveal key relationships between the geometry of an AI model's feature space, the structure of the underlying data distributions, and the model's generalisation capabilities.
arXiv Detail & Related papers (2022-11-07T14:52:58Z) - Geometric and Topological Inference for Deep Representations of Complex
Networks [13.173307471333619]
We present a class of statistics that emphasize the topology as well as the geometry of representations.
We evaluate these statistics in terms of the sensitivity and specificity that they afford when used for model selection.
These new methods enable brain and computer scientists to visualize the dynamic representational transformations learned by brains and models.
arXiv Detail & Related papers (2022-03-10T17:14:14Z) - Hermitian Symmetric Spaces for Graph Embeddings [0.0]
We learn continuous representations of graphs in spaces of symmetric matrices over the complex numbers.
These spaces offer a rich geometry that simultaneously admits hyperbolic and Euclidean subspaces.
The proposed models are able to automatically adapt to very dissimilar arrangements without any a priori estimates of graph features.
arXiv Detail & Related papers (2021-05-11T18:14:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.