Algebraic Topological Networks via the Persistent Local Homology Sheaf
- URL: http://arxiv.org/abs/2311.10156v1
- Date: Thu, 16 Nov 2023 19:24:20 GMT
- Title: Algebraic Topological Networks via the Persistent Local Homology Sheaf
- Authors: Gabriele Cesa, Arash Behboodi
- Abstract summary: We introduce a novel approach to enhance graph convolution and attention modules by incorporating local topological properties of the data.
We consider the framework of sheaf neural networks, which has been previously leveraged to incorporate additional structure into graph neural networks' features.
- Score: 15.17547132363788
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we introduce a novel approach based on algebraic topology to
enhance graph convolution and attention modules by incorporating local
topological properties of the data. To do so, we consider the framework of
sheaf neural networks, which has been previously leveraged to incorporate
additional structure into graph neural networks' features and construct more
expressive, non-isotropic messages. Specifically, given an input simplicial
complex (e.g. generated by the cliques of a graph or the neighbors in a point
cloud), we construct its local homology sheaf, which assigns to each node the
vector space of its local homology. The intermediate features of our networks
live in these vector spaces and we leverage the associated sheaf Laplacian to
construct more complex linear messages between them. Moreover, we extend this
approach by considering the persistent version of local homology associated
with a weighted simplicial complex (e.g., built from pairwise distances of
node embeddings). This i) solves the problem of the lack of a natural choice
of basis for the local homology vector spaces and ii) makes the sheaf itself
differentiable, which enables our models to directly optimize the topology of
their intermediate features.
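For intuition, here is a minimal NumPy sketch of the sheaf-Laplacian message passing described above. The stalk dimensions, edge stalk dimension, and restriction maps below are arbitrary random placeholders (in the paper they are derived from the persistent local homology of the input simplicial complex), so the sketch only illustrates the generic sheaf-diffusion mechanics, not the paper's actual construction.

    import numpy as np

    # Toy graph: 3 nodes, 2 edges. In the paper, each node's stalk is its local
    # homology vector space; here the stalk dimensions are arbitrary placeholders.
    edges = [(0, 1), (1, 2)]
    stalk_dim = {0: 2, 1: 3, 2: 2}
    offsets = np.cumsum([0] + [stalk_dim[v] for v in range(3)])
    total = int(offsets[-1])

    rng = np.random.default_rng(0)
    # Restriction maps F_{v -> e} (one per incident node/edge pair), mapping a
    # node stalk into a 4-dimensional edge stalk; random placeholders here.
    F = {(e, v): rng.standard_normal((4, stalk_dim[v])) for e in edges for v in e}

    # Assemble the sheaf Laplacian L_F = delta^T delta as a dense block matrix.
    L = np.zeros((total, total))
    for e in edges:
        u, v = e
        Fu, Fv = F[(e, u)], F[(e, v)]
        su = slice(offsets[u], offsets[u + 1])
        sv = slice(offsets[v], offsets[v + 1])
        L[su, su] += Fu.T @ Fu
        L[sv, sv] += Fv.T @ Fv
        L[su, sv] -= Fu.T @ Fv
        L[sv, su] -= Fv.T @ Fu

    # One sheaf-diffusion step on a stacked feature vector (a single channel);
    # the features of node v live in rows offsets[v]:offsets[v+1].
    x = rng.standard_normal(total)
    x_next = x - 0.1 * (L @ x)

In a learned layer, this update would additionally mix feature channels with trainable weights and a nonlinearity; the paper's contribution lies in how the stalks and restriction maps themselves are obtained from (persistent) local homology.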
Related papers
- SpaceMesh: A Continuous Representation for Learning Manifold Surface Meshes [61.110517195874074]
We present a scheme to directly generate manifold, polygonal meshes of complex connectivity as the output of a neural network.
Our key innovation is to define a continuous latent connectivity space at each mesh vertex, which implies the discrete mesh.
In applications, this approach not only yields high-quality outputs from generative models, but also enables directly learning challenging geometry processing tasks such as mesh repair.
arXiv Detail & Related papers (2024-09-30T17:59:03Z)
- A rank decomposition for the topological classification of neural representations [0.0]
In this work, we leverage the fact that neural networks are equivalent to continuous piecewise-affine maps.
We study the homology groups of the quotient of a manifold $\mathcal{M}$ by a subset $A$, assuming some minimal properties on these spaces.
We show that in randomly narrow networks, there will be regions in which the (co)homology groups of a data manifold can change.
arXiv Detail & Related papers (2024-04-30T17:01:20Z)
- Topological Neural Networks: Mitigating the Bottlenecks of Graph Neural Networks via Higher-Order Interactions [1.994307489466967]
This work starts with a theoretical framework to reveal the impact of a network's width, depth, and graph topology on the over-squashing phenomenon in message-passing neural networks.
The work then turns to higher-order interactions and multi-relational inductive biases via Topological Neural Networks.
Inspired by Graph Attention Networks, two topological attention networks are proposed: Simplicial and Cell Attention Networks.
arXiv Detail & Related papers (2024-02-10T08:26:06Z)
- Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
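As a toy one-dimensional illustration of this kind of result, the hand-built ReLU network below (two hidden layers) approximates the indicator of an interval [a, b]; the paper's actual constructions and width bounds concern general compact sets and simplicial complexes in higher dimensions, so this only conveys the flavor.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    # Indicator of [a, b] on the real line, built from ReLU ramps of width eps.
    a, b, eps = 0.0, 1.0, 0.01

    def indicator_net(x):
        # Hidden layer 1: four ReLU units (shifted ramps).
        h = np.stack([relu(x - (a - eps)), relu(x - a),
                      relu(x - b), relu(x - (b + eps))]) / eps
        # Linear combination gives a trapezoid: ~1 on [a, b], ~0 outside a
        # transition band of width eps.
        t = h[0] - h[1] - h[2] + h[3]
        # Hidden layer 2: two ReLU units clamp the output to [0, 1].
        return relu(t) - relu(t - 1.0)

    xs = np.array([-0.5, 0.25, 0.75, 1.5])
    print(np.round(indicator_net(xs), 2))  # -> [0. 1. 1. 0.]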
arXiv Detail & Related papers (2023-05-25T14:17:15Z)
- Graph Spectral Embedding using the Geodesic Betweeness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation.
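For a rough sense of the mechanism, the snippet below solves a Sylvester equation with SciPy and uses the rows of the solution as node coordinates; the matrices involved are simple placeholder choices, not the actual GSE construction.

    import numpy as np
    from scipy.linalg import solve_sylvester

    # Toy 4-node graph and its Laplacian.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A

    # Solve L X + X M = C; M and C are placeholders standing in for the
    # structure- and proximity-encoding matrices used by GSE.
    M = np.eye(4)
    C = A.copy()
    X = solve_sylvester(L, M, C)  # row i of X is used as the embedding of node i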
arXiv Detail & Related papers (2022-05-07T04:11:23Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification, but standard convolutions struggle to extract features for objects with irregular spatial distributions.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Field Convolutions for Surface CNNs [19.897276088740995]
We present a novel surface convolution operator acting on vector fields based on a simple observation.
This formulation combines intrinsic spatial convolution with parallel transport in a scattering operation.
We achieve state-of-the-art results on standard benchmarks in fundamental geometry processing tasks.
arXiv Detail & Related papers (2021-04-08T17:11:14Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
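A minimal sketch of the first idea (propagating one-hot node identifiers alongside the features), with plain sum aggregation standing in for the paper's learned, permutation-equivariant message and update functions:

    import numpy as np

    # Toy 4-node path graph.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    n = A.shape[0]

    H = np.random.default_rng(0).standard_normal((n, 3))  # node features
    U = np.eye(n)  # one-hot identifier of each node, propagated alongside H

    # One round of message passing with sum aggregation (placeholder for learned
    # message/update functions). After k rounds, row v of U counts length-k walks
    # from v to every node, i.e. a local context matrix built around each node.
    U = A @ U
    H = A @ H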
arXiv Detail & Related papers (2020-06-26T17:15:16Z)