Manifold GCN: Diffusion-based Convolutional Neural Network for Manifold-valued Graphs
- URL: http://arxiv.org/abs/2401.14381v2
- Date: Tue, 25 Feb 2025 12:31:31 GMT
- Title: Manifold GCN: Diffusion-based Convolutional Neural Network for Manifold-valued Graphs
- Authors: Martin Hanik, Gabriele Steidl, Christoph von Tycowicz
- Abstract summary: We propose two graph neural network layers for graphs with features in a Riemannian manifold. We show that both layers are equivariant under node permutations and the feature manifold's isometries. While they apply to a much broader class of problems, they perform as well as or better than task-specific state-of-the-art networks.
- Score: 2.389598109913754
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose two graph neural network layers for graphs with features in a Riemannian manifold. First, based on a manifold-valued graph diffusion equation, we construct a diffusion layer that can be applied to an arbitrary number of nodes and graph connectivity patterns. Second, we model a tangent multilayer perceptron by transferring ideas from the vector neuron framework to our general setting. Both layers are equivariant under node permutations and the feature manifold's isometries. These properties have led to a beneficial inductive bias in many deep-learning tasks. Numerical examples on synthetic data and an Alzheimer's classification application on triangle meshes of the right hippocampus demonstrate the usefulness of our new layers: While they apply to a much broader class of problems, they perform as well as or better than task-specific state-of-the-art networks.
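To make the diffusion layer concrete, here is a minimal NumPy sketch of one explicit-Euler step of graph diffusion for sphere-valued node features, x_i <- Exp_{x_i}(tau * mean_j Log_{x_i}(x_j)). The function names and the explicit-Euler discretization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: tangent vector at p pointing toward q."""
    v = q - np.dot(p, q) * p                    # project q to tangent space at p
    nv = np.linalg.norm(v)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return np.zeros_like(p) if nv < 1e-12 else (theta / nv) * v

def sphere_exp(p, v):
    """Exp map on the unit sphere: follow tangent vector v from p."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * (v / nv)

def diffusion_step(X, adj, tau=0.5):
    """One diffusion step: x_i <- Exp_{x_i}(tau * mean_j Log_{x_i}(x_j))."""
    X_new = np.empty_like(X)
    for i, nbrs in enumerate(adj):
        v = (np.mean([sphere_log(X[i], X[j]) for j in nbrs], axis=0)
             if nbrs else np.zeros_like(X[i]))
        X_new[i] = sphere_exp(X[i], tau * v)
    return X_new

# toy path graph 0-1-2 with features on S^2
X = np.eye(3)                                   # three points on the sphere
adj = [[1], [0, 2], [1]]                        # neighbor lists
print(diffusion_step(X, adj))
```

Because the update is expressed purely through exp/log maps and neighborhood means, it commutes with node permutations and sphere rotations, matching the equivariance properties stated in the abstract.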
Related papers
- Detecting Homeomorphic 3-manifolds via Graph Neural Networks [0.0]
We study the homeomorphism problem for a class of graph-manifolds using Graph Neural Network techniques.
We train and benchmark a variety of network architectures within a supervised learning setting by testing different combinations of two convolutional layers.
arXiv Detail & Related papers (2024-09-01T12:58:09Z) - Transfer Entropy in Graph Convolutional Neural Networks [0.0]
Graph Convolutional Networks (GCN) are Graph Neural Networks where the convolutions are applied over a graph.
In this study, we address two important challenges related to GCNs; the first is oversmoothing, the degradation of the discriminative capacity of nodes as a result of repeated aggregations.
We propose a new strategy for addressing these challenges in GCNs based on Transfer Entropy (TE), which measures the amount of directed transfer of information between two time-varying nodes.
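For reference, the following is a minimal plug-in estimator of TE between two binary sequences with history length 1; the function name and the simple counting scheme are illustrative assumptions, not the paper's estimator.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} for binary series, history length 1:
    TE = sum_{y1,y0,x0} p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs = Counter(zip(y[1:], y[:-1]))
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        cond_joint = c / sum(v for k, v in triples.items() if k[1:] == (y0, x0))
        cond_marg = pairs[(y1, y0)] / sum(v for k, v in pairs.items() if k[1] == y0)
        te += (c / n) * np.log2(cond_joint / cond_marg)
    return te

x = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0]
y = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1]       # y copies x with lag 1
print(transfer_entropy(x, y))                   # positive: info flows x -> y
```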
arXiv Detail & Related papers (2024-06-08T20:09:17Z) - Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
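The reconstruction step the summary mentions reduces to linear algebra once eigenpairs are available; here is a hedged NumPy sketch for the unnormalized Laplacian (the denoising model that produces the eigenpairs is omitted, and the thresholding rule is an assumption):

```python
import numpy as np

def laplacian_to_adjacency(evecs, evals):
    """Rebuild L = V diag(lambda) V^T, then A = D - L: the unnormalized
    Laplacian stores the node degrees on its diagonal."""
    L = evecs @ np.diag(evals) @ evecs.T
    A = np.diag(np.diag(L)) - L
    return (A > 0.5).astype(float)              # threshold back to {0, 1}

# round trip on a toy path graph 0-1-2
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
w, V = np.linalg.eigh(np.diag(A.sum(1)) - A)    # eigenpairs of L = D - A
assert (laplacian_to_adjacency(V, w) == A).all()
```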
arXiv Detail & Related papers (2024-02-29T09:26:46Z) - Learning to Approximate Adaptive Kernel Convolution on Graphs [4.434835769977399]
We propose a diffusion learning framework, where the range of feature aggregation is controlled by the scale of a diffusion kernel.
Our model is tested on various standard datasets for node-wise classification, achieving state-of-the-art performance.
It is also validated on real-world brain network data for graph classification, demonstrating its practicality for Alzheimer's classification.
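A common way to realize such a scale-controlled diffusion kernel is the graph heat kernel exp(-sL). The sketch below is an assumption about the kernel family (the paper approximates adaptive kernels rather than computing eigendecompositions) and shows how the scale s widens the aggregation range:

```python
import numpy as np

def heat_kernel_conv(A, X, scale):
    """Aggregate features with the graph heat kernel exp(-s L): small s mixes
    only nearby nodes, large s diffuses information across the graph."""
    L = np.diag(A.sum(1)) - A                   # unnormalized graph Laplacian
    w, V = np.linalg.eigh(L)                    # L = V diag(w) V^T
    K = V @ np.diag(np.exp(-scale * w)) @ V.T   # heat kernel at scale s
    return K @ X

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)             # path graph on 4 nodes
X = np.eye(4)                                   # one-hot node features
print(np.round(heat_kernel_conv(A, X, 0.1), 3)) # nearly local aggregation
print(np.round(heat_kernel_conv(A, X, 5.0), 3)) # long-range aggregation
```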
arXiv Detail & Related papers (2024-01-22T10:57:11Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
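For reference, the plain (non-kernelized) Gumbel-Softmax operator looks as follows; NodeFormer's kernelized version, which avoids materializing all-pair scores, is not reproduced here, and the function name is illustrative:

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Differentiable relaxation of categorical sampling: perturb logits
    with Gumbel noise, then apply a temperature-controlled softmax."""
    rng = rng or np.random.default_rng(0)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    z = (logits + g) / tau
    z -= z.max(axis=-1, keepdims=True)          # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# stochastic soft "edge weights" from one node to four candidates
scores = np.array([2.0, 0.5, -1.0, 0.1])
print(gumbel_softmax(scores))                   # sums to 1, peaked on index 0
```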
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
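The filters in question are polynomial spectral filters h(L)x = sum_k h_k L^k x. This sketch builds a Gaussian-kernel graph from points sampled on a circle (a simple 1-D manifold) and applies such a filter; the kernel bandwidth and coefficients are illustrative assumptions:

```python
import numpy as np

def polynomial_graph_filter(L, x, coeffs):
    """Apply the spectral convolutional filter h(L) x = sum_k h_k L^k x."""
    out, p = np.zeros_like(x), x.copy()
    for h_k in coeffs:
        out += h_k * p
        p = L @ p                               # next Laplacian power
    return out

# graph built from points sampled on the circle (a simple 1-D manifold)
rng = np.random.default_rng(1)
theta = np.sort(rng.uniform(0, 2 * np.pi, 64))
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)
d2 = ((pts[:, None] - pts[None]) ** 2).sum(-1)  # pairwise squared distances
W = np.exp(-d2 / 0.1) - np.eye(64)              # Gaussian-kernel weights
L = np.diag(W.sum(1)) - W
y = polynomial_graph_filter(L, np.sin(theta), [1.0, -0.5, 0.05])
```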
arXiv Detail & Related papers (2023-05-29T08:27:17Z) - Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In sum, we treat the manifold model as the limit of large graphs, construct MNNs on it, and recover graph neural networks by discretizing the MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z) - Effects of Graph Convolutions in Deep Networks [8.937905773981702]
We present a rigorous theoretical understanding of the effects of graph convolutions in multi-layer networks.
We show that a single graph convolution expands the regime of the distance between the means where multi-layer networks can classify the data.
We provide both theoretical and empirical insights into the performance of graph convolutions placed in different combinations among the layers of a network.
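A toy experiment captures the intuition: one averaging convolution X' = D^{-1}AX shrinks within-class spread faster than the gap between class means when the graph is denser within classes. The SBM-style graph and the parameters below are assumptions for illustration, not the paper's exact setting:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu = 200, 0.5                                    # small mean separation
y = np.repeat([0, 1], n // 2)
X = rng.normal(0, 1, (n, 1)) + np.where(y, mu, -mu)[:, None]

# graph denser within classes than across (an assumed SBM-style setup)
P = np.where(y[:, None] == y[None], 0.2, 0.02)
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T + np.eye(n)          # symmetric, self-loops

Xc = (A / A.sum(1, keepdims=True)) @ X              # one convolution D^-1 A X
for name, F in [("raw", X), ("convolved", Xc)]:
    gap = abs(F[y == 1].mean() - F[y == 0].mean())
    spread = np.mean([F[y == c].std() for c in (0, 1)])
    print(f"{name}: class-mean gap {gap:.2f}, within-class spread {spread:.2f}")
```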
arXiv Detail & Related papers (2022-04-20T08:24:43Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning, however, remains limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
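The key property exploited is that hyperbolic distances grow rapidly toward the boundary. A minimal check with the Poincaré-ball distance, one of several models hyperbolic GNNs use, illustrates this:

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance in the Poincare ball model of hyperbolic space."""
    delta = np.dot(u - v, u - v)
    denom = (1 - np.dot(u, u)) * (1 - np.dot(v, v))
    return np.arccosh(1 + 2 * delta / denom)

# distances blow up near the boundary, which is what lets tree-like
# graphs embed with low distortion
print(poincare_distance(np.zeros(2), np.array([0.5, 0.0])))            # ~1.10
print(poincare_distance(np.array([0.9, 0.0]), np.array([-0.9, 0.0])))  # ~5.9
```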
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - Geometric Graph Representation Learning via Maximizing Rate Reduction [73.6044873825311]
Learning node representations benefits various downstream tasks in graph analysis such as community detection and node classification.
We propose Geometric Graph Representation Learning (G2R) to learn node representations in an unsupervised manner.
G2R maps nodes in distinct groups into different subspaces, while each subspace is compact and different subspaces are dispersed.
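G2R's objective builds on the coding-rate function from the maximal coding rate reduction (MCR²) principle; the sketch below follows the MCR² formulation R(Z) = ½ logdet(I + d/(nε²) ZᵀZ) as an assumption about the objective, so see the paper for the exact form:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 logdet(I + d/(n eps^2) Z^T Z) for row-wise features Z."""
    n, d = Z.shape
    M = np.eye(d) + (d / (n * eps ** 2)) * Z.T @ Z
    return 0.5 * np.linalg.slogdet(M)[1]

def rate_reduction(Z, labels):
    """Whole-set rate minus label-weighted per-group rates; large values mean
    compact groups that are mutually dispersed."""
    per_group = sum((labels == c).mean() * coding_rate(Z[labels == c])
                    for c in np.unique(labels))
    return coding_rate(Z) - per_group

rng = np.random.default_rng(0)
Z = np.concatenate([rng.normal(1, 0.1, (50, 8)), rng.normal(-1, 0.1, (50, 8))])
Z /= np.linalg.norm(Z, axis=1, keepdims=True)   # unit-norm rows, as in MCR^2
print(rate_reduction(Z, np.repeat([0, 1], 50))) # positive for separated groups
```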
arXiv Detail & Related papers (2022-02-13T07:46:24Z) - Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
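A standard non-learnable instance of such PE is Laplacian eigenvector encodings concatenated to the input features; the sketch below shows this baseline (an assumption here, since the paper's contribution is making PE learnable):

```python
import numpy as np

def laplacian_pe(A, k):
    """Positional encodings from the k lowest nontrivial eigenvectors of the
    normalized Laplacian I - D^{-1/2} A D^{-1/2}."""
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(1)))
    L = np.eye(len(A)) - d_inv_sqrt @ A @ d_inv_sqrt
    _, V = np.linalg.eigh(L)                    # eigenvectors, ascending order
    return V[:, 1:k + 1]                        # drop the trivial eigenvector

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
X = np.ones((4, 3))                             # dummy node features
X_in = np.concatenate([X, laplacian_pe(A, 2)], axis=1)  # inject PE at input
```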
arXiv Detail & Related papers (2021-10-15T05:59:15Z) - Multilayer Graph Clustering with Optimized Node Embedding [70.1053472751897]
Multilayer graph clustering aims at dividing the graph nodes into categories or communities.
We propose a clustering-friendly embedding of the layers of a given multilayer graph.
Experiments show that our method leads to a significant improvement.
arXiv Detail & Related papers (2021-03-30T17:36:40Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use a contrastive loss as a data-fidelity term in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to well-separated clusters w.r.t. the ground-truth communities.
We further learn a clustering algorithm for solving the resulting clustering problem.
arXiv Detail & Related papers (2020-10-29T09:58:02Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
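The building block is easy to state: run any graph convolution and add an affine (fully connected) branch of the input. A minimal NumPy sketch, with a mean-aggregation convolution standing in for "any graph convolution operator":

```python
import numpy as np

def graph_conv(A, X, W):
    """Mean-aggregation graph convolution with self-loops."""
    A_hat = A + np.eye(len(A))
    return (A_hat / A_hat.sum(1, keepdims=True)) @ X @ W

def affine_skip_block(A, X, W_conv, W_skip, b_skip):
    """Graph convolution plus an affine (fully connected) skip branch."""
    return graph_conv(A, X, W_conv) + X @ W_skip + b_skip

rng = np.random.default_rng(0)
A = np.array([[0, 1], [1, 0]], float)
X = rng.normal(size=(2, 4))
out = affine_skip_block(A, X,
                        rng.normal(size=(4, 8)),    # convolution weights
                        rng.normal(size=(4, 8)),    # skip-branch weights
                        np.zeros(8))                # skip-branch bias
print(out.shape)                                    # (2, 8)
```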
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.