Utility of Equivariant Message Passing in Cortical Mesh Segmentation
- URL: http://arxiv.org/abs/2206.03164v1
- Date: Tue, 7 Jun 2022 10:24:18 GMT
- Title: Utility of Equivariant Message Passing in Cortical Mesh Segmentation
- Authors: Dániel Unyi, Ferdinando Insalata, Petar Veličković, Bálint Gyires-Tóth
- Abstract summary: We investigate the utility of E(n)-equivariant graph neural networks (EGNNs) compared with plain graph neural networks (GNNs).
Our evaluation shows that GNNs outperform EGNNs on aligned meshes, due to their ability to leverage the presence of a global coordinate system.
- Score: 25.488181126364186
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The automated segmentation of cortical areas has been a long-standing
challenge in medical image analysis. The complex geometry of the cortex is
commonly represented as a polygon mesh, whose segmentation can be addressed by
graph-based learning methods. When cortical meshes are misaligned across
subjects, current methods produce significantly worse segmentation results,
limiting their ability to handle multi-domain data. In this paper, we
investigate the utility of E(n)-equivariant graph neural networks (EGNNs),
comparing their performance against plain graph neural networks (GNNs). Our
evaluation shows that GNNs outperform EGNNs on aligned meshes, due to their
ability to leverage the presence of a global coordinate system. On misaligned
meshes, the performance of plain GNNs drops considerably, while E(n)-equivariant
message passing maintains the same segmentation results. The best results can
also be obtained by using plain GNNs on realigned data (co-registered meshes in
a global coordinate system).
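To make the distinction concrete, below is a minimal sketch of one E(n)-equivariant message-passing layer in the style of Satorras et al. (2021), the EGNN family evaluated here. The hidden widths, MLP depths, and residual updates are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class EGNNLayer(nn.Module):
    """One E(n)-equivariant message-passing layer (after Satorras et al., 2021).

    Feature updates see only invariants (h_i, h_j, squared distances), while
    coordinates move along difference vectors, so rotating or translating the
    input mesh rotates or translates the output coordinates accordingly.
    """

    def __init__(self, h_dim: int, m_dim: int = 32):
        super().__init__()
        self.phi_e = nn.Sequential(nn.Linear(2 * h_dim + 1, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, m_dim), nn.SiLU())
        self.phi_x = nn.Sequential(nn.Linear(m_dim, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, 1))
        self.phi_h = nn.Sequential(nn.Linear(h_dim + m_dim, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, h_dim))

    def forward(self, h, x, edge_index):
        src, dst = edge_index  # messages flow src (j) -> dst (i)
        d2 = ((x[dst] - x[src]) ** 2).sum(-1, keepdim=True)  # E(n)-invariant
        m = self.phi_e(torch.cat([h[dst], h[src], d2], dim=-1))

        # Equivariant coordinate update: move x_i along (x_i - x_j) directions.
        w = self.phi_x(m)
        dx = torch.zeros_like(x).index_add_(0, dst, (x[dst] - x[src]) * w)
        deg = torch.zeros(x.size(0), 1, device=x.device).index_add_(
            0, dst, torch.ones(dst.size(0), 1, device=x.device))
        x = x + dx / deg.clamp(min=1)

        # Invariant feature update from aggregated messages.
        agg = torch.zeros(h.size(0), m.size(-1), device=h.device).index_add_(0, dst, m)
        h = h + self.phi_h(torch.cat([h, agg], dim=-1))
        return h, x
```

A plain GNN, in contrast, feeds raw vertex coordinates into its feature MLPs, which lets it exploit a shared global coordinate system on aligned meshes but leaves it sensitive to rotation and translation, matching the trade-off reported above.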
Related papers
- Diss-l-ECT: Dissecting Graph Data with local Euler Characteristic Transforms [13.608942872770855]
We introduce the Local Euler Characteristic Transform ($\ell$-ECT) to enhance expressivity and interpretability in graph representation learning.
Unlike traditional Graph Neural Networks (GNNs), which may lose critical local details through aggregation, the $\ell$-ECT provides a lossless representation of local neighborhoods.
Our method outperforms standard GNNs on a variety of node classification tasks, particularly in graphs with high heterophily.
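As a rough reminder of the underlying construction (the exact local variant in the paper may differ), the Euler Characteristic Transform of a complex $K$ embedded in $\mathbb{R}^d$ records, for each direction $\nu$ and height $t$, the Euler characteristic of the corresponding sublevel complex; the $\ell$-ECT applies this to a node's local neighborhood rather than to the whole graph:

```latex
\mathrm{ECT}(K)\colon S^{d-1}\times\mathbb{R}\to\mathbb{Z},\qquad
\mathrm{ECT}(K)(\nu,t)=\chi\bigl(\{\sigma\in K : \langle v,\nu\rangle\le t \text{ for every vertex } v \text{ of } \sigma\}\bigr).
```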
arXiv Detail & Related papers (2024-10-03T16:02:02Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN).
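The role of random projection here, as far as the summary goes, is to keep the pre-computed, relation-wise propagated features at a fixed width. A toy NumPy illustration of that idea (not the paper's pipeline; the adjacencies and dimensions are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_project(feats: np.ndarray, out_dim: int) -> np.ndarray:
    """Johnson-Lindenstrauss-style random projection to a fixed width."""
    proj = rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(feats.shape[1], out_dim))
    return feats @ proj

# One-time message passing per relation, then project every result to the
# same small width so the outputs stack into a regular-shaped tensor.
n, d, out_dim = 100, 64, 16
x = rng.normal(size=(n, d))
adj_by_relation = [np.eye(n)[rng.permutation(n)] for _ in range(3)]  # toy graphs
stacked = np.stack([random_project(a @ x, out_dim) for a in adj_by_relation], axis=1)
print(stacked.shape)  # (100, 3, 16): nodes x relations x features
```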
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Revisiting Heterophily For Graph Neural Networks [42.41238892727136]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structures based on the relational inductive bias (homophily assumption).
Recent work has identified a non-trivial set of datasets where their performance compared to NNs is not satisfactory.
arXiv Detail & Related papers (2022-10-14T08:00:26Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose Neighborhood Information Content (NIC), a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z)
- Enhance Information Propagation for Graph Neural Network by Heterogeneous Aggregations [7.3136594018091134]
Graph neural networks are emerging as a continuation of deep learning's success on graph-structured data.
We propose HAG-Net, which enhances information propagation among GNN layers by combining heterogeneous aggregations.
We empirically validate the effectiveness of HAG-Net on a number of graph classification benchmarks.
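A minimal sketch of what combining heterogeneous aggregations can look like inside one layer (sum, mean, and max fused by a linear map); HAG-Net's actual combination scheme may differ:

```python
import torch
import torch.nn as nn

class HeteroAggLayer(nn.Module):
    """Fuse several neighborhood aggregators (sum, mean, max) in one GNN layer.
    A generic illustration of heterogeneous aggregation, not HAG-Net's design."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.mix = nn.Linear(3 * in_dim, out_dim)  # fuse the three aggregates

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: dense {0, 1} adjacency (toy setting; use sparse ops in practice)
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        s = adj @ x                                   # sum aggregation
        m = s / deg                                   # mean aggregation
        neg_inf = torch.full((1,), float("-inf"))
        masked = torch.where(adj.unsqueeze(-1).bool(), x.unsqueeze(0), neg_inf)
        mx = masked.max(dim=1).values                 # max aggregation
        mx = torch.where(torch.isinf(mx), torch.zeros_like(mx), mx)  # isolated nodes
        return torch.relu(self.mix(torch.cat([s, m, mx], dim=-1)))
```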
arXiv Detail & Related papers (2021-02-08T08:57:56Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
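Schematically, the pruning idea can be written as an edge scorer plus a sparsity penalty; PTDNet's actual parameterization is more involved (e.g., relaxed discrete sampling), so treat this as a sketch of the objective's shape only:

```python
import torch
import torch.nn as nn

class EdgePruner(nn.Module):
    """Score each edge from its endpoint features and penalize the expected
    number of kept edges. A simplified sketch, not PTDNet's exact model."""

    def __init__(self, dim: int):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(),
                                    nn.Linear(dim, 1))

    def forward(self, x, edge_index):
        src, dst = edge_index
        keep_prob = torch.sigmoid(self.scorer(torch.cat([x[src], x[dst]], -1)))
        sparsity_penalty = keep_prob.sum()  # ~ expected edge count after pruning
        return keep_prob.squeeze(-1), sparsity_penalty

# Training would minimize task_loss(gnn(x, edge_index, edge_weight=keep_prob))
# plus lambda * sparsity_penalty, driving task-irrelevant edges toward zero.
```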
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from this unified framework (UGNN), to handle graphs with adaptive smoothness across nodes.
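The core observation can be stated compactly. With $X$ the input features, $\tilde{A}$ a normalized adjacency, and $L = I - \tilde{A}$, aggregation solves a smoothing-regularized denoising problem, and a single gradient step from $F = X$ with step size $b$ recovers GCN-style propagation (notation is generic, not necessarily the paper's):

```latex
\min_{F}\;\|F-X\|_F^2 + c\,\operatorname{tr}\!\left(F^\top L F\right),
\qquad
F \leftarrow X - b\,(2cLX) = (1-2bc)\,X + 2bc\,\tilde{A}X
\;\xrightarrow{\,2bc=1\,}\; \tilde{A}X.
```

Letting the smoothness weight $c$ vary across nodes is what motivates the adaptive-smoothness model ADA-UGNN mentioned above.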
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graph Clustering with Graph Neural Networks [5.305362965553278]
Graph Neural Networks (GNNs) have achieved state-of-the-art results on many graph analysis tasks.
Unsupervised problems on graphs, such as graph clustering, have proved more resistant to advances in GNNs.
We introduce Deep Modularity Networks (DMoN), an unsupervised pooling method inspired by the modularity measure of clustering quality.
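For reference, the soft modularity objective behind DMoN has the standard form below, with $C$ a (softmax) cluster-assignment matrix, $d$ the degree vector, and $2m$ the total edge weight; DMoN additionally regularizes against collapsed assignments, which is omitted here:

```latex
Q=\frac{1}{2m}\operatorname{tr}\!\left(C^\top A\,C-\frac{C^\top d\,d^\top C}{2m}\right),
\qquad C=\operatorname{softmax}\bigl(\mathrm{GNN}(A,X)\bigr)\in\mathbb{R}^{n\times k}.
```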
arXiv Detail & Related papers (2020-06-30T15:30:49Z)
- Non-Local Graph Neural Networks [60.28057802327858]
We propose a simple yet effective non-local aggregation framework with an efficient attention-guided sorting for GNNs.
We perform thorough experiments to analyze disassortative graph datasets and evaluate our non-local GNNs.
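One way to read "attention-guided sorting" is: score nodes with a learned projection, sort by score so that distant but similar nodes become sequence neighbors, and convolve along the sorted sequence. The sketch below is that reading, with the details of the paper's design omitted:

```python
import torch
import torch.nn as nn

class NonLocalAgg(nn.Module):
    """Attention-guided sorting: score nodes, sort by score, then run a 1D
    convolution over the sorted sequence. A simplified sketch of non-local
    aggregation, not the paper's exact architecture."""

    def __init__(self, dim: int, kernel: int = 3):
        super().__init__()
        self.score = nn.Linear(dim, 1)
        self.conv = nn.Conv1d(dim, dim, kernel, padding=kernel // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        order = self.score(x).squeeze(-1).argsort()  # attention-guided order
        seq = x[order].t().unsqueeze(0)              # (1, dim, n) sequence
        out = self.conv(seq).squeeze(0).t()          # aggregate along the sequence
        return out[order.argsort()]                  # restore original node order
```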
arXiv Detail & Related papers (2020-05-29T14:50:27Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear, local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
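In symbols (generic edge-varying-filter notation, following the summary rather than the paper's exact statement), each layer applies shift matrices $\Phi^{(k)}$ that share the graph's sparsity pattern, so every node weighs each neighbor with its own coefficient:

```latex
x^{(k)}=\Phi^{(k)}x^{(k-1)},\quad x^{(0)}=x,\qquad
z=\sum_{k=0}^{K}x^{(k)},\qquad
\Phi^{(k)}_{ij}\neq 0 \;\Rightarrow\; j\in\mathcal{N}(i)\cup\{i\}.
```

Constraining $\Phi^{(k)} = h_k S$ for a shared graph-shift operator $S$ recovers polynomial GCNNs, while computing the nonzero entries of $\Phi^{(k)}$ from attention on node pairs recovers GAT-style layers.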
arXiv Detail & Related papers (2020-01-21T15:51:17Z)