Directional Graph Networks
- URL: http://arxiv.org/abs/2010.02863v4
- Date: Wed, 7 Apr 2021 18:50:16 GMT
- Title: Directional Graph Networks
- Authors: Dominique Beaini, Saro Passaro, Vincent Létourneau, William L.
Hamilton, Gabriele Corso, Pietro Liò
- Abstract summary: We propose the first globally consistent anisotropic kernels for graph neural networks (GNNs).
By defining a vector field in the graph, we develop a method of applying directional derivatives and smoothing by projecting node-specific messages into the field.
We show that the method generalizes CNNs on an $n$-dimensional grid and is provably more discriminative than standard GNNs.
- Score: 17.11861614285746
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The lack of anisotropic kernels in graph neural networks (GNNs) strongly
limits their expressiveness, contributing to well-known issues such as
over-smoothing. To overcome this limitation, we propose the first globally
consistent anisotropic kernels for GNNs, allowing for graph convolutions that
are defined according to topologically-derived directional flows. First, by
defining a vector field in the graph, we develop a method of applying
directional derivatives and smoothing by projecting node-specific messages into
the field. Then, we propose the use of the Laplacian eigenvectors as such a
vector field. We show that the method generalizes CNNs on an $n$-dimensional
grid and is provably more discriminative than standard GNNs with respect to
the Weisfeiler-Lehman 1-WL test. We evaluate our method on different standard
benchmarks and see a relative error reduction of 8% on the CIFAR10 graph
dataset and 11% to 32% on the molecular ZINC dataset, and a relative increase
in precision of 1.6% on the MolPCBA dataset. An important outcome of this work
is that it enables graph networks to embed directions in an unsupervised way,
thus allowing a better representation of the anisotropic features in different
physical or biological problems.
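Since the abstract describes the mechanism concretely (take a Laplacian eigenvector as the vector field, then smooth and differentiate node messages along it), here is a minimal NumPy sketch of that idea. The function name and the exact normalizations are our own illustration, not the authors' released code:

```python
import numpy as np

def directional_aggregators(A, X):
    """Directional smoothing and derivative along the Fiedler vector.

    A: (n, n) symmetric adjacency matrix; X: (n, d) node features.
    """
    L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian
    _, eigvecs = np.linalg.eigh(L)
    phi = eigvecs[:, 1]                       # first nontrivial eigenvector

    # Per-edge gradient of the field: F[i, j] = phi[j] - phi[i] on edges.
    F = A * (phi[None, :] - phi[:, None])

    # Directional smoothing: neighbours weighted by |gradient|, row-normalized.
    W = np.abs(F)
    B_av = W / (W.sum(axis=1, keepdims=True) + 1e-8)

    # Directional derivative: difference of neighbours along the field.
    B_dx = F - np.diag(F.sum(axis=1))

    return B_av @ X, B_dx @ X                 # two anisotropic channels

# Toy usage on a 5-node path graph, where the Fiedler vector is monotone
# along the path, so the "direction" recovers the grid-like left-to-right flow.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
X = np.random.randn(5, 3)
X_av, X_dx = directional_aggregators(A, X)
```

In this sketch each eigenvector yields one smoothing and one derivative channel; stacking channels from several eigenvectors is a natural extension consistent with the abstract's claim of generalizing CNNs on $n$-dimensional grids.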
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
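The fuzzy-edge-direction idea above lends itself to a small sketch: give every undirected edge a continuous parameter saying how strongly it points one way, and split message flow accordingly. The parameterization below (a sigmoid of an antisymmetric score matrix) is an assumed illustration, not the paper's exact formulation:

```python
import numpy as np

def fuzzy_direction_layer(A, X, theta, W_fwd, W_bwd):
    """Hypothetical CoED-style layer: each undirected edge (i, j) gets a
    continuous direction p in (0, 1); p of the message flows i -> j and
    (1 - p) flows j -> i.
    """
    T = theta - theta.T                       # antisymmetric, so P[j,i] = 1 - P[i,j]
    P = 1.0 / (1.0 + np.exp(-T))              # squash scores into (0, 1)
    A_fwd = A * P                             # portion of each edge pointing i -> j
    A_bwd = A * (1.0 - P)                     # portion pointing j -> i
    # Node j receives i -> j messages (rows of A_fwd.T); node i receives j -> i.
    return np.tanh(A_fwd.T @ X @ W_fwd + A_bwd @ X @ W_bwd)

rng = np.random.default_rng(0)
n, d = 6, 4
A = np.triu((rng.random((n, n)) < 0.4).astype(float), 1)
A = A + A.T                                   # undirected toy graph
X = rng.standard_normal((n, d))
theta = rng.standard_normal((n, n))           # learnable in a real model
H = fuzzy_direction_layer(A, X, theta, rng.standard_normal((d, d)),
                          rng.standard_normal((d, d)))
```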
- Enhanced Expressivity in Graph Neural Networks with Lanczos-Based Linear Constraints [7.605749412696919]
Graph Neural Networks (GNNs) excel in handling graph-structured data but often underperform in link prediction tasks.
We present a novel method to enhance the expressivity of GNNs by embedding induced subgraphs into the graph Laplacian matrix's eigenbasis.
Our method achieves 20x and 10x speedups while requiring only 5% and 10% of the data from the PubMed and OGBL-Vessel datasets, respectively.
arXiv Detail & Related papers (2024-08-22T12:22:00Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate neural tangent kernels (NTKs) and alignment in the context of graph neural networks (GNNs).
Our results establish the theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
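A rough illustration of the cross-covariance idea in the entry above: estimate a shift operator from paired input/output node signals and run a polynomial graph filter on it. All names, and the symmetrization step, are assumptions for the sketch:

```python
import numpy as np

def cross_covariance_shift(Xs, Ys):
    """Estimate a shift operator from m paired signals over n nodes.

    Xs, Ys: (m, n) input and output node signals.
    """
    Xc = Xs - Xs.mean(axis=0)
    Yc = Ys - Ys.mean(axis=0)
    C = Xc.T @ Yc / (len(Xs) - 1)             # (n, n) cross-covariance
    return 0.5 * (C + C.T)                    # symmetrize to get a valid shift

def graph_filter(S, x, h):
    """Polynomial graph filter: y = sum_k h[k] * S^k @ x."""
    y, Sk_x = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x
    return y

rng = np.random.default_rng(0)
m, n = 200, 10
Xs = rng.standard_normal((m, n))
Ys = Xs @ rng.standard_normal((n, n)) * 0.1   # toy input/output pairs
S = cross_covariance_shift(Xs, Ys)
y_hat = graph_filter(S, Xs[0], h=[0.5, 0.3, 0.2])
```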
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Edge Directionality Improves Learning on Heterophilic Graphs [42.5099159786891]
We introduce Directed Graph Neural Network (Dir-GNN), a novel framework for deep learning on directed graphs.
Dir-GNN can be used to extend any Message Passing Neural Network (MPNN) to account for edge directionality information.
We prove that Dir-GNN matches the expressivity of the Directed Weisfeiler-Lehman test, exceeding that of conventional MPNNs.
arXiv Detail & Related papers (2023-05-17T18:06:43Z)
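The Dir-GNN entry describes aggregating separately along and against edge direction; a minimal sketch of such a layer follows, with the weight names and the mean-aggregation/ReLU choices assumed for illustration:

```python
import numpy as np

def dir_gnn_layer(A, X, W_self, W_in, W_out):
    """Directed layer: separate mean aggregation over out- and in-neighbours.

    A: (n, n) directed adjacency with A[i, j] = 1 for an edge i -> j.
    """
    d_out = A.sum(axis=1, keepdims=True) + 1e-8        # out-degrees, column vector
    d_in = A.sum(axis=0, keepdims=True).T + 1e-8       # in-degrees, column vector
    H_out = (A / d_out) @ X @ W_out                    # mean over out-neighbours
    H_in = (A.T / d_in) @ X @ W_in                     # mean over in-neighbours
    return np.maximum(X @ W_self + H_out + H_in, 0.0)  # ReLU combine

rng = np.random.default_rng(0)
n, d = 8, 5
A = (rng.random((n, n)) < 0.3).astype(float)
np.fill_diagonal(A, 0.0)
X = rng.standard_normal((n, d))
H = dir_gnn_layer(A, X, *(rng.standard_normal((d, d)) for _ in range(3)))
```

Keeping the two directions in separate parameterized channels, rather than symmetrizing A, is what lets such a layer distinguish node pairs that an undirected MPNN would conflate.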
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
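The one-scalar-per-relation idea above can be sketched as a learned mixture of per-relation adjacencies (plus a self-loop weight) feeding a shared GCN-style transform; the normalization and nonlinearity here are assumptions:

```python
import numpy as np

def re_gnn_layer(A_rels, X, alphas, alpha_self, W):
    """One scalar importance weight per relation, plus one for self-loops,
    mixed into a single adjacency before a shared GCN-style transform.

    A_rels: list of (n, n) adjacencies, one per edge type.
    """
    A_mix = alpha_self * np.eye(X.shape[0])
    for a_r, A_r in zip(alphas, A_rels):
        A_mix = A_mix + a_r * A_r             # relation importance is one scalar
    deg = A_mix.sum(axis=1, keepdims=True) + 1e-8
    return np.maximum((A_mix / deg) @ X @ W, 0.0)

rng = np.random.default_rng(0)
n, d, R = 7, 4, 3
A_rels = [(rng.random((n, n)) < 0.3).astype(float) for _ in range(R)]
X = rng.standard_normal((n, d))
H = re_gnn_layer(A_rels, X, alphas=np.ones(R), alpha_self=1.0,
                 W=rng.standard_normal((d, d)))
```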
- Invertible Neural Networks for Graph Prediction [22.140275054568985]
In this work, we address conditional generation using deep invertible neural networks.
We adopt an end-to-end training approach since our objective is to address prediction and generation in the forward and backward processes at once.
arXiv Detail & Related papers (2022-06-02T17:28:33Z)
- Graph Anisotropic Diffusion [15.22604744349679]
We present a new GNN architecture called Graph Anisotropic Diffusion.
Our model alternates between linear diffusion, for which a closed-form solution is available, and local anisotropic filters to obtain efficient multi-hop anisotropic kernels.
arXiv Detail & Related papers (2022-04-30T22:13:20Z)
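The alternation the Graph Anisotropic Diffusion entry describes, a closed-form linear diffusion step followed by a local anisotropic filter, might look like the sketch below. The heat kernel exp(-tL) is computed via the Laplacian eigendecomposition, and the anisotropic filter is passed in as a matrix (a plain mean aggregator stands in here; a DGN-style directional average as in the first sketch would be the anisotropic choice):

```python
import numpy as np

def gad_block(A, X, t, B_aniso):
    """One diffusion/filter alternation: closed-form heat diffusion exp(-t L)
    via the Laplacian eigendecomposition, then a local anisotropic filter.
    """
    L = np.diag(A.sum(axis=1)) - A
    w, V = np.linalg.eigh(L)
    X = (V * np.exp(-t * w)) @ V.T @ X        # global linear diffusion step
    return B_aniso @ X                        # local (anisotropic) filtering

rng = np.random.default_rng(0)
n, d = 6, 3
A = np.triu((rng.random((n, n)) < 0.5).astype(float), 1)
A = A + A.T                                   # undirected toy graph
X = rng.standard_normal((n, d))
mean_agg = A / (A.sum(axis=1, keepdims=True) + 1e-8)  # stand-in for B_aniso
H = gad_block(A, X, t=0.5, B_aniso=mean_agg)
```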
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.