Graph Anisotropic Diffusion
- URL: http://arxiv.org/abs/2205.00354v1
- Date: Sat, 30 Apr 2022 22:13:20 GMT
- Title: Graph Anisotropic Diffusion
- Authors: Ahmed A. A. Elhag, Gabriele Corso, Hannes Stärk, Michael M. Bronstein
- Abstract summary: We present a new GNN architecture called Graph Anisotropic Diffusion.
Our model alternates between linear diffusion, for which a closed-form solution is available, and local anisotropic filters to obtain efficient multi-hop anisotropic kernels.
- Score: 15.22604744349679
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditional Graph Neural Networks (GNNs) rely on message passing, which
amounts to permutation-invariant local aggregation of neighbour features. Such
a process is isotropic and there is no notion of 'direction' on the graph. We
present a new GNN architecture called Graph Anisotropic Diffusion. Our model
alternates between linear diffusion, for which a closed-form solution is
available, and local anisotropic filters to obtain efficient multi-hop
anisotropic kernels. We test our model on two common molecular property
prediction benchmarks (ZINC and QM9) and show its competitive performance.
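As a concrete (and deliberately simplified) illustration of the alternation the abstract describes, the sketch below interleaves a closed-form linear diffusion step with a direction-weighted local aggregation. This is a minimal sketch, assuming a dense graph Laplacian, a heat-kernel diffusion exp(-tL), and a per-edge direction field supplied externally (e.g. from Laplacian eigenvector differences); `GADBlock`, the layer sizes, and the aggregators are illustrative choices, not the authors' implementation.
```python
import torch
import torch.nn as nn


def heat_diffusion(x, laplacian, t=1.0):
    """Linear diffusion with a closed-form solution: X <- exp(-t * L) X."""
    return torch.matrix_exp(-t * laplacian) @ x  # dense heat kernel


class GADBlock(nn.Module):
    """One diffusion step followed by a local anisotropic filter (illustrative)."""

    def __init__(self, dim, t=1.0):
        super().__init__()
        self.t = t
        self.mix = nn.Linear(2 * dim, dim)  # fuses isotropic + directional messages

    def forward(self, x, adj, laplacian, direction):
        # 1) Multi-hop isotropic smoothing via the closed-form diffusion kernel.
        x = heat_diffusion(x, laplacian, self.t)
        # 2) Local anisotropic filtering: neighbour messages are reweighted by
        #    a per-edge direction field, breaking the isotropy of plain averaging.
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        mean_msg = (adj @ x) / deg           # isotropic neighbour average
        dir_msg = (adj * direction) @ x      # direction-weighted neighbour sum
        return torch.relu(self.mix(torch.cat([mean_msg, dir_msg], dim=-1)))
```
Stacking several such blocks composes the one-hop anisotropic filter with the multi-hop diffusion kernel, which is one way to read the abstract's "efficient multi-hop anisotropic kernels".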
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations (the generic advection-diffusion form is written out after this list).
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- GPatcher: A Simple and Adaptive MLP Model for Alleviating Graph Heterophily [15.93465948768545]
We demystify the impact of graph heterophily on graph neural network (GNN) filters.
We propose a simple yet powerful GNN named GPatcher by leveraging the patch-Mixer architectures.
Our model demonstrates outstanding performance on node classification compared with popular homophily GNNs and state-of-the-art heterophily GNNs.
arXiv Detail & Related papers (2023-06-25T20:57:35Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Graph Neural Convection-Diffusion with Heterophily [32.234690120340964]
Graph neural networks (GNNs) have shown promising results across various graph learning tasks, but they often assume homophily, which can result in poor performance on heterophilic graphs.
We propose a novel GNN that incorporates the principle of heterophily by modeling the flow of information on nodes.
arXiv Detail & Related papers (2023-05-26T09:47:03Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework (the standard forward/reverse SDE pair behind such frameworks is written out after this list).
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- GBK-GNN: Gated Bi-Kernel Graph Neural Networks for Modeling Both Homophily and Heterophily [24.742449127169586]
Graph Neural Networks (GNNs) are widely used on a variety of graph-based machine learning tasks.
For node-level tasks, GNNs have strong power to model the homophily property of graphs.
We propose a novel GNN model based on a bi-kernel feature transformation and a selection gate (a minimal sketch of this gating appears after this list).
arXiv Detail & Related papers (2021-10-29T13:44:09Z)
- Directional Graph Networks [17.11861614285746]
We propose the first globally consistent anisotropic kernels for graph neural networks (GNNs).
By defining a vector field in the graph, we develop a method of applying directional derivatives and smoothing by projecting node-specific messages into the field (a minimal sketch of this directional aggregation appears after this list).
We show that the method generalizes CNNs on an $n$-dimensional grid and is provably more discriminative than standard GNNs.
arXiv Detail & Related papers (2020-10-06T16:31:27Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
To preserve node proximities, we augment existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
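On the Advective Diffusion Transformer entry above: "advective diffusion" refers to the classical advection-diffusion equation. For orientation only, here is the generic continuous form (my notation, not the paper's graph-side discretization):
```latex
\frac{\partial x}{\partial t}
  = \underbrace{c\,\Delta x}_{\text{diffusion}}
  \;-\; \underbrace{\nabla\!\cdot(\mathbf{v}\,x)}_{\text{advection}}
```
with diffusivity c and velocity field v. On a graph, the continuous Laplacian Δ is conventionally replaced by the negated graph Laplacian -L, while the advection term models directed transport of features.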
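On the score-based generative modeling entry: continuous-time score-based models pair a forward noising SDE with a learned reverse-time SDE. The standard pair from the general score-SDE framework is quoted below for reference; the entry's "system" of SDEs extends this idea to node features and adjacency jointly, which is not spelled out here.
```latex
\mathrm{d}x = f(x,t)\,\mathrm{d}t + g(t)\,\mathrm{d}w
\qquad \text{(forward, noising)}

\mathrm{d}x = \bigl[f(x,t) - g(t)^{2}\,\nabla_{x}\log p_{t}(x)\bigr]\,\mathrm{d}t
  + g(t)\,\mathrm{d}\bar{w}
\qquad \text{(reverse, generating)}
```
The score ∇x log p_t(x) is approximated by a neural network trained with score matching; sampling integrates the reverse SDE from noise.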
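On the GBK-GNN entry: the "bi-kernel plus selection gate" idea can be pictured as two weight matrices, one intended for homophilic and one for heterophilic neighbours, mixed edge-by-edge by a learned gate. A minimal sketch, assuming a COO edge list and a gate fed by the two endpoint features; the class name and the gate's input are assumptions, not the paper's exact design:
```python
import torch
import torch.nn as nn


class BiKernelLayer(nn.Module):
    """Two kernels mixed by a learned per-edge gate (illustrative)."""

    def __init__(self, dim):
        super().__init__()
        self.w_homo = nn.Linear(dim, dim)    # kernel for homophilic edges
        self.w_hetero = nn.Linear(dim, dim)  # kernel for heterophilic edges
        self.gate = nn.Linear(2 * dim, 1)    # per-edge selection gate

    def forward(self, x, edge_index):
        src, dst = edge_index                       # [2, E] COO edge list
        pair = torch.cat([x[src], x[dst]], dim=-1)  # endpoint feature pairs
        g = torch.sigmoid(self.gate(pair))          # ~1 => treat edge as homophilic
        msg = g * self.w_homo(x[src]) + (1 - g) * self.w_hetero(x[src])
        return torch.relu(torch.zeros_like(x).index_add_(0, dst, msg))
```
The gate lets the model interpolate per edge between the two kernels instead of committing to a single homophily assumption for the whole graph.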
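On the Directional Graph Networks entry (the closest relative of Graph Anisotropic Diffusion in this list): the vector field can be taken as the per-edge differences of a Laplacian eigenvector, from which one builds a smoothing ("average") operator and a derivative operator. A minimal NumPy sketch, assuming the Fiedler vector as the direction and an L1 row normalization; the B_av/B_dx names echo the paper's operators, but the exact normalization here is an assumption:
```python
import numpy as np


def directional_matrices(adj):
    """Average- and derivative-style aggregators from a scalar field on nodes."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj                       # combinatorial Laplacian
    _, vecs = np.linalg.eigh(lap)                  # eigenvalues ascending
    phi = vecs[:, 1]                               # Fiedler vector as "direction"
    field = adj * (phi[None, :] - phi[:, None])    # per-edge gradient of phi
    norm = np.abs(field).sum(axis=1, keepdims=True) + 1e-8
    return np.abs(field) / norm, field / norm      # B_av, B_dx (illustrative)


adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)        # 4-node path graph
b_av, b_dx = directional_matrices(adj)
x = np.random.randn(4, 8)                          # toy node features
# Directional smoothing, plus a directional-derivative message
# sum_j F_ij * (x_j - x_i):
h = np.concatenate([b_av @ x,
                    b_dx @ x - x * b_dx.sum(1, keepdims=True)], axis=1)
```
Because the field flips sign across each edge, the derivative operator distinguishes "upstream" from "downstream" neighbours, which is exactly the notion of direction that isotropic message passing lacks.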
This list is automatically generated from the titles and abstracts of the papers on this site.