An Exploration of Conditioning Methods in Graph Neural Networks
- URL: http://arxiv.org/abs/2305.01933v1
- Date: Wed, 3 May 2023 07:14:12 GMT
- Title: An Exploration of Conditioning Methods in Graph Neural Networks
- Authors: Yeskendir Koishekenov, Erik J. Bekkers
- Abstract summary: In computational tasks such as those in physics and chemistry, the use of edge attributes such as relative position or distance has proved essential.
We consider three types of conditioning: weak, strong, and pure, which respectively relate to concatenation-based conditioning, gating, and transformations that are causally dependent on the attributes.
This categorization provides a unifying viewpoint on different classes of GNNs, from separable convolutions to various forms of message passing networks.
- Score: 8.532288965425805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The flexibility and effectiveness of message passing based graph neural networks (GNNs) have driven considerable advances in deep learning on graph-structured data. In such approaches, GNNs recursively update node representations based on their neighbors, gaining expressivity through the use of node and edge attribute vectors. In computational tasks such as those in physics and chemistry, for example, the use of edge attributes such as relative position or distance has proved essential. In this work, we address not what kind of attributes to use, but how to condition on this information to improve model performance. We consider three types of conditioning: weak, strong, and pure, which respectively relate to concatenation-based conditioning, gating, and transformations that are causally dependent on the attributes. This categorization provides a unifying viewpoint on different classes of GNNs, from separable convolutions to various forms of message passing networks. We provide an empirical study of the effect of conditioning methods on several tasks in computational chemistry.
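As a concrete illustration of the three conditioning types, the sketch below shows how an edge attribute e_ij can enter a per-edge message function by concatenation (weak), by gating (strong), or by generating the transformation itself (pure). The module names and dimensions are illustrative assumptions, not the paper's implementation.

# Minimal sketch of weak, strong, and pure conditioning of a message
# function on an edge attribute e_ij. Names are illustrative only.
import torch
import torch.nn as nn

class WeakConditioning(nn.Module):
    # Weak: concatenate the edge attribute to the input of the message MLP.
    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim))

    def forward(self, h_i, h_j, e_ij):
        return self.mlp(torch.cat([h_i, h_j, e_ij], dim=-1))

class StrongConditioning(nn.Module):
    # Strong: an attribute-dependent gate modulates the message multiplicatively.
    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * node_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim))
        self.gate = nn.Sequential(nn.Linear(edge_dim, hidden_dim), nn.Sigmoid())

    def forward(self, h_i, h_j, e_ij):
        return self.gate(e_ij) * self.mlp(torch.cat([h_i, h_j], dim=-1))

class PureConditioning(nn.Module):
    # Pure: the attribute generates the transformation itself, here as the
    # weight matrix of a linear map applied to the neighbor feature.
    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.weight_net = nn.Linear(edge_dim, hidden_dim * node_dim)

    def forward(self, h_i, h_j, e_ij):
        W = self.weight_net(e_ij).view(*e_ij.shape[:-1], self.hidden_dim, h_j.shape[-1])
        return torch.einsum('...oi,...i->...o', W, h_j)

For an edge (i, j) with features h_i, h_j and attribute e_ij, all three modules produce a message vector; they differ only in how e_ij enters the computation.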
Related papers
- ClassContrast: Bridging the Spatial and Contextual Gaps for Node Representations [7.083346385003788]
Graph Neural Networks (GNNs) have revolutionized the domain of graph representation learning by utilizing neighborhood aggregation schemes.
Message-passing GNNs (MPGNNs) face significant issues, such as oversquashing, oversmoothing, and underreaching, which hamper their effectiveness.
We propose a novel approach, ClassContrast, grounded in Energy Landscape Theory from Chemical Physics, to overcome these limitations.
arXiv Detail & Related papers (2024-10-03T02:44:13Z) - You do not have to train Graph Neural Networks at all on text-attributed graphs [25.044734252779975]
We introduce TrainlessGNN, a linear GNN model capitalizing on the observation that text encodings from the same class often cluster together in a linear subspace.
Our experiments reveal that such trainless models can match or even surpass their conventionally trained counterparts.
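A hedged sketch of the trainless idea under the stated observation: aggregate text embeddings over neighborhoods once, then set each row of a linear classifier to a class-mean embedding instead of training by gradient descent. The function and its inputs are illustrative, not the paper's exact construction.

# Illustrative "trainless" linear GNN: nearest-class-mean classification
# on mean-aggregated text embeddings. No gradient training involved.
import numpy as np

def trainless_predict(X, A, y_train, train_idx, n_classes):
    """X: (n, d) node text embeddings; A: (n, n) dense 0/1 adjacency;
    y_train: labels aligned with train_idx. Returns labels for all nodes."""
    deg = A.sum(1, keepdims=True).clip(min=1)
    H = (A @ X) / deg                      # one round of mean aggregation
    # Each classifier row is the mean aggregated embedding of its class.
    W = np.stack([H[train_idx][y_train == c].mean(0) for c in range(n_classes)])
    return (H @ W.T).argmax(1)             # nearest-class-mean scores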
arXiv Detail & Related papers (2024-04-17T02:52:11Z) - AGHINT: Attribute-Guided Representation Learning on Heterogeneous Information Networks with Transformer [4.01252998015631]
We investigate the impact of inter-node attribute disparities on the performance of heterogeneous graph neural networks (HGNNs) within a benchmark task.
We propose a novel Attribute-Guided Heterogeneous Information Network representation learning model with Transformer (AGHINT).
AGHINT transcends the constraints of the original graph structure by directly integrating higher-order similar neighbor features into the learning process.
arXiv Detail & Related papers (2024-04-16T10:30:48Z) - Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
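One natural encoding consistent with this description, sketched below as an assumption rather than the paper's exact construction: neurons become graph nodes carrying their bias, and each weight becomes a directed edge feature from one layer to the next.

# Illustrative conversion of an MLP into a computational graph of parameters.
import torch
import torch.nn as nn

def mlp_to_graph(layers):
    """layers: list of nn.Linear. Nodes are neurons (feature: bias, 0 for
    inputs); edges are weights, directed from layer k to layer k+1."""
    sizes = [layers[0].in_features] + [l.out_features for l in layers]
    offs = [sum(sizes[:k]) for k in range(len(sizes))]   # first node id per layer
    node_feat = [torch.zeros(sizes[0])]
    src, dst, w = [], [], []
    for k, lin in enumerate(layers):
        node_feat.append(lin.bias.detach())
        for i in range(lin.out_features):                # neuron i in layer k+1
            for j in range(lin.in_features):             # neuron j in layer k
                src.append(offs[k] + j)
                dst.append(offs[k + 1] + i)
                w.append(lin.weight[i, j].item())
    return torch.cat(node_feat), torch.tensor([src, dst]), torch.tensor(w)

# e.g. node_feat, edge_index, edge_feat = mlp_to_graph([nn.Linear(4, 8), nn.Linear(8, 2)])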
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
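A minimal sketch of the stratification idea: nodes are bucketed by degree and each bucket applies its own weight matrix in the update step. The bucket boundaries here are illustrative assumptions.

# Degree-stratified linear update: one weight matrix per degree bucket.
import torch
import torch.nn as nn

class DegreeStratifiedLinear(nn.Module):
    def __init__(self, in_dim, out_dim, boundaries=(2, 5, 10)):
        super().__init__()
        self.register_buffer('boundaries', torch.tensor(boundaries))
        self.linears = nn.ModuleList(
            nn.Linear(in_dim, out_dim) for _ in range(len(boundaries) + 1))

    def forward(self, h, degree):
        # degree: (n,) integer tensor of node degrees; bucketize -> group index.
        group = torch.bucketize(degree, self.boundaries)
        out = h.new_empty(h.shape[0], self.linears[0].out_features)
        for g, lin in enumerate(self.linears):
            mask = group == g
            if mask.any():
                out[mask] = lin(h[mask])
        return out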
arXiv Detail & Related papers (2023-12-16T14:09:23Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
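A hedged sketch of the one-parameter-per-relation idea: each edge type, plus the self-loop, contributes through a single learnable scalar that rescales messages before a shared homogeneous update. Names and aggregation details are illustrative.

# One learnable importance scalar per relation, plus one for the self-loop.
import torch
import torch.nn as nn

class RelationEmbeddingLayer(nn.Module):
    def __init__(self, dim, num_relations):
        super().__init__()
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # per relation
        self.self_weight = nn.Parameter(torch.ones(1))              # self-loop
        self.lin = nn.Linear(dim, dim)

    def forward(self, h, edge_index, edge_type):
        src, dst = edge_index                       # (2, E) COO edges
        msg = self.rel_weight[edge_type].unsqueeze(-1) * h[src]
        agg = torch.zeros_like(h).index_add_(0, dst, msg)
        return torch.relu(self.lin(agg + self.self_weight * h))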
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
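The two constructions, sketched for point-cloud-style scientific data:

# KNN and fully-connected graph construction over node coordinates.
import numpy as np

def knn_graph(pos, k):
    """pos: (n, 3) coordinates. Returns (2, n*k) COO edges to the k nearest nodes."""
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)                 # exclude self-edges
    nbrs = np.argsort(dist, axis=1)[:, :k]         # k closest per node
    src = np.repeat(np.arange(len(pos)), k)
    return np.stack([src, nbrs.ravel()])

def fc_graph(n):
    """All ordered pairs (i, j) with i != j."""
    src, dst = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    mask = src != dst
    return np.stack([src[mask], dst[mask]])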
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates increased performance in modeling graph metrics that capture complex information.
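A hedged sketch of an ordering-based aggregator in this spirit: neighbors are scored by attention, sorted, and the ordered sequence is summarized by an LSTM so that the aggregator is sensitive to neighbor interactions. This is an illustrative reading of the summary, not necessarily the exact GOAT layer.

# Order neighbors by attention score, then summarize the sequence with an LSTM.
import torch
import torch.nn as nn

class OrderingAttentionAggregator(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)       # attention score per neighbor
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, h_center, h_neighbors):
        """h_center: (d,); h_neighbors: (k, d) features of one node's neighbors."""
        pair = torch.cat([h_center.expand_as(h_neighbors), h_neighbors], -1)
        order = self.score(pair).squeeze(-1).argsort(descending=True)
        _, (h_n, _) = self.lstm(h_neighbors[order].unsqueeze(0))
        return h_n.squeeze()                      # ordered-sequence summary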
arXiv Detail & Related papers (2022-04-11T18:13:19Z) - Simplifying approach to Node Classification in Graph Neural Networks [7.057970273958933]
We decouple the node feature aggregation step from the depth of the graph neural network, and empirically analyze how different aggregated features contribute to prediction performance.
We show that not all features generated via aggregation steps are useful, and often using these less informative features can be detrimental to the performance of the GNN model.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN), and show empirically that the proposed model achieves comparable or even higher accuracy than state-of-the-art GNN models.
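A hedged sketch of the decoupled design: hop-wise aggregated features are computed up front, and a shallow model learns soft selection weights over them so that uninformative hops can be down-weighted. Dimensions and the selection mechanism are illustrative, not the exact FSGNN.

# Shallow model with learned soft selection over precomputed hop features.
import torch
import torch.nn as nn

class FeatureSelectionGNN(nn.Module):
    def __init__(self, in_dim, hidden_dim, n_classes, n_hops):
        super().__init__()
        self.proj = nn.ModuleList(nn.Linear(in_dim, hidden_dim) for _ in range(n_hops + 1))
        self.sel = nn.Parameter(torch.zeros(n_hops + 1))   # one weight per hop
        self.out = nn.Linear((n_hops + 1) * hidden_dim, n_classes)

    def forward(self, X, A_norm):
        feats, h = [], X
        for k in range(len(self.proj)):
            feats.append(self.proj[k](h))
            h = A_norm @ h                                  # next-hop aggregation
        w = torch.softmax(self.sel, 0)                      # soft feature selection
        return self.out(torch.cat([w[k] * f for k, f in enumerate(feats)], dim=-1))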
arXiv Detail & Related papers (2021-11-12T14:53:22Z) - E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks that are equivariant to rotations, translations, reflections, and permutations.
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while still achieving competitive or better performance.
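A sketch of a single EGNN-style layer following the update rule described in the paper: messages depend on invariant squared distances, and coordinates are updated along relative position vectors, which keeps the layer E(n)-equivariant by construction. Edge attributes and the paper's normalization constant are omitted for brevity.

# One EGNN-style layer: invariant messages, equivariant coordinate updates.
import torch
import torch.nn as nn

class EGNNLayer(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.phi_e = nn.Sequential(nn.Linear(2 * dim + 1, hidden), nn.SiLU(),
                                   nn.Linear(hidden, hidden), nn.SiLU())
        self.phi_x = nn.Sequential(nn.Linear(hidden, hidden), nn.SiLU(),
                                   nn.Linear(hidden, 1))
        self.phi_h = nn.Sequential(nn.Linear(dim + hidden, hidden), nn.SiLU(),
                                   nn.Linear(hidden, dim))

    def forward(self, h, x, edge_index):
        src, dst = edge_index
        rel = x[dst] - x[src]                                # relative positions
        d2 = (rel ** 2).sum(-1, keepdim=True)                # invariant distance^2
        m = self.phi_e(torch.cat([h[dst], h[src], d2], -1))  # edge messages
        # Equivariant coordinate update along relative direction vectors.
        x = x.index_add(0, dst, rel * self.phi_x(m))
        # Invariant feature update from aggregated messages, with residual.
        agg = m.new_zeros(h.shape[0], m.shape[-1]).index_add_(0, dst, m)
        return self.phi_h(torch.cat([h, agg], -1)) + h, x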
arXiv Detail & Related papers (2021-02-19T10:25:33Z) - My Body is a Cage: the Role of Morphology in Graph-Based Incompatible Control [65.77164390203396]
We present a series of ablations on existing methods that show that morphological information encoded in the graph does not improve their performance.
Motivated by the hypothesis that any benefits GNNs extract from the graph structure are outweighed by difficulties they create for message passing, we also propose Amorpheus.
arXiv Detail & Related papers (2020-10-05T08:37:11Z)