Graph-level representations using ensemble-based readout functions
- URL: http://arxiv.org/abs/2303.02023v2
- Date: Thu, 20 Apr 2023 12:40:04 GMT
- Title: Graph-level representations using ensemble-based readout functions
- Authors: Jakub Binkowski, Albert Sawczyn, Denis Janiak, Piotr Bielak, Tomasz
Kajdanowicz
- Abstract summary: Graph machine learning models have been successfully deployed in a variety of application areas.
One of the most prominent types of models - Graph Neural Networks (GNNs) - provides an elegant way of extracting expressive node-level representation vectors.
We introduce a concept of ensemble-based readout functions that combine either representations or predictions.
- Score: 3.630365560970225
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph machine learning models have been successfully deployed in a variety of
application areas. One of the most prominent types of models - Graph Neural
Networks (GNNs) - provides an elegant way of extracting expressive node-level
representation vectors, which can be used to solve node-related problems, such
as classifying users in a social network. However, many tasks require
representations at the level of the whole graph, e.g., molecular applications.
In order to convert node-level representations into a graph-level vector, a
so-called readout function must be applied. In this work, we study existing
readout methods, including simple non-trainable ones, as well as complex,
parametrized models. We introduce the concept of ensemble-based readout
functions that combine either representations or predictions. Our experiments
show that such ensembles achieve better performance than simple single
readouts, or performance comparable to the complex, parametrized ones, at a
fraction of the model complexity.
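To make the idea concrete, the following is a minimal PyTorch sketch of non-trainable readouts (sum, mean, max) combined by an ensemble, either at the representation level (concatenating the graph vectors before a single classifier) or at the prediction level (averaging per-readout predictions). The `EnsembleReadout` class, its combination modes, and all sizes are illustrative assumptions drawn from the abstract, not the authors' implementation.

```python
# Sketch of ensemble-based readouts, assuming node embeddings are already
# produced by some GNN. Names and combination rules are assumptions.
import torch
import torch.nn as nn


def sum_readout(h):    # h: [num_nodes, d] node embeddings of one graph
    return h.sum(dim=0)


def mean_readout(h):
    return h.mean(dim=0)


def max_readout(h):
    return h.max(dim=0).values


class EnsembleReadout(nn.Module):
    """Combine several readouts at the representation or prediction level."""

    def __init__(self, dim, num_classes,
                 readouts=(sum_readout, mean_readout, max_readout),
                 mode="representation"):
        super().__init__()
        self.readouts = readouts
        self.mode = mode
        if mode == "representation":
            # One classifier on the concatenated graph-level vectors.
            self.head = nn.Linear(dim * len(readouts), num_classes)
        else:
            # One classifier per readout; their predictions are averaged.
            self.heads = nn.ModuleList(
                [nn.Linear(dim, num_classes) for _ in readouts]
            )

    def forward(self, h):  # h: [num_nodes, d] embeddings of a single graph
        graph_vecs = [readout(h) for readout in self.readouts]
        if self.mode == "representation":
            return self.head(torch.cat(graph_vecs, dim=-1))
        logits = torch.stack(
            [head(v) for head, v in zip(self.heads, graph_vecs)]
        )
        return logits.mean(dim=0)


# Usage: node embeddings for one graph, e.g. a molecule with 17 atoms.
h = torch.randn(17, 64)
model = EnsembleReadout(dim=64, num_classes=2, mode="prediction")
print(model(h).shape)  # torch.Size([2])
```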
Related papers
- Discrete Graph Auto-Encoder [52.50288418639075]
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE)
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- SGAT: Simplicial Graph Attention Network [38.7842803074593]
Heterogeneous graphs have multiple node and edge types and are semantically richer than homogeneous graphs.
Many graph neural network approaches for heterogeneous graphs use metapaths to capture multi-hop interactions between nodes.
We present Simplicial Graph Attention Network (SGAT), a simplicial complex approach to represent such high-order interactions.
arXiv Detail & Related papers (2022-07-24T15:20:41Z)
- A Robust Stacking Framework for Training Deep Graph Models with Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
However, the models that perform best on such features in standard supervised learning settings with IID (non-graph) data are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z)
- Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture a node's structural role information, and show excellent performance at node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Permutation-Invariant Variational Autoencoder for Graph-Level Representation Learning [0.0]
We propose a permutation-invariant variational autoencoder for graph structured data.
Our model indirectly learns to match the node ordering of the input and output graphs, without imposing a particular node ordering.
We demonstrate the effectiveness of our proposed model on various graph reconstruction and generation tasks.
arXiv Detail & Related papers (2021-04-20T09:44:41Z)
- Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Persona2vec: A Flexible Multi-role Representations Learning Framework for Graphs [4.133483293243257]
persona2vec is a graph embedding framework that efficiently learns multiple representations of nodes based on their structural contexts.
We show that our framework is significantly faster than the existing state-of-the-art model while achieving better performance.
arXiv Detail & Related papers (2020-06-04T20:03:00Z)