Addressing Heterophily in Node Classification with Graph Echo State
Networks
- URL: http://arxiv.org/abs/2305.08233v2
- Date: Mon, 3 Jul 2023 19:50:24 GMT
- Title: Addressing Heterophily in Node Classification with Graph Echo State
Networks
- Authors: Alessio Micheli, Domenico Tortorella
- Abstract summary: We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models are able to achieve better or comparable accuracy with respect to most fully trained deep models.
- Score: 11.52174067809364
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Node classification tasks on graphs are addressed via fully-trained deep
message-passing models that learn a hierarchy of node representations via
multiple aggregations of a node's neighbourhood. While effective on graphs that
exhibit a high ratio of intra-class edges, this approach poses challenges in
the opposite case, i.e. heterophily, where nodes belonging to the same class
are usually further apart. In graphs with a high degree of heterophily, the
smoothed representations based on close neighbours computed by convolutional
models are no longer effective. So far, proposed remedies include architectural
variations in message-passing models that reduce excessive smoothing, and
rewiring of the input graph to improve longer-range message passing. In this
paper, we address the challenges of heterophilic graphs with Graph Echo State
Network (GESN) for node classification. GESN is a reservoir computing model for
graphs, where node embeddings are recursively computed by an untrained
message-passing function. Our experiments show that reservoir models are able
to achieve better or comparable accuracy with respect to most fully trained
deep models that implement ad hoc variations in the architectural bias or
perform rewiring as a preprocessing step on the input graph, with an
improvement in terms of efficiency/accuracy trade-off. Furthermore, our
analysis shows that GESN effectively encodes the structural relationships of a
graph node, as evidenced by a correlation between the iterations of the
recursive embedding function and the distribution of shortest paths in the
graph.
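The recursive, untrained message-passing described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the weight initialization, the rescaling used to enforce stability, and all hyperparameter values here are assumptions; in a full GESN the resulting embeddings would be fed to a trained linear readout.

```python
import numpy as np

def gesn_embeddings(adj, features, hidden=16, iters=10, rho=0.9, seed=0):
    """Sketch of a Graph Echo State Network encoder: node embeddings are
    computed by iterating an *untrained* message-passing function whose
    random weights are only rescaled for stability (never trained)."""
    rng = np.random.default_rng(seed)
    n, d = features.shape
    W_in = rng.uniform(-1, 1, size=(hidden, d))        # input weights, untrained
    W_hat = rng.uniform(-1, 1, size=(hidden, hidden))  # recurrent weights, untrained
    # Heuristic rescaling (an assumption, not the paper's exact condition):
    # shrink the recurrent matrix by its spectral radius and the max degree
    # so the graph recurrence tends to be contractive.
    radius = np.max(np.abs(np.linalg.eigvals(W_hat)))
    W_hat *= rho / (radius * max(adj.sum(axis=1).max(), 1.0))
    x = np.zeros((n, hidden))
    for _ in range(iters):
        # Each iteration aggregates one more hop of neighbourhood structure.
        x = np.tanh(features @ W_in.T + adj @ x @ W_hat.T)
    return x  # embeddings for a (separately trained) linear readout
```

Each additional iteration lets information travel one more hop, which is consistent with the reported correlation between iterations and shortest-path distributions.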
Related papers
- Graph Transformer GANs with Graph Masked Modeling for Architectural
Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
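The Gumbel-Softmax relaxation at the core of NodeFormer's operator can be illustrated as follows. This is a minimal sketch of the standard (non-kernelized) relaxation only; NodeFormer's actual contribution is a kernelized variant that avoids the quadratic all-pair cost, which is not reproduced here.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, seed=0):
    """Draw a differentiable (soft) sample from a categorical distribution:
    perturb the logits with Gumbel noise, then apply a temperature-scaled
    softmax. Lower tau pushes the output closer to a one-hot sample."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=logits.shape)
    g = -np.log(-np.log(u + 1e-20) + 1e-20)  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))  # numerically stable softmax
    return y / y.sum(axis=-1, keepdims=True)
```

In a structure-learning setting, such relaxed samples stand in for discrete edge choices so that the latent graph can be optimized by gradient descent.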
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Is Rewiring Actually Helpful in Graph Neural Networks? [11.52174067809364]
We propose an evaluation setting based on message-passing models that do not require training to compute node and graph representations.
We perform a systematic experimental comparison on real-world node and graph classification tasks, showing that rewiring the underlying graph rarely confers a practical benefit for message passing.
arXiv Detail & Related papers (2023-05-31T10:12:23Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs the edges may be directed, and whether to treat them as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Accurate Learning of Graph Representations with Graph Multiset Pooling [45.72542969364438]
We propose a Graph Multiset Transformer (GMT) that captures the interaction between nodes according to their structural dependencies.
Our experimental results show that GMT significantly outperforms state-of-the-art graph pooling methods on graph classification benchmarks.
arXiv Detail & Related papers (2021-02-23T07:45:58Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
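The line-graph construction described above can be sketched with plain Python sets; this is an illustrative helper for undirected graphs, not the paper's implementation.

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph of an undirected graph: each original edge
    becomes a node, and two such nodes are adjacent iff the original
    edges share an endpoint."""
    edges = [tuple(sorted(e)) for e in edges]  # canonical undirected form
    lg = {e: set() for e in edges}
    for e, f in combinations(edges, 2):
        if set(e) & set(f):  # the two edges share an endpoint
            lg[e].add(f)
            lg[f].add(e)
    return lg
```

Under this mapping, predicting whether an edge exists in the original graph becomes classifying the corresponding node in the line graph.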
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.