TREE-G: Decision Trees Contesting Graph Neural Networks
- URL: http://arxiv.org/abs/2207.02760v5
- Date: Sun, 25 Feb 2024 22:30:43 GMT
- Title: TREE-G: Decision Trees Contesting Graph Neural Networks
- Authors: Maya Bechler-Speicher, Amir Globerson, Ran Gilad-Bachrach
- Abstract summary: TREE-G modifies standard decision trees by introducing a novel split function that is specialized for graph data.
We show that TREE-G consistently outperforms other tree-based models and often outperforms other graph-learning algorithms such as Graph Neural Networks (GNNs) and Graph Kernels.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: When dealing with tabular data, models based on decision trees are a popular
choice due to their high accuracy on these data types, their ease of
application, and explainability properties. However, when it comes to
graph-structured data, it is not clear how to apply them effectively, in a way
that incorporates the topological information with the tabular data available
on the vertices of the graph. To address this challenge, we introduce TREE-G.
TREE-G modifies standard decision trees by introducing a novel split function
that is specialized for graph data. Not only does this split function
incorporate the node features and the topological information, but it also uses
a novel pointer mechanism that allows split nodes to use information computed
in previous splits. Therefore, the split function adapts to the predictive task
and the graph at hand. We analyze the theoretical properties of TREE-G and
demonstrate its benefits empirically on multiple graph and vertex prediction
benchmarks. In these experiments, TREE-G consistently outperforms other
tree-based models and often outperforms other graph-learning algorithms such as
Graph Neural Networks (GNNs) and Graph Kernels, sometimes by large margins.
Moreover, TREE-G models and their predictions can be explained and visualized.
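The abstract's idea of a split function that combines vertex features with topology can be sketched as follows. This is a hypothetical, simplified illustration, not the actual TREE-G algorithm: the function names, the 1-hop sum aggregation, and the toy graph are assumptions made for this sketch, and the pointer mechanism that lets splits reuse earlier computations is omitted entirely.

```python
# Hypothetical sketch in the spirit of TREE-G's graph-aware split:
# a split node thresholds a vertex feature AFTER propagating it over
# the graph, so the topology participates in the split decision.
# All names and the aggregation choice here are illustrative assumptions.

def propagate(adj, feature):
    """Sum each vertex's feature over its 1-hop neighborhood (self included)."""
    n = len(adj)
    return [feature[v] + sum(feature[u] for u in adj[v]) for v in range(n)]

def graph_split(adj, feature, threshold, hops=1):
    """Route each vertex left/right by thresholding the propagated feature."""
    scores = feature
    for _ in range(hops):
        scores = propagate(adj, scores)
    left = [v for v, s in enumerate(scores) if s <= threshold]
    right = [v for v, s in enumerate(scores) if s > threshold]
    return left, right

# Toy graph: a path 0-1-2-3 with a scalar feature per vertex.
adj = [[1], [0, 2], [1, 3], [2]]
feature = [1.0, 0.0, 0.0, 0.0]
left, right = graph_split(adj, feature, threshold=0.5)
# Vertices 0 and 1 see the feature mass within one hop and go right:
# left == [2, 3], right == [0, 1]
```

A plain feature split would send vertices 1, 2, and 3 to the same side; propagating first separates vertex 1 from 2 and 3 because of its position in the graph, which is the kind of topology-aware behavior the split function described above provides.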
Related papers
- The GECo algorithm for Graph Neural Networks Explanation [0.0]
We introduce a new methodology involving graph communities to address the interpretability of graph classification problems.
The proposed method, called GECo, exploits the idea that a community, i.e., a densely connected subset of graph nodes, should play a role in graph classification.
GECo outperforms other methods on artificial graph datasets and on most real-world datasets.
arXiv Detail & Related papers (2024-11-18T09:08:30Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- GTNet: A Tree-Based Deep Graph Learning Architecture [8.50892442127182]
We propose a deep graph learning architecture with a new general message passing scheme that originates from the tree representation of graphs.
Two graph representation learning models are proposed within this GTNet architecture: Graph Tree Attention Network (GTAN) and Graph Tree Convolution Network (GTCN).
arXiv Detail & Related papers (2022-04-27T09:43:14Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT)
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2021-10-29T19:55:12Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which shows that our model can effectively improve the performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Neural Trees for Learning on Graphs [19.05038106825347]
Graph Neural Networks (GNNs) have emerged as a flexible and powerful approach for learning over graphs.
We propose a new GNN architecture -- the Neural Tree.
We show that the neural tree architecture can approximate any smooth probability distribution function over an undirected graph.
arXiv Detail & Related papers (2021-05-15T17:08:20Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.