Graph Neural Networks for Tabular Data Learning: A Survey with Taxonomy and Directions
- URL: http://arxiv.org/abs/2401.02143v1
- Date: Thu, 4 Jan 2024 08:49:10 GMT
- Title: Graph Neural Networks for Tabular Data Learning: A Survey with Taxonomy and Directions
- Authors: Cheng-Te Li, Yu-Che Tsai, Chih-Yao Chen, Jay Chiehen Liao
- Abstract summary: We dive into Tabular Data Learning (TDL) using Graph Neural Networks (GNNs).
GNNs have garnered significant interest and application across various Tabular Data Learning domains.
This survey serves as a resource for researchers and practitioners, offering a thorough understanding of GNNs' role in revolutionizing TDL.
- Score: 10.753191494611892
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this survey, we dive into Tabular Data Learning (TDL) using Graph Neural
Networks (GNNs), a domain where deep learning-based approaches have
increasingly shown superior performance in both classification and regression
tasks compared to traditional methods. The survey highlights a critical gap in
deep neural TDL methods: the underrepresentation of latent correlations among
data instances and feature values. GNNs, with their innate capability to model
intricate relationships and interactions between diverse elements of tabular
data, have garnered significant interest and application across various TDL
domains. Our survey provides a systematic review of the methods involved in
designing and implementing GNNs for TDL (GNN4TDL). It encompasses a detailed
investigation into the foundational aspects and an overview of GNN-based TDL
methods, offering insights into their evolving landscape. We present a
comprehensive taxonomy focused on constructing graph structures and
representation learning within GNN-based TDL methods. In addition, the survey
examines various training plans, emphasizing the integration of auxiliary tasks
to enhance the effectiveness of instance representations. A critical part of
our discussion is dedicated to the practical application of GNNs across a
spectrum of GNN4TDL scenarios, demonstrating their versatility and impact.
Lastly, we discuss the limitations and propose future research directions,
aiming to spur advancements in GNN4TDL. This survey serves as a resource for
researchers and practitioners, offering a thorough understanding of GNNs' role
in revolutionizing TDL and pointing towards future innovations in this
promising area.
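To make the taxonomy's two pillars concrete, below is a minimal, hedged sketch of a generic GNN4TDL pipeline in PyTorch: it builds a k-NN similarity graph over tabular instances (graph construction) and classifies them with a two-layer GCN (representation learning). This is an illustration under common assumptions, not code from the survey; the `knn_adjacency` and `TabularGCN` names are hypothetical.

```python
# Illustrative GNN4TDL sketch (not from the survey): k-NN instance graph
# over tabular rows, then a 2-layer GCN for node (instance) classification.
import torch
import torch.nn.functional as F

def knn_adjacency(x: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Symmetrically normalized k-NN adjacency over instances (rows of x)."""
    dist = torch.cdist(x, x)                      # pairwise Euclidean distances
    dist.fill_diagonal_(float("inf"))             # exclude self from neighbors
    idx = dist.topk(k, largest=False).indices     # k nearest neighbors per row
    n = x.size(0)
    adj = torch.zeros(n, n)
    adj.scatter_(1, idx, 1.0)                     # directed k-NN edges
    adj = ((adj + adj.t()) > 0).float()           # symmetrize
    adj += torch.eye(n)                           # GCN-style self-loops
    d_inv_sqrt = adj.sum(1).pow(-0.5)
    # D^{-1/2} A D^{-1/2} normalization
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)

class TabularGCN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden: int, n_classes: int):
        super().__init__()
        self.w1 = torch.nn.Linear(in_dim, hidden)
        self.w2 = torch.nn.Linear(hidden, n_classes)

    def forward(self, x, adj_norm):
        h = F.relu(adj_norm @ self.w1(x))         # propagate + transform
        return adj_norm @ self.w2(h)              # class logits per instance

# Usage: x is an (n_instances, n_features) tensor of standardized tabular data.
x = torch.randn(128, 16)
adj = knn_adjacency(x, k=5)
logits = TabularGCN(16, 32, 3)(x, adj)            # (128, 3) logits
```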
Related papers
- Deep Graph Anomaly Detection: A Survey and New Perspectives [86.84201183954016]
Graph anomaly detection (GAD) aims to identify unusual graph instances (nodes, edges, subgraphs, or graphs).
Deep learning approaches, graph neural networks (GNNs) in particular, have been emerging as a promising paradigm for GAD.
arXiv Detail & Related papers (2024-09-16T03:05:11Z)
- Supervised Gradual Machine Learning for Aspect Category Detection [0.9857683394266679]
Aspect Category Detection (ACD) aims to identify implicit and explicit aspects in a given review sentence.
We propose a novel approach to tackle the ACD task by combining Deep Neural Networks (DNNs) with Gradual Machine Learning (GML) in a supervised setting.
arXiv Detail & Related papers (2024-04-08T07:21:46Z)
- Exploring Causal Learning through Graph Neural Networks: An In-depth Review [12.936700685252145]
We introduce a novel taxonomy that encompasses various state-of-the-art GNN methods employed in studying causality.
GNNs are further categorized based on their applications in the causality domain.
This review also touches upon the application of causal learning across diverse sectors.
arXiv Detail & Related papers (2023-11-25T10:46:06Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- A Survey on Explainability of Graph Neural Networks [4.612101932762187]
Graph neural networks (GNNs) are powerful graph-based deep-learning models.
This survey aims to provide a comprehensive overview of the existing explainability techniques for GNNs.
arXiv Detail & Related papers (2023-06-02T23:36:49Z)
- Experimental Observations of the Topology of Convolutional Neural Network Activations [2.4235626091331737]
Topological data analysis (TDA) provides compact, noise-robust representations of complex structures.
Deep neural networks (DNNs) learn millions of parameters associated with a series of transformations defined by the model architecture.
In this paper, we apply cutting-edge techniques from TDA with the goal of gaining insight into the interpretability of convolutional neural networks used for image classification.
arXiv Detail & Related papers (2022-12-01T02:05:44Z)
- Towards Open-World Feature Extrapolation: An Inductive Graph Learning Approach [80.8446673089281]
We propose a new learning paradigm with graph representation and learning.
Our framework contains two modules: 1) a backbone network (e.g., feedforward neural nets) as a lower model that takes features as input and outputs predicted labels; 2) a graph neural network as an upper model that learns to extrapolate embeddings for new features via message passing over a feature-data graph built from observed data (a hedged sketch follows this entry).
arXiv Detail & Related papers (2021-10-09T09:02:45Z)
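As a hedged illustration of the feature-data graph idea in the entry above (my reading of the abstract, not the authors' code), the PyTorch sketch below extrapolates an embedding for a feature by mean-pooling messages from the instances in which that feature is observed; `extrapolate_feature_embeddings` is a hypothetical name.

```python
# Hedged sketch of a feature-data graph: instances and features form a
# bipartite graph, and a new feature's embedding is extrapolated by
# averaging messages from the instances where it is observed.
import torch

def extrapolate_feature_embeddings(x_obs: torch.Tensor,
                                   inst_emb: torch.Tensor) -> torch.Tensor:
    """x_obs: (n_inst, n_feat) observed value matrix (0 = missing);
    inst_emb: (n_inst, d) instance embeddings from the backbone."""
    incidence = (x_obs != 0).float()          # instance-feature edges
    deg = incidence.sum(0).clamp(min=1.0)     # number of edges per feature
    # One round of mean-pooling message passing: feature <- its instances.
    return (incidence.t() @ inst_emb) / deg.unsqueeze(1)

# Usage: a previously unseen feature column gets an embedding purely from
# the instances where it appears, without retraining the backbone.
x_obs = torch.randn(64, 10) * (torch.rand(64, 10) > 0.3)
inst_emb = torch.randn(64, 8)
feat_emb = extrapolate_feature_embeddings(x_obs, inst_emb)  # (10, 8)
```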
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme to overcome this problem and thereby strengthen continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
- Quantifying Challenges in the Application of Graph Representation Learning [0.0]
We provide an application-oriented perspective on a set of popular embedding approaches.
We evaluate their representational power with respect to real-world graph properties.
Our results suggest that "one-to-fit-all" GRL approaches are hard to define in real-world scenarios.
arXiv Detail & Related papers (2020-06-18T03:19:43Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received a lot of interest in recent times.
In this paper, we utilize theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better (a hedged sketch follows this entry).
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
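The abstract above does not specify the masking mechanism, so below is a hedged PyTorch sketch of one plausible reading: during training, a random subset of nodes is hidden as message senders, akin to dropout applied over nodes. The `mask_nodes` helper is a hypothetical name, not the paper's method.

```python
# Hedged sketch of node masking (assumption: masked nodes stop sending
# messages during training, so the GNN cannot over-rely on any one neighbor).
import torch

def mask_nodes(adj_norm: torch.Tensor, p: float = 0.1) -> torch.Tensor:
    """Zero out masked nodes' outgoing messages in a normalized adjacency."""
    keep = (torch.rand(adj_norm.size(0)) > p).float()   # 1 = node kept
    # Zeroing column j removes node j as a message *sender*;
    # receivers still aggregate from their remaining neighbors.
    return adj_norm * keep.unsqueeze(0)

# Usage during training only; at inference, pass the full adj_norm.
adj = torch.rand(5, 5)
masked = mask_nodes(adj, p=0.2)
```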
This list is automatically generated from the titles and abstracts of the papers on this site.