Introduction to Graph Neural Networks: A Starting Point for Machine Learning Engineers
- URL: http://arxiv.org/abs/2412.19419v1
- Date: Fri, 27 Dec 2024 03:13:02 GMT
- Title: Introduction to Graph Neural Networks: A Starting Point for Machine Learning Engineers
- Authors: James H. Tanis, Chris Giannella, Adrian V. Mariano
- Abstract summary: Graph neural networks are deep neural networks designed for graphs with attributes attached to nodes or edges.
This survey introduces graph neural networks through the encoder-decoder framework and provides examples of decoders for a range of graph analytic tasks.
- Abstract: Graph neural networks are deep neural networks designed for graphs with attributes attached to nodes or edges. The number of research papers in the literature concerning these models is growing rapidly due to their impressive performance on a broad range of tasks. This survey introduces graph neural networks through the encoder-decoder framework and provides examples of decoders for a range of graph analytic tasks. It uses theory and numerous experiments on homogeneous graphs to illustrate the behavior of graph neural networks for different training sizes and degrees of graph complexity.
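As a rough illustration of the encoder-decoder framing used in the survey, the sketch below (plain PyTorch, not the authors' code) pairs a two-layer graph convolutional encoder with an inner-product decoder for link prediction; the layer sizes and the dense normalized adjacency are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

class GCNEncoder(nn.Module):
    """Two-layer graph convolutional encoder: features X and adjacency A -> embeddings Z."""
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, out_dim)

    def forward(self, x, adj):
        # Symmetrically normalize the adjacency (with self-loops added).
        a = adj + torch.eye(adj.size(0))
        d = a.sum(dim=1).pow(-0.5)
        a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)
        h = torch.relu(a_norm @ self.w1(x))   # first propagation + nonlinearity
        return a_norm @ self.w2(h)            # second propagation -> node embeddings

class InnerProductDecoder(nn.Module):
    """Link-prediction decoder: score(i, j) = sigmoid(z_i . z_j)."""
    def forward(self, z):
        return torch.sigmoid(z @ z.t())

# Toy usage: 5 nodes, 8 input features, symmetric dense adjacency.
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj = torch.triu(adj, 1)
adj = adj + adj.t()
z = GCNEncoder(8, 16, 4)(x, adj)
edge_probs = InnerProductDecoder()(z)   # 5 x 5 matrix of predicted edge probabilities
```

Swapping the decoder (for example, a softmax classifier over node embeddings) adapts the same encoder to other graph analytic tasks, which is the point of the encoder-decoder view.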
Related papers
- Superhypergraph Neural Networks and Plithogenic Graph Neural Networks: Theoretical Foundations [0.0]
Hypergraphs extend traditional graphs by allowing edges to connect multiple nodes, while superhypergraphs further generalize this concept to represent even more complex relationships.
Graph Neural Networks (GNNs), a well-established framework, have recently been extended to Hypergraph Neural Networks (HGNNs).
This paper establishes the theoretical foundation for the development of SuperHyperGraph Neural Networks (SHGNNs) and Plithogenic Graph Neural Networks.
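For context, a common way to handle hyperedges that connect several nodes is to propagate node features to hyperedges and back through the incidence matrix. The sketch below illustrates only that generic pattern; it is not the SuperHyperGraph or Plithogenic model proposed in the paper.

```python
import torch

def hypergraph_conv(x, incidence, weight):
    """One generic hypergraph convolution step.

    x:         (num_nodes, in_dim) node features
    incidence: (num_nodes, num_hyperedges) binary matrix; H[v, e] = 1 if node v is in hyperedge e
    weight:    (in_dim, out_dim) learnable projection
    """
    dv = incidence.sum(dim=1).clamp(min=1)   # node degrees
    de = incidence.sum(dim=0).clamp(min=1)   # hyperedge sizes
    # Aggregate node features into hyperedges, then scatter them back to nodes.
    edge_feat = (incidence.t() @ x) / de.unsqueeze(1)
    node_feat = (incidence @ edge_feat) / dv.unsqueeze(1)
    return torch.relu(node_feat @ weight)

# Toy example: 4 nodes, 2 hyperedges ({0, 1, 2} and {2, 3}).
H = torch.tensor([[1., 0.], [1., 0.], [1., 1.], [0., 1.]])
out = hypergraph_conv(torch.randn(4, 3), H, torch.randn(3, 5))   # (4, 5) updated features
```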
arXiv Detail & Related papers (2024-12-02T06:33:02Z)
- Graphs Unveiled: Graph Neural Networks and Graph Generation [0.0]
This paper provides a comprehensive overview of Graph Neural Networks (GNNs).
We discuss the applications of graph neural networks across various domains.
We present an advanced field in GNNs: graph generation.
arXiv Detail & Related papers (2024-03-18T14:37:27Z)
- A Survey on Graph Classification and Link Prediction based on GNN [11.614366568937761]
This review article delves into graph convolutional neural networks.
It elaborates on their fundamentals.
It elucidates graph neural network models based on attention mechanisms and autoencoders.
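To make the attention-based models mentioned above concrete, here is a simplified single-head, dense-adjacency attention layer in the style of GAT; it is an illustrative sketch, not code from the surveyed papers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGraphAttention(nn.Module):
    """Single-head, dense-adjacency graph attention layer (GAT-style sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.proj(x)                                   # (N, out_dim)
        n = h.size(0)
        # Score every ordered pair (i, j) from the concatenation [h_i || h_j].
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))
        # Mask non-edges so the softmax only distributes weight over neighbors.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=1)               # attention coefficients
        return alpha @ h                                   # weighted neighbor sum

# Toy usage: 4 nodes with self-loops so every row has at least one edge.
adj = torch.eye(4) + torch.tensor([[0, 1, 0, 0], [1, 0, 1, 0],
                                   [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float)
out = SimpleGraphAttention(6, 8)(torch.randn(4, 6), adj)
```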
arXiv Detail & Related papers (2023-07-03T09:08:01Z)
- Knowledge Enhanced Graph Neural Networks for Graph Completion [0.0]
Knowledge Enhanced Graph Neural Networks (KeGNN) is a neuro-symbolic framework for graph completion.
KeGNN consists of a graph neural network as a base upon which knowledge enhancement layers are stacked.
We instantiate KeGNN in conjunction with two state-of-the-art graph neural networks, Graph Convolutional Networks and Graph Attention Networks.
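The stacking described above can be pictured as a base predictor whose outputs are refined by differentiable layers that encode prior knowledge. The sketch below is a speculative illustration of that structure, with a made-up "linked nodes tend to share labels" rule and a hypothetical learnable rule weight; it is not the actual KeGNN implementation.

```python
import torch
import torch.nn as nn

class KnowledgeEnhancementLayer(nn.Module):
    """Hypothetical enhancement layer: nudges a node's class scores toward the mean
    scores of its neighbors, encoding an illustrative 'linked nodes share labels' rule."""
    def __init__(self, num_classes):
        super().__init__()
        self.rule_weight = nn.Parameter(torch.zeros(num_classes))  # learnable rule strength

    def forward(self, logits, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neighbor_logits = (adj @ logits) / deg
        return logits + torch.sigmoid(self.rule_weight) * neighbor_logits

class EnhancedClassifier(nn.Module):
    """Base GNN/predictor followed by stacked enhancement layers."""
    def __init__(self, base, num_classes, num_enhancement_layers=2):
        super().__init__()
        self.base = base
        self.enhance = nn.ModuleList(
            [KnowledgeEnhancementLayer(num_classes) for _ in range(num_enhancement_layers)])

    def forward(self, x, adj):
        logits = self.base(x, adj)
        for layer in self.enhance:
            logits = layer(logits, adj)
        return logits

# Toy usage with a trivial per-node linear classifier as the base.
class LinearBase(nn.Module):
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.lin = nn.Linear(in_dim, num_classes)
    def forward(self, x, adj):
        return self.lin(x)

adj = torch.tensor([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
logits = EnhancedClassifier(LinearBase(4, 3), num_classes=3)(torch.randn(3, 4), adj)
```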
arXiv Detail & Related papers (2023-03-27T07:53:43Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
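Unrolling means turning a fixed number of iterations of an optimization algorithm into network layers with learnable parameters. The sketch below unrolls a generic proximal gradient loop with a soft-threshold proximal step; the quadratic data-fit objective and the per-iteration parameters are placeholders, not the actual GDN layers.

```python
import torch
import torch.nn as nn

class UnrolledProximalGradient(nn.Module):
    """Generic unrolled, truncated proximal gradient descent.

    Each 'layer' is one iteration  A <- prox(A - step * grad f(A))  with its own
    learnable step size and soft-threshold. The data-fit term f used for the
    gradient is a placeholder, not the GDN objective.
    """
    def __init__(self, num_iterations=5):
        super().__init__()
        self.steps = nn.Parameter(torch.full((num_iterations,), 0.1))
        self.thresholds = nn.Parameter(torch.full((num_iterations,), 0.01))

    def forward(self, observed):
        latent = observed.clone()                 # initialize the latent graph estimate
        for step, thr in zip(self.steps, self.thresholds):
            grad = latent - observed              # gradient of 1/2 ||A - A_obs||^2
            latent = latent - step * grad         # gradient step
            # Proximal step: soft-thresholding promotes sparse edge weights.
            latent = torch.sign(latent) * torch.clamp(latent.abs() - thr, min=0)
        return latent

# Toy usage: sparsify a noisy observed adjacency matrix.
observed = torch.rand(6, 6)
observed = (observed + observed.t()) / 2          # symmetrize
latent_graph = UnrolledProximalGradient(num_iterations=5)(observed)
```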
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
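A basic building block behind hyperbolic GNNs is mapping Euclidean vectors into a hyperbolic model such as the Poincaré ball and measuring distances there. The sketch below shows the standard exponential map at the origin and the Poincaré distance with curvature fixed at 1; it illustrates only the geometry, not any specific architecture from the review.

```python
import torch

def exp_map_origin(v, eps=1e-6):
    """Exponential map at the origin of the Poincare ball (curvature c = 1):
    sends a Euclidean tangent vector v to a point strictly inside the unit ball."""
    norm = v.norm(dim=-1, keepdim=True).clamp(min=eps)
    return torch.tanh(norm) * v / norm

def poincare_distance(x, y, eps=1e-6):
    """Geodesic distance between two points inside the Poincare ball."""
    sq_diff = (x - y).pow(2).sum(dim=-1)
    denom = (1 - x.pow(2).sum(dim=-1)) * (1 - y.pow(2).sum(dim=-1))
    return torch.acosh(1 + 2 * sq_diff / denom.clamp(min=eps))

# Toy usage: embed Euclidean node features and compare two nodes hyperbolically.
features = torch.randn(5, 3)
ball_points = exp_map_origin(features)        # every row now has norm < 1
d01 = poincare_distance(ball_points[0], ball_points[1])
```

Tree-like graphs benefit because hyperbolic volume grows exponentially with radius, which matches the exponential growth of nodes in trees.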
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Capsule Graph Neural Networks with EM Routing [8.632437524560133]
This paper proposes a novel Capsule Graph Neural Network that uses the EM routing mechanism (CapsGNNEM) to generate high-quality graph embeddings.
Experimental results on a number of real-world graph datasets demonstrate that the proposed CapsGNNEM outperforms nine state-of-the-art models in graph classification tasks.
arXiv Detail & Related papers (2021-10-18T06:23:37Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
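The strategy of starting from a moderately sized graph and growing it can be sketched as training one model on progressively larger node-induced subgraphs sampled from the full graph. The growth schedule, placeholder classifier, and single propagation step below are illustrative assumptions, not the paper's algorithm.

```python
import torch
import torch.nn as nn

def train_on_growing_graphs(features, adj, labels, sizes=(100, 200, 400, 800),
                            epochs_per_stage=50):
    """Train one model on progressively larger node-induced subgraphs.

    features: (N, F) node features, adj: (N, N) adjacency, labels: (N,) class ids.
    'sizes' is an illustrative growth schedule, capped at the full graph size.
    """
    num_classes = int(labels.max()) + 1
    model = nn.Linear(features.size(1), num_classes)    # placeholder node classifier
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    for size in sizes:
        size = min(size, features.size(0))
        idx = torch.randperm(features.size(0))[:size]   # sample a node-induced subgraph
        sub_x, sub_adj, sub_y = features[idx], adj[idx][:, idx], labels[idx]
        # One mean-aggregation step so predictions depend on the subgraph edges.
        deg = sub_adj.sum(dim=1, keepdim=True).clamp(min=1)
        sub_x = (sub_adj @ sub_x) / deg
        for _ in range(epochs_per_stage):
            optimizer.zero_grad()
            loss = loss_fn(model(sub_x), sub_y)
            loss.backward()
            optimizer.step()
    return model

# Toy usage on a random graph with 1000 nodes and 3 classes.
n = 1000
adj = (torch.rand(n, n) > 0.99).float()
model = train_on_growing_graphs(torch.randn(n, 16), adj, torch.randint(0, 3, (n,)))
```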
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed in deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
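A simplified sketch in the spirit of adaptively combining information from different propagation depths: representations from 0 to K hops are mixed with per-node learned retention scores. The layer below is an illustration of that idea, not the official DAGNN code.

```python
import torch
import torch.nn as nn

class AdaptiveDepthAggregation(nn.Module):
    """Sketch of adaptive multi-hop aggregation: representations from 0..K
    propagation hops are combined with per-node, per-hop learned scores, so each
    node decides how much of its large receptive field to use."""
    def __init__(self, in_dim, hid_dim, num_hops=10):
        super().__init__()
        self.transform = nn.Linear(in_dim, hid_dim)
        self.score = nn.Linear(hid_dim, 1)
        self.num_hops = num_hops

    def forward(self, x, adj_norm):
        h = torch.relu(self.transform(x))           # feature transformation first
        hops = [h]
        for _ in range(self.num_hops):              # parameter-free propagation
            hops.append(adj_norm @ hops[-1])
        stacked = torch.stack(hops, dim=1)          # (N, K+1, hid_dim)
        gates = torch.sigmoid(self.score(stacked))  # (N, K+1, 1) retention scores
        return (gates * stacked).sum(dim=1)         # adaptive combination

# Toy usage with a row-normalized adjacency.
adj = (torch.rand(6, 6) > 0.5).float()
adj_norm = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)
out = AdaptiveDepthAggregation(8, 16, num_hops=4)(torch.randn(6, 8), adj_norm)
```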
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
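Contrastive pre-training of this kind pulls together embeddings of two views of the same (sub)graph and pushes apart embeddings of different ones, typically with an InfoNCE-style loss. The sketch below shows only such a loss over precomputed embeddings; GCC's subgraph augmentations and encoder are not reproduced here.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.07):
    """InfoNCE-style contrastive loss over two batches of embeddings.

    z1[i] and z2[i] are embeddings of two views of the same (sub)graph; every
    other pairing in the batch serves as a negative example.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature        # (B, B) cosine-similarity matrix
    targets = torch.arange(z1.size(0))        # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: 32 subgraph pairs with 64-dimensional embeddings from some GNN encoder.
loss = info_nce_loss(torch.randn(32, 64), torch.randn(32, 64))
loss_value = loss.item()
```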
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.