Graph Neural Networks for Node-Level Predictions
- URL: http://arxiv.org/abs/2007.08649v1
- Date: Mon, 22 Jun 2020 11:57:03 GMT
- Title: Graph Neural Networks for Node-Level Predictions
- Authors: Christoph Heindl
- Abstract summary: Deep learning has revolutionized many fields of research, including computer vision and text and speech processing.
This work aims to provide an overview of early and modern graph neural network-based machine learning methods for node-level prediction tasks.
- Score: 0.7310043452300736
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The success of deep learning has revolutionized many fields of research
including areas of computer vision, text and speech processing. Enormous
research efforts have led to numerous methods that are capable of efficiently
analyzing data, especially in the Euclidean space. However, many problems are
posed in non-Euclidean domains modeled as general graphs with complex
connection patterns. Increased problem complexity and computational power
constraints have limited early approaches to static and small-sized graphs. In
recent years, a rising interest in machine learning on graph-structured data
has been accompanied by improved methods that overcome the limitations of their
predecessors. These methods paved the way for dealing with large-scale and
time-dynamic graphs. This work aims to provide an overview of early and modern
graph neural network-based machine learning methods for node-level prediction
tasks. Under the umbrella of taxonomies already established in the literature,
we explain the core concepts and provide detailed explanations for
convolutional methods that have had a strong impact. In addition, we introduce
common benchmarks and present selected applications from various areas.
Finally, we discuss open problems for further research.
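To make the node-level setting concrete, the following is a minimal NumPy sketch (not taken from the paper) of a single graph-convolution layer in the style popularized by Kipf and Welling: node representations are updated by symmetrically normalized neighborhood aggregation, followed by a linear transform and a nonlinearity. The toy graph, feature dimensions, and all names are illustrative assumptions.

```python
import numpy as np

def gcn_layer(adj, feats, weights, activation=np.tanh):
    """One graph-convolution layer: normalized neighborhood aggregation.

    adj:     (n, n) binary adjacency matrix of the graph
    feats:   (n, d_in) node feature matrix H
    weights: (d_in, d_out) weight matrix W (would normally be learned)
    """
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = a_hat.sum(axis=1)                      # degrees of A-hat
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))     # D-hat^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt     # symmetric normalization
    return activation(a_norm @ feats @ weights)  # aggregate, transform, activate

# Toy example: 4 nodes on a path graph, 3-dimensional input features,
# projected to 2 hidden units.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.random.randn(4, 3)
weights = np.random.randn(3, 2)
hidden = gcn_layer(adj, feats, weights)
print(hidden.shape)  # (4, 2): one representation per node
```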
Related papers
- Foundations and Frontiers of Graph Learning Theory [81.39078977407719]
Recent advancements in graph learning have revolutionized the way to understand and analyze data with complex structures.
Graph Neural Networks (GNNs), i.e. neural network architectures designed for learning graph representations, have become a popular paradigm.
This article provides a comprehensive summary of the theoretical foundations and breakthroughs concerning the approximation and learning behaviors intrinsic to prevalent graph learning models.
arXiv Detail & Related papers (2024-07-03T14:07:41Z)
- A Survey of Data-Efficient Graph Learning [16.053913182723143]
We introduce a novel concept of Data-Efficient Graph Learning (DEGL) as a research frontier.
We systematically review recent advances on several key aspects, including self-supervised graph learning, semi-supervised graph learning, and few-shot graph learning.
arXiv Detail & Related papers (2024-02-01T09:28:48Z)
- A Comprehensive Survey on Deep Graph Representation Learning [26.24869157855632]
Graph representation learning aims to encode high-dimensional sparse graph-structured data into low-dimensional dense vectors.
Traditional methods have limited model capacity, which restricts learning performance.
Deep graph representation learning has shown great potential and advantages over shallow (traditional) methods.
arXiv Detail & Related papers (2023-04-11T08:23:52Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
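As a point of contrast with node-level prediction, graph-level learning ultimately needs a single vector per graph; a common, permutation-invariant way to obtain one is to pool node embeddings with a readout such as mean/max pooling. A minimal illustrative sketch, not taken from the surveyed paper:

```python
import numpy as np

def graph_readout(node_embeddings):
    """Permutation-invariant readout: concatenate mean and max pooling
    of the node embeddings to obtain a single graph-level vector."""
    mean_pool = node_embeddings.mean(axis=0)
    max_pool = node_embeddings.max(axis=0)
    return np.concatenate([mean_pool, max_pool])

graph_vec = graph_readout(np.random.randn(5, 8))  # 5 nodes, 8-dim embeddings
print(graph_vec.shape)  # (16,)
```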
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- A Complex Network based Graph Embedding Method for Link Prediction [0.0]
We present a novel graph embedding approach based on the popularity-similarity and local attraction paradigms.
We show, using extensive experimental analysis, that the proposed method outperforms state-of-the-art graph embedding algorithms.
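The popularity-similarity model itself is not reproduced here; as a generic illustration of how node embeddings feed a link-prediction task, a candidate edge is often scored by the sigmoid of the inner product of its endpoint embeddings. All names below are illustrative:

```python
import numpy as np

def link_score(emb_u, emb_v):
    """Probability-like score for a candidate edge (u, v):
    sigmoid of the inner product of the two node embeddings."""
    return 1.0 / (1.0 + np.exp(-np.dot(emb_u, emb_v)))

emb = np.random.randn(10, 16)      # embeddings for 10 nodes
print(link_score(emb[0], emb[3]))  # score for candidate edge (0, 3)
```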
arXiv Detail & Related papers (2022-09-11T14:46:38Z)
- Learning node embeddings via summary graphs: a brief theoretical analysis [55.25628709267215]
Graph representation learning plays an important role in many graph mining applications, but learning embeddings of large-scale graphs remains a problem.
Recent works try to improve scalability via graph summarization -- i.e., they learn embeddings on a smaller summary graph, and then restore the node embeddings of the original graph.
We give an in-depth theoretical analysis of three specific embedding learning methods based on the introduced kernel matrix.
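A rough sketch of the summarization idea (independent of the paper's kernel-matrix analysis): embeddings are learned on a smaller summary graph of supernodes, and each original node is then assigned its supernode's embedding. The grouping and names are illustrative assumptions:

```python
import numpy as np

def restore_embeddings(summary_embeddings, node_to_supernode):
    """Map each original node to the embedding of its supernode.

    summary_embeddings: (k, d) embeddings learned on the summary graph
    node_to_supernode:  length-n array; entry i is the supernode of node i
    """
    return summary_embeddings[node_to_supernode]

summary_emb = np.random.randn(3, 4)     # 3 supernodes, 4-dim embeddings
mapping = np.array([0, 0, 1, 2, 2, 1])  # 6 original nodes
node_emb = restore_embeddings(summary_emb, mapping)
print(node_emb.shape)  # (6, 4)
```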
arXiv Detail & Related papers (2022-07-04T04:09:50Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
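A standard building block in hyperbolic GNNs is the exponential map at the origin of the Poincare ball, exp_0^c(v) = tanh(sqrt(c)*||v||) * v / (sqrt(c)*||v||), which carries Euclidean tangent vectors into hyperbolic space. A minimal sketch; the curvature value and names are illustrative:

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-12):
    """Exponential map at the origin of the Poincare ball with curvature -c:
    maps a Euclidean tangent vector v onto the ball."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(v), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

x = expmap0(np.random.randn(8))
print(np.linalg.norm(x) < 1.0)  # True: the image lies inside the unit ball
```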
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Stacking many such layers, however, tends to degrade performance; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
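DAGNN decouples feature transformation from propagation: transformed node features are propagated over several hops, and the per-hop representations are combined with node-wise gates. The sketch below is a simplified stand-in (softmax gating over per-hop scores rather than the paper's exact adaptive adjustment mechanism); all names are illustrative:

```python
import numpy as np

def dagnn_propagate(adj, h0, gate_w, hops=4):
    """Decoupled propagation: run `hops` steps of normalized neighborhood
    aggregation on already-transformed features h0, then combine the per-hop
    representations with node-wise softmax gates (an illustrative stand-in
    for DAGNN's learned adaptive adjustment)."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt

    reps = [h0]
    for _ in range(hops):
        reps.append(a_norm @ reps[-1])                   # one more hop
    stacked = np.stack(reps, axis=1)                     # (n, hops + 1, d)

    scores = stacked @ gate_w                            # (n, hops + 1)
    gates = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return (gates[..., None] * stacked).sum(axis=1)      # adaptive combination

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
h0 = np.random.randn(3, 5)                               # transformed features
out = dagnn_propagate(adj, h0, gate_w=np.random.randn(5))
print(out.shape)  # (3, 5)
```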
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
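The affine skip connection amounts to adding an affine transform of the layer input to the output of a graph convolution, i.e. output = GraphConv(A, X) + X W_skip + b_skip. A minimal, self-contained sketch, with a plain row-normalized convolution standing in for "any graph convolution operator"; names are illustrative:

```python
import numpy as np

def affine_skip_block(adj, feats, conv_w, skip_w, skip_b):
    """Graph convolution plus an affine skip connection on the input:
    output = GraphConv(A, X) + X @ W_skip + b_skip."""
    a_hat = adj + np.eye(adj.shape[0])                 # self-loops
    a_norm = a_hat / a_hat.sum(axis=1, keepdims=True)  # row-normalized aggregation
    conv_out = a_norm @ feats @ conv_w                 # plain graph convolution
    return conv_out + feats @ skip_w + skip_b          # affine skip connection

adj = np.array([[0, 1], [1, 0]], dtype=float)
feats = np.random.randn(2, 3)
out = affine_skip_block(adj, feats,
                        conv_w=np.random.randn(3, 4),
                        skip_w=np.random.randn(3, 4),
                        skip_b=np.zeros(4))
print(out.shape)  # (2, 4)
```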
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
- A Gentle Introduction to Deep Learning for Graphs [23.809161531445053]
This work is designed as a tutorial introduction to the field of deep learning for graphs.
It introduces a general formulation of graph representation learning based on a local and iterative approach to structured information processing.
It introduces the basic building blocks that can be combined to design novel and effective neural models for graphs.
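The "local and iterative" formulation is generic message passing: in each round, every node aggregates messages from its neighbors and updates its own state. A minimal illustrative sketch with sum aggregation and a simple tanh update (not the tutorial's notation):

```python
import numpy as np

def message_passing(adj, states, rounds=2):
    """Generic message-passing iteration: each node sums its neighbors'
    states (the messages) and updates its own state with a simple
    nonlinear combination of old state and aggregated message."""
    for _ in range(rounds):
        messages = adj @ states              # sum of neighbor states
        states = np.tanh(states + messages)  # local update rule
    return states

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
print(message_passing(adj, np.random.randn(3, 4)).shape)  # (3, 4)
```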
arXiv Detail & Related papers (2019-12-29T16:43:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.