Deep Learning for Learning Graph Representations
- URL: http://arxiv.org/abs/2001.00293v1
- Date: Thu, 2 Jan 2020 02:13:28 GMT
- Title: Deep Learning for Learning Graph Representations
- Authors: Wenwu Zhu, Xin Wang, Peng Cui
- Abstract summary: Mining graph data has become a popular research topic in computer science.
The huge amount of network data has posed great challenges for efficient analysis.
This motivates the advent of graph representation, which maps a graph into a low-dimensional vector space.
- Score: 58.649784596090385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mining graph data has become a popular research topic in computer
science and has been widely studied in both academia and industry given the
increasing amount of network data in recent years. However, the huge amount of
network data has posed great challenges for efficient analysis. This motivates
the advent of graph representation, which maps a graph into a low-dimensional
vector space while preserving the original graph structure and supporting graph
inference. The investigation of efficient graph representations has profound
theoretical significance and important practical value; we therefore introduce
some basic ideas in graph representation/network embedding as well as some
representative models in this chapter.
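As a concrete illustration of what mapping a graph into a low-dimensional vector space while preserving its structure can look like, below is a minimal sketch (not taken from the chapter) that factorizes a toy adjacency matrix with a truncated SVD to obtain one vector per node. The toy graph, the embedding dimension `dim`, and the function name `embed_nodes` are illustrative assumptions rather than part of the surveyed models.

```python
import numpy as np

def embed_nodes(adj: np.ndarray, dim: int = 2) -> np.ndarray:
    """Map each node to a `dim`-dimensional vector by truncated SVD of the
    adjacency matrix, so nodes with similar neighborhoods land close together.
    This is only an illustrative baseline, not one of the chapter's models."""
    u, s, _ = np.linalg.svd(adj)          # full SVD of the adjacency matrix
    return u[:, :dim] * np.sqrt(s[:dim])  # keep the top-`dim` directions

# Toy graph: a 4-node path 0-1-2-3 given as a symmetric adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

Z = embed_nodes(A, dim=2)
print(Z.shape)  # (4, 2): one low-dimensional vector per node
```

Random-walk-based and neural embedding methods follow the same idea of assigning each node a low-dimensional vector, but learn the mapping rather than computing it in closed form.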
Related papers
- Parametric Graph Representations in the Era of Foundation Models: A Survey and Position [69.48708136448694]
Graphs have been widely used over the past decades of big data and AI to model comprehensive relational data.
Identifying meaningful graph laws can significantly enhance the effectiveness of various applications.
arXiv Detail & Related papers (2024-10-16T00:01:31Z) - Does Graph Prompt Work? A Data Operation Perspective with Theoretical Analysis [7.309233340654514]
This paper introduces a theoretical framework that rigorously analyzes graph prompting from a data operation perspective.
We provide a formal guarantee theorem, demonstrating graph prompts' capacity to approximate graph transformation operators.
We derive upper bounds on the error of these data operations by graph prompts for a single graph and extend this discussion to batches of graphs.
arXiv Detail & Related papers (2024-10-02T15:07:13Z) - A Comprehensive Survey on Graph Summarization with Graph Neural Networks [21.337505372979066]
In the past, most graph summarization techniques sought to capture the most important part of a graph statistically.
Today, the high dimensionality and complexity of modern graph data are making deep learning techniques more popular.
Our investigation includes a review of the current state-of-the-art approaches, including recurrent GNNs, convolutional GNNs, graph autoencoders, and graph attention networks.
arXiv Detail & Related papers (2023-02-13T05:43:24Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Graph Pooling for Graph Neural Networks: Progress, Challenges, and Opportunities [128.55790219377315]
Graph neural networks have emerged as a leading architecture for many graph-level tasks.
Graph pooling is indispensable for obtaining a holistic graph-level representation of the whole graph (see the aggregation-and-pooling sketch after this list).
arXiv Detail & Related papers (2022-04-15T04:02:06Z) - Data Augmentation for Deep Graph Learning: A Survey [66.04015540536027]
We first propose a taxonomy for graph data augmentation and then provide a structured review by categorizing the related work based on the augmented information modalities.
Focusing on two challenging problems in deep graph learning (DGL), i.e., optimal graph learning and low-resource graph learning, we also discuss and review the existing learning paradigms that are based on graph data augmentation (a minimal edge-dropping sketch appears after this list).
arXiv Detail & Related papers (2022-02-16T18:30:33Z) - Learning Graph Representations [0.0]
Graph Neural Networks (GNNs) are efficient ways to get insight into large dynamic graph datasets.
In this paper, we discuss graph convolutional neural networks, graph autoencoders, and social-temporal graph neural networks.
arXiv Detail & Related papers (2021-02-03T12:07:55Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations (the sketch after this list shows a generic aggregation step).
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Unsupervised Graph Representation by Periphery and Hierarchical Information Maximization [18.7475578342125]
The advent of graph neural networks has improved the state of the art for representing both nodes and entire graphs in a vector space.
For entire-graph representation, most existing graph neural networks are trained with a graph classification loss in a supervised way.
We propose an unsupervised graph neural network to generate a vector representation of an entire graph in this paper.
arXiv Detail & Related papers (2020-06-08T15:50:40Z)
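The neighborhood aggregation mentioned in the Towards Deeper Graph Neural Networks entry and the graph-level readout discussed in the Graph Pooling entry can be illustrated with a short, self-contained sketch. The snippet below applies a generic mean-aggregation layer and then mean-pools the node states into one graph vector; it is not a reimplementation of DAGNN or of any pooling method from the surveys, and the names `gcn_layer`, `mean_pool`, and the toy graph are assumptions made only to fix intuition.

```python
import numpy as np

def gcn_layer(adj: np.ndarray, feats: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """One generic neighborhood-aggregation step: each node averages its
    neighbors (and itself), then applies a learned linear map plus ReLU."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg_inv = 1.0 / a_hat.sum(axis=1, keepdims=True)
    aggregated = deg_inv * (a_hat @ feats)        # row-normalized aggregation
    return np.maximum(aggregated @ weight, 0.0)   # linear transform + ReLU

def mean_pool(node_states: np.ndarray) -> np.ndarray:
    """Readout: pool all node vectors into a single graph-level vector."""
    return node_states.mean(axis=0)

# Toy graph: a triangle with 3-dimensional node features and a 3->2 layer.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(3, 3))
W = np.random.default_rng(1).normal(size=(3, 2))

graph_vector = mean_pool(gcn_layer(A, X, W))
print(graph_vector.shape)  # (2,): one holistic vector for the whole graph
```

Stacking several such aggregation layers enlarges each node's receptive field, which is exactly the regime where the over-smoothing issue discussed above becomes relevant.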
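The Data Augmentation for Deep Graph Learning entry above categorizes augmentations by the information modality they perturb. As one hedged example of a structure-level augmentation (a common choice in the literature, not a method prescribed by that survey), the sketch below randomly drops edges from an undirected graph; the function name `drop_edges`, the drop rate, and the toy graph are assumptions.

```python
import numpy as np

def drop_edges(adj: np.ndarray, drop_rate: float = 0.2, seed: int = 0) -> np.ndarray:
    """Structure-level augmentation: randomly remove a fraction of the edges of
    an undirected graph and return the perturbed adjacency matrix."""
    rng = np.random.default_rng(seed)
    aug = adj.copy()
    rows, cols = np.triu_indices_from(aug, k=1)   # each undirected edge once
    for i, j in zip(rows, cols):
        if aug[i, j] != 0 and rng.random() < drop_rate:
            aug[i, j] = aug[j, i] = 0.0           # drop the edge symmetrically
    return aug

# Example: perturb a toy triangle graph to create an augmented training view.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
print(drop_edges(A, drop_rate=0.5).sum() / 2)     # number of surviving edges
```

Perturbed views like this are typically used to enlarge the training set or to form contrastive pairs.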
This list is automatically generated from the titles and abstracts of the papers on this site.