A Comprehensive Analytical Survey on Unsupervised and Semi-Supervised
Graph Representation Learning Methods
- URL: http://arxiv.org/abs/2112.10372v1
- Date: Mon, 20 Dec 2021 07:50:26 GMT
- Title: A Comprehensive Analytical Survey on Unsupervised and Semi-Supervised
Graph Representation Learning Methods
- Authors: Md. Khaledur Rahman and Ariful Azad
- Abstract summary: This survey aims to evaluate all major classes of graph embedding methods.
We organized graph embedding techniques using a taxonomy that includes methods from manual feature engineering, matrix factorization, shallow neural networks, and deep graph convolutional networks.
We designed experiments on top of the PyTorch Geometric and DGL libraries and ran them on different multicore CPU and GPU platforms.
- Score: 4.486285347896372
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph representation learning is a fast-growing field where one of the main
objectives is to generate meaningful representations of graphs in
lower-dimensional spaces. The learned embeddings have been successfully applied
to perform various prediction tasks, such as link prediction, node
classification, clustering, and visualization. The collective effort of the
graph learning community has delivered hundreds of methods, but no single
method excels under all evaluation metrics such as prediction accuracy, running
time, scalability, etc. This survey aims to evaluate all major classes of graph
embedding methods by considering algorithmic variations, parameter selections,
scalability, hardware and software platforms, downstream ML tasks, and diverse
datasets. We organized graph embedding techniques using a taxonomy that
includes methods from manual feature engineering, matrix factorization, shallow
neural networks, and deep graph convolutional networks. We evaluated these
classes of algorithms for node classification, link prediction, clustering, and
visualization tasks using widely used benchmark graphs. We designed our
experiments on top of the PyTorch Geometric and DGL libraries and ran them
on different multicore CPU and GPU platforms. We rigorously scrutinize the
performance of embedding methods under various performance metrics and
summarize the results. Thus, this paper may serve as a comparative guide to
help users select methods that are most suitable for their tasks.
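As a small illustration of the shallow/matrix-factorization class in the survey's taxonomy, the following sketch learns 2-dimensional node embeddings for a toy graph by factorizing its adjacency matrix with plain gradient descent, then scores candidate links by inner product. This is a minimal pure-Python sketch, not the survey's benchmark setup (which uses PyTorch Geometric and DGL); the graph, dimensions, and hyperparameters are all illustrative.

```python
import random

# Toy graph: two triangles (0-1-2 and 3-4-5) joined by the bridge edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
n, dim, lr, epochs = 6, 2, 0.05, 2000

# Dense symmetric adjacency matrix A.
A = [[0.0] * n for _ in range(n)]
for u, v in edges:
    A[u][v] = A[v][u] = 1.0

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Embedding matrix Z, learned by minimizing ||A - Z Z^T||_F^2 off-diagonal.
random.seed(0)
Z = [[random.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(n)]

for _ in range(epochs):
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            err = dot(Z[i], Z[j]) - A[i][j]  # reconstruction error for pair (i, j)
            for k in range(dim):
                gi = err * Z[j][k]           # gradient w.r.t. Z[i][k]
                gj = err * Z[i][k]           # gradient w.r.t. Z[j][k]
                Z[i][k] -= lr * gi
                Z[j][k] -= lr * gj

# Link prediction score: inner product of the two endpoint embeddings.
score_edge = dot(Z[0], Z[1])      # an observed edge inside a triangle
score_nonedge = dot(Z[0], Z[5])   # endpoints in different triangles
print(round(score_edge, 2), round(score_nonedge, 2))
```

An observed edge should score higher than a cross-community non-edge, which is exactly the signal a downstream link-prediction evaluation measures.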
Related papers
- Knowledge Probing for Graph Representation Learning [12.960185655357495]
We propose a novel graph probing framework (GraphProbe) to investigate and interpret whether the family of graph learning methods has encoded different levels of knowledge in graph representation learning.
Based on the intrinsic properties of graphs, we design three probes to systematically investigate the graph representation learning process from different perspectives.
We construct a thorough evaluation benchmark with nine representative graph learning methods from random walk based approaches, basic graph neural networks and self-supervised graph methods, and probe them on six benchmark datasets for node classification, link prediction and graph classification.
arXiv Detail & Related papers (2024-08-07T16:27:45Z) - SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings from the last hidden states of the fine-tuned LM.
arXiv Detail & Related papers (2023-08-03T07:00:04Z) - Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
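For context (the summary above does not spell it out), the Bures-Wasserstein distance that underlies this family of graph means is the 2-Wasserstein distance between centered Gaussian distributions with covariance matrices $\Sigma_1$ and $\Sigma_2$, standardly written as:

```latex
d_{\mathrm{BW}}^{2}(\Sigma_1, \Sigma_2)
  = \operatorname{tr}(\Sigma_1) + \operatorname{tr}(\Sigma_2)
  - 2\,\operatorname{tr}\!\Bigl(\bigl(\Sigma_1^{1/2}\,\Sigma_2\,\Sigma_1^{1/2}\bigr)^{1/2}\Bigr)
```

Averaging in this metric is what lets the recovered mean graph preserve structural (covariance) information rather than averaging adjacency matrices entrywise.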
arXiv Detail & Related papers (2023-05-31T11:04:53Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Hub-aware Random Walk Graph Embedding Methods for Classification [44.99833362998488]
We propose two novel graph embedding algorithms based on random walks that are specifically designed for the node classification problem.
The proposed methods are experimentally evaluated by analyzing the classification performance of three classification algorithms trained on embeddings of real-world networks.
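To make the idea concrete, here is a sketch of a hub-aware random walk on a toy graph. The summary does not give the paper's exact biasing scheme, so this sketch assumes a simple inverse-degree bias (high-degree hubs are visited less often); the graph and all names are illustrative. The resulting walk corpus would then feed a skip-gram model, as in DeepWalk/node2vec.

```python
import random

# Toy graph with a hub: node 0 connects to every other node.
adj = {
    0: [1, 2, 3, 4],
    1: [0, 2],
    2: [0, 1],
    3: [0, 4],
    4: [0, 3],
}

def hub_aware_walk(adj, start, length, rng):
    """Random walk whose transition probability to a neighbour v is
    proportional to 1/deg(v), down-weighting high-degree hubs."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj[walk[-1]]
        weights = [1.0 / len(adj[v]) for v in nbrs]  # inverse-degree bias
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

rng = random.Random(42)
walks = [hub_aware_walk(adj, s, 10, rng) for s in adj for _ in range(20)]
hub_visits = sum(w.count(0) for w in walks)
total_visits = sum(len(w) for w in walks)
print(hub_visits, total_visits)
```

Under a uniform walk the hub would dominate the corpus; the inverse-degree bias spreads visits more evenly, which is the kind of effect a hub-aware embedding method exploits for node classification.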
arXiv Detail & Related papers (2022-09-15T20:41:18Z) - Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
The proposed algorithm is time-efficient, obtains stable results, and scales well with data size.
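For reference (not spelled out in the summary), the Schatten $p$-norm of a matrix $X$ with singular values $\sigma_1, \dots, \sigma_r$ is

```latex
\|X\|_{S_p} = \Bigl(\sum_{i=1}^{r} \sigma_i^{p}\Bigr)^{1/p}
```

For $0 < p < 1$, as commonly used for low-rank regularization, this is a quasi-norm; the tensor Schatten $p$-norm extends the matrix definition to third-order tensors, typically by applying it slice-wise in the t-SVD domain.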
arXiv Detail & Related papers (2021-08-15T13:14:28Z) - A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from an implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z) - Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode graph-structured data that lives in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches state-of-the-art performance on four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z) - Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous
Graphs [21.617020380894488]
We propose a novel self-supervised auxiliary learning method to learn graph neural networks on heterogeneous graphs.
Our method can be applied to any graph neural networks in a plug-in manner without manual labeling or additional data.
arXiv Detail & Related papers (2020-07-16T12:32:11Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
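The summary does not state GCC's exact objective, but self-supervised contrastive pre-training frameworks of this kind typically optimize an InfoNCE-style loss: pull a query embedding toward its positive (e.g. another view of the same subgraph) and push it away from negatives. A minimal pure-Python sketch, with made-up toy vectors:

```python
import math

def infonce_loss(query, positive, negatives, tau=0.07):
    """InfoNCE loss for one query: negative log-softmax of the positive's
    similarity against all candidate similarities (temperature tau)."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    sims = [dot(query, positive) / tau] + [dot(query, n) / tau for n in negatives]
    m = max(sims)  # subtract the max for numerical stability
    log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
    return -(sims[0] - log_denom)

q = [1.0, 0.0]
pos = [0.9, 0.1]                      # similar to the query
negs = [[0.0, 1.0], [-1.0, 0.0]]      # dissimilar candidates
loss_good = infonce_loss(q, pos, negs)              # well-aligned positive
loss_bad = infonce_loss(q, negs[0], [pos, negs[1]])  # mismatched positive
print(loss_good < loss_bad)
```

The loss is low when the positive is the most similar candidate and high otherwise, which is the training signal that makes the pre-trained encoder transferable across graph learning tasks.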
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.