Efficiently Visualizing Large Graphs
- URL: http://arxiv.org/abs/2310.11186v1
- Date: Tue, 17 Oct 2023 12:07:14 GMT
- Title: Efficiently Visualizing Large Graphs
- Authors: Xinyu Li, Yao Xiao, Yuchen Zhou
- Abstract summary: We propose a novel dimension reduction method for graph visualization, called t-Distributed Stochastic Graph Neighbor Embedding (t-SGNE).
t-SGNE is specifically designed to visualize cluster structures in the graph.
To suit t-SGNE, we combined Laplacian Eigenmaps with the shortest path algorithm in graphs to form the graph embedding algorithm ShortestPath Laplacian Eigenmaps Embedding (SPLEE).
- Score: 18.764862799181053
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Most existing graph visualization methods based on dimension reduction are
limited to relatively small graphs due to performance issues. In this work, we
propose a novel dimension reduction method for graph visualization, called
t-Distributed Stochastic Graph Neighbor Embedding (t-SGNE). t-SGNE is
specifically designed to visualize cluster structures in the graph. As a
variant of the standard t-SNE method, t-SGNE avoids the time-consuming
computations of pairwise similarity. Instead, it uses the neighbor structures
of the graph to reduce the time complexity from quadratic to linear, thus
supporting larger graphs. In addition, to suit t-SGNE, we combined Laplacian
Eigenmaps with the shortest path algorithm in graphs to form the graph
embedding algorithm ShortestPath Laplacian Eigenmaps Embedding (SPLEE).
Performing SPLEE to obtain a high-dimensional embedding of the large-scale
graph and then using t-SGNE to reduce its dimension for visualization, we are
able to visualize graphs with up to 300K nodes and 1M edges within 5 minutes
and achieve approximately 10% improvement in visualization quality. Codes and
data are available at
https://github.com/Charlie-XIAO/embedding-visualization-test.
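As a rough illustration of the two-stage pipeline described in the abstract (graph embedding, then dimension reduction for visualization), the sketch below assembles an analogous workflow from off-the-shelf components: scikit-learn's SpectralEmbedding stands in for SPLEE and ordinary t-SNE stands in for t-SGNE, so neither the shortest-path weighting nor the linear-time neighbor trick of the paper is reproduced. The graph, dimensions, and parameters are illustrative assumptions; the authors' actual implementation is in the repository linked above.

```python
# Minimal sketch of an SPLEE-like -> t-SGNE-like visualization pipeline using
# stand-in components (NOT the authors' method; see the linked repository).
import networkx as nx
import numpy as np
from sklearn.manifold import SpectralEmbedding, TSNE

def visualize_graph_2d(G: nx.Graph, embed_dim: int = 32, seed: int = 0) -> np.ndarray:
    nodes = list(G.nodes())
    # Step 1: high-dimensional embedding via Laplacian Eigenmaps on the adjacency
    # matrix (the paper's SPLEE additionally folds in shortest-path information).
    A = nx.to_scipy_sparse_array(G, nodelist=nodes, format="csr").toarray()
    high_dim = SpectralEmbedding(n_components=embed_dim, affinity="precomputed",
                                 random_state=seed).fit_transform(A)
    # Step 2: reduce to 2-D with ordinary t-SNE; t-SGNE instead reuses graph
    # neighborhoods to avoid computing all pairwise similarities.
    return TSNE(n_components=2, init="pca", random_state=seed).fit_transform(high_dim)

if __name__ == "__main__":
    G = nx.connected_caveman_graph(5, 20)  # small graph with clear cluster structure
    coords = visualize_graph_2d(G)
    print(coords.shape)                    # (100, 2): one 2-D point per node
```

On graphs with hundreds of thousands of nodes this stand-in pipeline would not scale; avoiding its quadratic pairwise-similarity step is exactly the point of t-SGNE.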
Related papers
- Learning on Large Graphs using Intersecting Communities [13.053266613831447]
MPNNs iteratively update each node's representation in an input graph by aggregating messages from the node's neighbors (a minimal sketch of this update appears after this list).
MPNNs can quickly become prohibitively expensive for large graphs unless they are very sparse.
We propose approximating the input graph as an intersecting community graph (ICG) -- a combination of intersecting cliques.
arXiv Detail & Related papers (2024-05-31T09:26:26Z)
- Edge-Parallel Graph Encoder Embedding [0.0]
One-Hot Graph Encoder Embedding (GEE) uses a single, linear pass over edges and produces an embedding that converges asymptotically to the spectral embedding.
We present a parallel program in the Ligra graph engine that maps functions over the edges of the graph and uses lock-free atomic instructions to prevent data races.
arXiv Detail & Related papers (2024-02-06T21:04:57Z)
- Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]^2 in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
arXiv Detail & Related papers (2022-12-15T22:11:34Z)
- Boosting Graph Embedding on a Single GPU [3.093890460224435]
We present GOSH, a GPU-based tool for embedding large-scale graphs with minimum hardware constraints.
It employs a novel graph coarsening algorithm to enhance the impact of updates and minimize the work for embedding.
It also incorporates a decomposition schema that enables any arbitrarily large graph to be embedded with a single GPU.
arXiv Detail & Related papers (2021-10-19T15:25:04Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Wasserstein Embedding for Graph Learning [33.90471037116372]
Wasserstein Embedding for Graph Learning (WEGL) is a framework for embedding entire graphs in a vector space.
We leverage new insights on defining similarity between graphs as a function of the similarity between their node embedding distributions.
We evaluate our new graph embedding approach on various benchmark graph-property prediction tasks.
arXiv Detail & Related papers (2020-06-16T18:23:00Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
- Wasserstein-based Graph Alignment [56.84964475441094]
We cast a new formulation for the one-to-many graph alignment problem, which aims at matching a node in the smaller graph with one or more nodes in the larger graph.
We show that our method leads to significant improvements with respect to the state-of-the-art algorithms for each of these tasks.
arXiv Detail & Related papers (2020-03-12T22:31:59Z)
- Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods are proposed: unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE).
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z)
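The message-passing update summarized in the "Learning on Large Graphs using Intersecting Communities" entry above is sketched below in plain NumPy: each node aggregates its neighbors' feature vectors and mixes the result with its own state. The graph, feature dimension, and weight matrices are illustrative assumptions, not any listed paper's implementation; the per-layer cost grows with the number of edges, which is what makes large, non-sparse graphs expensive for MPNNs.

```python
# Minimal sketch of one MPNN layer: aggregate neighbor features, then apply a
# learned mix with the node's own state (illustrative only, not from any paper).
import numpy as np

def mpnn_layer(adj, h, w_self, w_nbr):
    """h'_v = ReLU(h_v @ W_self + (sum of neighbor features of v) @ W_nbr)."""
    messages = adj @ h                                   # neighbor sums; O(|E|*d) with a sparse adjacency
    return np.maximum(0.0, h @ w_self + messages @ w_nbr)

rng = np.random.default_rng(0)
n, d = 6, 4                                              # toy graph: a 6-cycle, 4-dim node features
adj = np.zeros((n, n))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]:
    adj[u, v] = adj[v, u] = 1.0
h = rng.normal(size=(n, d))                              # initial node representations
w_self, w_nbr = rng.normal(size=(d, d)), rng.normal(size=(d, d))
print(mpnn_layer(adj, h, w_self, w_nbr).shape)           # (6, 4): updated representations
```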
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.