Fast Sequence-Based Embedding with Diffusion Graphs
- URL: http://arxiv.org/abs/2001.07463v1
- Date: Tue, 21 Jan 2020 12:04:21 GMT
- Title: Fast Sequence-Based Embedding with Diffusion Graphs
- Authors: Benedek Rozemberczki and Rik Sarkar
- Abstract summary: We propose diffusion graphs as a method to rapidly generate sequences for network embedding.
Its simpler sequence generation makes it computationally more efficient than previous methods.
In a community detection task, clustering nodes in the embedding space produces better results compared to other sequence-based embedding methods.
- Score: 8.147652597876862
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: A graph embedding is a representation of graph vertices in a low-dimensional
space, which approximately preserves properties such as distances between
nodes. Vertex sequence-based embedding procedures use features extracted from
linear sequences of nodes to create embeddings using a neural network. In this
paper, we propose diffusion graphs as a method to rapidly generate vertex
sequences for network embedding. It is computationally more efficient than previous methods because its sequence generation is simpler, and it produces more accurate results. In experiments, we found that the performance relative to
other methods improves with increasing edge density in the graph. In a
community detection task, clustering nodes in the embedding space produces
better results compared to other sequence-based embedding methods.
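As a rough illustration of the idea described above, the sketch below grows a diffusion subgraph around each node and reads a vertex sequence off an Eulerian traversal of it. This is a minimal sketch using NetworkX; the cover size, number of sequences per node, and traversal details are illustrative assumptions rather than the authors' reference implementation. The resulting sequences would then be fed to a skip-gram model (for example gensim's Word2Vec), and clustering the learned vectors would give the community assignments used in the community detection experiment.
```python
# Minimal sketch (an assumption, not the paper's reference code) of
# diffusion-based vertex sequence generation for network embedding.
import random
import networkx as nx


def diffusion_sequence(graph, start, cover_size=40, max_steps=200):
    """Grow a diffusion subgraph around `start`, then read off a vertex
    sequence from an Eulerian circuit of the subgraph with every edge doubled."""
    sub = nx.Graph()
    sub.add_node(start)
    for _ in range(max_steps):
        if sub.number_of_nodes() >= cover_size:
            break
        source = random.choice(list(sub.nodes()))
        neighbors = list(graph.neighbors(source))
        if not neighbors:  # isolated start node: nothing to diffuse to
            break
        sub.add_edge(source, random.choice(neighbors))
    if sub.number_of_edges() == 0:
        return [start]
    # Doubling every edge makes all degrees even, so an Eulerian circuit exists.
    doubled = nx.MultiGraph()
    doubled.add_edges_from(list(sub.edges()) * 2)
    return [u for u, _ in nx.eulerian_circuit(doubled, source=start)]


def generate_corpus(graph, sequences_per_node=10, cover_size=40):
    """One 'sentence' per (node, repetition); these sequences would be fed to a
    skip-gram model such as gensim's Word2Vec to learn the embedding."""
    corpus = []
    for _ in range(sequences_per_node):
        for node in graph.nodes():
            seq = diffusion_sequence(graph, node, cover_size=cover_size)
            corpus.append([str(v) for v in seq])
    return corpus


if __name__ == "__main__":
    G = nx.erdos_renyi_graph(100, 0.08, seed=42)
    corpus = generate_corpus(G, sequences_per_node=2, cover_size=20)
    print(len(corpus), "sequences; first sequence starts with", corpus[0][:5])
```
Doubling the edges of the diffusion subgraph is one simple way to guarantee an Eulerian circuit, so each subgraph yields a single linear sequence covering all its edges.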
Related papers
- Virtual Node Generation for Node Classification in Sparsely-Labeled Graphs [2.0060301665996016]
This paper presents a novel node generation method that infuses a small set of high-quality synthesized nodes into the graph as additional labeled nodes.
It is compatible with most popular graph pre-training (self-supervised learning), semi-supervised learning, and meta-learning methods.
Our experiments demonstrate statistically significant performance improvements over 14 baselines on 10 publicly available datasets.
arXiv Detail & Related papers (2024-09-12T02:36:44Z) - RandAlign: A Parameter-Free Method for Regularizing Graph Convolutional Networks [13.83680253264399]
We propose RandAlign, a regularization method for graph convolutional networks.
The idea of RandAlign is to randomly align the learned embedding for each node with that of the previous layer.
We experimentally evaluate RandAlign on different graph domain tasks on seven benchmark datasets.
arXiv Detail & Related papers (2024-04-15T13:28:13Z) - Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Learning Heuristics for the Maximum Clique Enumeration Problem Using Low Dimensional Representations [0.0]
We use a learning framework for a pruning process that reduces the input graph of the maximum clique enumeration problem.
We study how different vertex representations affect the runtime performance of this method.
We observe that using local graph features in the classification process produces more accurate results when combined with a feature elimination process.
arXiv Detail & Related papers (2022-10-30T22:04:32Z) - Digraphwave: Scalable Extraction of Structural Node Embeddings via Diffusion on Directed Graphs [20.432261314154804]
Digraphwave is a scalable algorithm for extracting structural node embeddings on directed graphs.
The two embedding enhancements, named transposition and aggregation, are shown to lead to a significant increase in macro F1 score for classifying automorphic identities.
arXiv Detail & Related papers (2022-07-20T19:03:35Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs the edges may be directed, and whether to treat them as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Multilayer Graph Clustering with Optimized Node Embedding [70.1053472751897]
Multilayer graph clustering aims at dividing the graph nodes into categories or communities.
We propose a clustering-friendly embedding of the layers of a given multilayer graph.
Experiments show that our method leads to a significant improvement.
arXiv Detail & Related papers (2021-03-30T17:36:40Z) - Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that yields dyadically independent (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.