Investigating Extensions to Random Walk Based Graph Embedding
- URL: http://arxiv.org/abs/2002.07252v1
- Date: Mon, 17 Feb 2020 21:14:02 GMT
- Title: Investigating Extensions to Random Walk Based Graph Embedding
- Authors: Joerg Schloetterer, Martin Wehking, Fatemeh Salehi Rizi, Michael
Granitzer
- Abstract summary: We propose a novel extension to random walk based graph embedding, which removes a percentage of the least frequent nodes from the walks at different levels.
By this removal, we simulate more distant nodes residing in the close neighborhood of a node and hence explicitly represent their connection.
The results indicate that extensions to random walk based methods (including our own) improve the predictive performance only slightly, if at all.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graph embedding has recently gained momentum in the research community, in
particular after the introduction of random walk and neural network based
approaches. However, most embedding approaches focus on representing the local
neighborhood of nodes and fail to capture the global graph structure, i.e., to
retain the relations to distant nodes. To counter that problem, we propose a
novel extension to random walk based graph embedding, which removes a
percentage of the least frequent nodes from the walks at different levels. By
this removal, we simulate more distant nodes residing in the close neighborhood
of a node and hence explicitly represent their connection. Besides the common
evaluation tasks for graph embeddings, such as node classification and link
prediction, we evaluate and compare our approach against related methods on
shortest path approximation. The results indicate that extensions to random
walk based methods (including our own) improve the predictive performance only
slightly, if at all.
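The abstract does not come with code; as a rough illustration of the described extension, the sketch below generates plain uniform random walks and then drops a fixed percentage of the least frequent nodes before the walks would be fed to a skip-gram model. Everything in it (function names, the toy graph, the 20% removal level, and the commented gensim call) is an illustrative assumption rather than a detail taken from the paper.

```python
# Hedged sketch (not the authors' code): DeepWalk-style uniform random walks,
# followed by removal of a percentage of the least frequent nodes so that
# more distant nodes fall inside the skip-gram context window.
import random
from collections import Counter

def random_walks(adj, num_walks=10, walk_length=40, seed=0):
    """adj: dict mapping a node to the list of its neighbors."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = adj[walk[-1]]
                if not neighbors:
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

def drop_least_frequent(walks, percentage):
    """Remove the given fraction of least frequent nodes from all walks."""
    counts = Counter(node for walk in walks for node in walk)
    ordered = sorted(counts, key=counts.get)        # least frequent first
    dropped = set(ordered[:int(len(ordered) * percentage)])
    return [[n for n in walk if n not in dropped] for walk in walks]

# Toy path graph 0-1-2-3-4, given as an adjacency list.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
walks = random_walks(adj)
# The paper applies the removal at different levels (percentages); 0.2 is an
# illustrative value only.
pruned = drop_least_frequent(walks, 0.2)

# The pruned walks would then train a skip-gram model, e.g. with gensim:
# from gensim.models import Word2Vec
# model = Word2Vec([[str(n) for n in w] for w in pruned],
#                  vector_size=128, window=5, sg=1, min_count=0)
```

The sketch shows a single removal level; the paper evaluates several levels and compares against related methods on node classification, link prediction, and shortest path approximation.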
Related papers
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z) - Half-Hop: A graph upsampling approach for slowing down message passing [31.26080679115766]
We introduce a framework for improving learning in message passing neural networks.
Our approach essentially upsamples edges in the original graph by adding "slow nodes" at each edge.
Our method only modifies the input graph, making it plug-and-play and easy to use with existing models.
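As a simplified, assumed reading of the "slow nodes" idea (not the authors' implementation, whose details this summary does not specify), the sketch below splits every edge of an input graph by inserting an intermediate node.

```python
# Simplified, assumed reading of "slow nodes" (not the Half-Hop code): split
# every edge (u, v) by inserting an intermediate node, so that a message now
# needs two hops to cross what used to be a single edge.
import networkx as nx

def insert_slow_nodes(G):
    H = nx.Graph()
    H.add_nodes_from(G.nodes(data=True))
    for i, (u, v) in enumerate(G.edges()):
        slow = ("slow", i)                  # hypothetical label for the new node
        H.add_edge(u, slow)
        H.add_edge(slow, v)
    return H

G = nx.karate_club_graph()                  # 34 nodes, 78 edges
H = insert_slow_nodes(G)
print(H.number_of_nodes())                  # 34 + 78 = 112
```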
arXiv Detail & Related papers (2023-08-17T22:24:15Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Visiting Distant Neighbors in Graph Convolutional Networks [0.0]
We extend the graph convolutional network method for deep learning on graph data to higher order in terms of neighboring nodes.
We show that this higher order neighbor visiting pays off by outperforming the original model.
arXiv Detail & Related papers (2023-01-26T06:37:11Z) - Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph Matching [68.35685422301613]
We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing a node-correspondence-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of our MatchExplainer, which outperforms all state-of-the-art parametric baselines by significant margins.
arXiv Detail & Related papers (2023-01-07T05:14:45Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Detecting Communities from Heterogeneous Graphs: A Context Path-based Graph Neural Network Model [23.525079144108567]
We build a Context Path-based Graph Neural Network (CP-GNN) model.
It embeds the high-order relationship between nodes into the node embedding.
It outperforms the state-of-the-art community detection methods.
arXiv Detail & Related papers (2021-09-05T12:28:00Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, a link prediction problem in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
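A minimal illustration of this correspondence using networkx (an assumed example; the paper's neural model and training procedure are not shown):

```python
# Each node of the line graph L(G) is an edge of G, so a link prediction
# query on G becomes a node-level question on L(G).
import networkx as nx

G = nx.Graph([(0, 1), (1, 2), (2, 3), (0, 2)])
L = nx.line_graph(G)

print(sorted(L.nodes()))   # [(0, 1), (0, 2), (1, 2), (2, 3)]
# Asking whether edge (1, 3) should exist in G amounts to classifying the
# candidate node (1, 3) in the line graph instead of classifying a whole
# subgraph.
```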
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Learning Representations using Spectral-Biased Random Walks on Graphs [18.369974607582584]
We study how much a probabilistic bias in this process affects the quality of the nodes picked by the process.
We succinctly capture this neighborhood as a probability measure based on the spectrum of the node's neighborhood subgraph, represented as a normalized Laplacian matrix.
We empirically evaluate our approach against several state-of-the-art node embedding techniques on a wide variety of real-world datasets.
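A small sketch of the spectral quantity this summary refers to (an assumption based on the summary, not the authors' code); how the spectrum is turned into a probability measure that biases the walk is not shown.

```python
# Spectrum of a node's neighborhood subgraph, taken from its normalized
# Laplacian (illustrative only).
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
node = 0
nbhd = G.subgraph([node] + list(G.neighbors(node)))
spectrum = np.linalg.eigvalsh(nx.normalized_laplacian_matrix(nbhd).toarray())
print(spectrum)            # eigenvalues of the normalized Laplacian, in [0, 2]
```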
arXiv Detail & Related papers (2020-05-19T20:42:43Z) - Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)