Neighbor2vec: an efficient and effective method for Graph Embedding
- URL: http://arxiv.org/abs/2201.02626v1
- Date: Fri, 7 Jan 2022 16:08:26 GMT
- Title: Neighbor2vec: an efficient and effective method for Graph Embedding
- Authors: Zhiming Lin
- Abstract summary: Neighbor2vec is a framework to gather structure information by feature propagation between the node and its neighbors.
We conduct experiments on several node classification and link prediction tasks for networks.
Neighbor2vec's representations provide average accuracy scores up to 6.8 percent higher than competing methods.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph embedding techniques have led to significant progress in recent years.
However, existing techniques are not effective enough to capture the patterns of
networks. This paper proposes neighbor2vec, an algorithm that uses a
neighbor-based sampling strategy to learn neighborhood representations of nodes,
and a framework that gathers structure information by feature propagation
between a node and its neighbors. We claim that neighbor2vec is a simple and
effective approach to enhancing both the scalability and the quality of graph
embedding, and that it breaks the limits of existing state-of-the-art
unsupervised techniques. We conduct
experiments on several node classification and link prediction tasks for
networks such as ogbn-arxiv, ogbn-products, ogbn-proteins, ogbl-ppa, ogbl-collab
and ogbl-citation2. The results show that neighbor2vec's representations provide
average accuracy scores up to 6.8 percent higher than competing methods on node
classification tasks and 3.0 percent higher on link prediction tasks.
Neighbor2vec's representations outperform all baseline methods and two classical
GNN models in all six experiments.
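The abstract's core idea, gathering structure information by propagating features between a node and its sampled neighbors, can be sketched roughly as follows. This is an illustrative reading, not the paper's exact procedure: the function name, the uniform neighbor sampling, and the simple mean/average combination are all assumptions.

```python
import random

def propagate_neighbor_features(features, adjacency, num_samples=5):
    """One round of neighbor feature propagation (illustrative sketch):
    each node's new representation mixes its own features with the mean
    of a sampled subset of its neighbors' features."""
    new_features = {}
    for node, feats in features.items():
        neighbors = adjacency.get(node, [])
        if not neighbors:
            # Isolated node: keep its own features unchanged.
            new_features[node] = feats[:]
            continue
        sampled = random.sample(neighbors, min(num_samples, len(neighbors)))
        # Mean of the sampled neighbors' features, dimension by dimension.
        agg = [sum(features[n][i] for n in sampled) / len(sampled)
               for i in range(len(feats))]
        # Combine self features with the neighbor aggregate.
        new_features[node] = [(a + b) / 2 for a, b in zip(feats, agg)]
    return new_features
```

Repeating this round k times lets information flow from k-hop neighborhoods, which is the usual way such propagation schemes capture structure beyond immediate neighbors.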
Related papers
- Node Embeddings via Neighbor Embeddings [11.841966603069865]
We introduce graph t-SNE and graph CNE, a contrastive neighbor embedding method that produces high-dimensional node representations.
We show that both graph t-SNE and graph CNE strongly outperform state-of-the-art algorithms in terms of local structure preservation.
arXiv Detail & Related papers (2025-03-31T08:16:03Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN)
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Nonlinear Correct and Smooth for Semi-Supervised Learning [1.622641093702668]
Graph-based semi-supervised learning (GSSL) has been used successfully in various applications.
We propose Nonlinear Correct and Smooth (NLCS), which improves the existing post-processing approach by incorporating non-linearity and higher-order representation.
arXiv Detail & Related papers (2023-10-09T14:33:32Z)
- Neighborhood Homophily-based Graph Convolutional Network [4.511171093050241]
Graph neural networks (GNNs) have proved powerful in graph-oriented tasks.
Many real-world graphs are heterophilous, challenging the homophily assumption of classical GNNs.
Recent studies propose new metrics to characterize homophily, but rarely consider the correlation between the proposed metrics and models.
In this paper, we first design a new metric, Neighborhood Homophily (NH), to measure the label complexity or purity in node neighborhoods.
arXiv Detail & Related papers (2023-01-24T07:56:44Z)
- 2-hop Neighbor Class Similarity (2NCS): A graph structural metric indicative of graph neural network performance [4.051099980410583]
Graph Neural Networks (GNNs) achieve state-of-the-art performance on graph-structured data across numerous domains.
On heterophilous graphs, in which different-type nodes are likely connected, GNNs perform less consistently.
We introduce 2-hop Neighbor Class Similarity (2NCS), a new quantitative graph structural property that correlates with GNN performance more strongly and consistently than alternative metrics.
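One plausible reading of a 2-hop class-similarity metric, the fraction of a node's 2-hop neighbors sharing its class, averaged over nodes, can be sketched as below. The exact 2NCS definition in the cited paper may differ; treat the function names and the averaging scheme as illustrative assumptions.

```python
def two_hop_neighbors(adjacency, node):
    """Nodes reachable in two hops, excluding the node itself."""
    two_hop = set()
    for nb in adjacency.get(node, []):
        two_hop.update(adjacency.get(nb, []))
    two_hop.discard(node)
    return two_hop

def two_hop_class_similarity(adjacency, labels):
    """Average fraction of 2-hop neighbors sharing a node's label
    (an illustrative reading of 2NCS, not the paper's exact formula)."""
    scores = []
    for node, label in labels.items():
        hops = two_hop_neighbors(adjacency, node)
        if hops:  # skip nodes with no 2-hop neighborhood
            scores.append(sum(labels[n] == label for n in hops) / len(hops))
    return sum(scores) / len(scores) if scores else 0.0
```

The intuition behind looking two hops out is that on heterophilous graphs, direct neighbors often have different labels while 2-hop neighbors tend to match, which is why such a metric can track GNN performance more consistently than 1-hop homophily measures.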
arXiv Detail & Related papers (2022-12-26T16:16:51Z)
- Mixed Graph Contrastive Network for Semi-Supervised Node Classification [63.924129159538076]
We propose a novel graph contrastive learning method, termed Mixed Graph Contrastive Network (MGCN)
In our method, we improve the discriminative capability of the latent embeddings by an unperturbed augmentation strategy and a correlation reduction mechanism.
By combining the two settings, we extract rich supervision information from both the abundant nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Noise-robust Graph Learning by Estimating and Leveraging Pairwise Interactions [123.07967420310796]
This paper bridges the gap by proposing a pairwise framework for noisy node classification on graphs.
PI-GNN relies on pairwise interactions (PI) as a primary learning proxy in addition to pointwise learning from the noisy node class labels.
Our proposed framework PI-GNN contributes two novel components: (1) a confidence-aware PI estimation model that adaptively estimates the PI labels, and (2) a decoupled training approach that leverages the estimated PI labels.
arXiv Detail & Related papers (2021-06-14T14:23:08Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
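The pairwise-interaction term that this summary describes, augmenting the weighted sum of neighbor representations with products of neighbor pairs, can be sketched as below. This is a minimal illustration of the general idea, not BGNN's actual operator; the function name, the plain-list representation, and the uniform pair averaging are assumptions.

```python
def bilinear_aggregate(neighbor_reprs):
    """Average element-wise product over all pairs of neighbor
    representations: a sketch of the bilinear interaction term
    added alongside the usual weighted-sum aggregation."""
    n = len(neighbor_reprs)
    if n < 2:
        # No pairs to interact: return a zero vector of the right size.
        return [0.0] * (len(neighbor_reprs[0]) if neighbor_reprs else 0)
    dim = len(neighbor_reprs[0])
    num_pairs = n * (n - 1) / 2
    out = []
    for d in range(dim):
        # Sum over pairs i<j of x_i[d] * x_j[d] via the identity
        #   sum_{i<j} x_i x_j = ((sum_i x_i)^2 - sum_i x_i^2) / 2,
        # which avoids the explicit O(n^2) pair loop.
        s = sum(v[d] for v in neighbor_reprs)
        sq = sum(v[d] * v[d] for v in neighbor_reprs)
        out.append(((s * s - sq) / 2.0) / num_pairs)
    return out
```

A full layer would typically add this interaction vector (after a learned linear map) to the standard aggregated-neighbor term; the closed-form pair sum keeps the cost linear in the number of neighbors.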
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
- Graph Neighborhood Attentive Pooling [0.5493410630077189]
Network representation learning (NRL) is a powerful technique for learning low-dimensional vector representation of high-dimensional and sparse graphs.
We propose a novel context-sensitive algorithm called GAP that learns to attend on different parts of a node's neighborhood using attentive pooling networks.
arXiv Detail & Related papers (2020-01-28T15:05:48Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.