Line Graph Contrastive Learning for Link Prediction
- URL: http://arxiv.org/abs/2210.13795v1
- Date: Tue, 25 Oct 2022 06:57:00 GMT
- Title: Line Graph Contrastive Learning for Link Prediction
- Authors: Zehua Zhang, Shilin Sun, Guixiang Ma, Caiming Zhong
- Abstract summary: We propose a Line Graph Contrastive Learning (LGCL) method to obtain multiview information.
With experiments on six public datasets, LGCL outperforms current benchmarks on link prediction tasks.
- Score: 4.876567687745239
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The link prediction task aims to predict whether two nodes in a network
are connected. Existing works mainly predict links through similarity measurements
between node pairs. However, when the local structure does not satisfy the
assumptions behind such measurements, the performance of these algorithms
deteriorates rapidly. To overcome these limitations, we propose a Line Graph
Contrastive Learning (LGCL) method to obtain multiview information. Our framework
obtains a subgraph view by h-hop subgraph sampling centered on the target node
pair. After transforming the sampled subgraph into a line graph, edge embedding
information becomes directly accessible, and the link prediction task is converted
into a node classification task. Different graph convolution operators then learn
representations from the two perspectives. Finally, contrastive learning balances
the subgraph representations of these perspectives by maximizing their mutual
information. In experiments on six public datasets, LGCL outperforms current
benchmarks on link prediction tasks and shows better generalization and
robustness.
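For concreteness, here is a minimal sketch of the central conversion step. It is not the authors' implementation: it assumes an undirected NetworkX graph, and the helper names (`enclosing_subgraph`, `to_line_graph_view`) are illustrative. It only shows how an h-hop subgraph sampled around a candidate node pair becomes a line graph in which the candidate link is a single node to classify.

```python
# Minimal sketch of the subgraph -> line-graph conversion described above.
# Assumes an undirected NetworkX graph; names are illustrative, not from the paper's code.
import networkx as nx


def enclosing_subgraph(G: nx.Graph, u, v, h: int = 2) -> nx.Graph:
    """h-hop enclosing subgraph centered on the candidate node pair (u, v)."""
    nodes = set()
    for seed in (u, v):
        # All nodes within h hops of the seed node.
        nodes |= set(nx.single_source_shortest_path_length(G, seed, cutoff=h))
    sub = G.subgraph(nodes).copy()
    # Ensure the candidate edge exists so it appears as a node in the line graph;
    # whether this link is real is what the downstream classifier has to predict.
    sub.add_edge(u, v)
    return sub


def to_line_graph_view(sub: nx.Graph, u, v):
    """Transform the sampled subgraph into its line graph.

    Every edge of `sub` becomes a node of `L`, so predicting the link (u, v)
    becomes classifying the line-graph node that corresponds to that edge.
    """
    L = nx.line_graph(sub)
    target = (u, v) if (u, v) in L else (v, u)
    return L, target


if __name__ == "__main__":
    G = nx.karate_club_graph()           # small toy graph
    sub = enclosing_subgraph(G, 0, 33, h=2)
    L, target = to_line_graph_view(sub, 0, 33)
    print(len(sub.edges), len(L.nodes))  # one line-graph node per subgraph edge
    print("node to classify:", target)
```

The contrastive part of LGCL, in which different graph convolution operators encode the two views and their representations are aligned by maximizing mutual information, is omitted from this sketch.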
Related papers
- Local Structure-aware Graph Contrastive Representation Learning [12.554113138406688]
We propose a Local Structure-aware Graph Contrastive representation Learning method (LS-GCL) to model the structural information of nodes from multiple views.
For the local view, the semantic subgraph of each target node is input into a shared GNN encoder to obtain the target node embeddings at the subgraph-level.
For the global view, considering that the original graph preserves indispensable semantic information about nodes, we leverage the shared GNN encoder to learn the target node embeddings at the global graph-level.
arXiv Detail & Related papers (2023-08-07T03:23:46Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
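As a rough illustration of this embedding step (not SimTeG's released code), the sketch below mean-pools the last hidden states of a Hugging Face encoder into one vector per node text; the checkpoint name is a placeholder, and the PEFT fine-tuning stage is assumed to have already happened.

```python
# Illustrative sketch only: pool the last hidden states of a (fine-tuned) LM
# into one embedding per node text. PEFT fine-tuning is assumed to have been
# done beforehand and is omitted here.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "sentence-transformers/all-MiniLM-L6-v2"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

node_texts = ["paper title and abstract ...", "another node's text ..."]

with torch.no_grad():
    batch = tokenizer(node_texts, padding=True, truncation=True, return_tensors="pt")
    out = model(**batch)                       # last_hidden_state: [N, T, H]
    mask = batch["attention_mask"].unsqueeze(-1).float()
    # Mean-pool over valid tokens to get one vector per node.
    node_emb = (out.last_hidden_state * mask).sum(1) / mask.sum(1)

print(node_emb.shape)  # [num_nodes, hidden_dim], fed to a GNN afterwards
```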
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- Entropy Neural Estimation for Graph Contrastive Learning [9.032721248598088]
Contrastive learning on graphs aims at extracting distinguishable high-level representations of nodes.
We propose a simple yet effective subset sampling strategy to contrast pairwise representations between views of a dataset.
We conduct extensive experiments on seven graph benchmarks, and the proposed approach achieves competitive performance.
arXiv Detail & Related papers (2023-07-26T03:55:08Z)
- Subgraph Networks Based Contrastive Learning [5.736011243152416]
Graph contrastive learning (GCL) can solve the problem of annotated data scarcity.
Most existing GCL methods focus on the design of graph augmentation strategies and mutual information estimation operations.
We propose a novel framework called subgraph network-based contrastive learning (SGNCL).
arXiv Detail & Related papers (2023-06-06T08:52:44Z)
- NESS: Node Embeddings from Static SubGraphs [0.0]
We present a framework for learning Node Embeddings from Static Subgraphs (NESS) using a graph autoencoder (GAE) in a transductive setting.
NESS is based on two key ideas: i) Partitioning the training graph into multiple static, sparse subgraphs with non-overlapping edges using a random edge split during data pre-processing.
We demonstrate that NESS gives a better node representation for link prediction tasks compared to current autoencoding methods that use either the whole graph or subgraphs.
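A minimal sketch of the first idea, assuming an undirected NetworkX graph (this is not the authors' code; the function name `random_edge_split` is illustrative):

```python
# Sketch of splitting a training graph into k static subgraphs with
# non-overlapping edge sets via a random edge split (illustrative only).
import random
import networkx as nx


def random_edge_split(G: nx.Graph, k: int = 4, seed: int = 0):
    edges = list(G.edges())
    random.Random(seed).shuffle(edges)
    subgraphs = []
    for i in range(k):
        part = edges[i::k]                      # every k-th edge -> disjoint parts
        H = nx.Graph()
        H.add_nodes_from(G.nodes())             # keep the full node set in each view
        H.add_edges_from(part)
        subgraphs.append(H)
    return subgraphs


parts = random_edge_split(nx.karate_club_graph(), k=4)
# Edge sets are disjoint and together cover the original graph.
assert sum(p.number_of_edges() for p in parts) == nx.karate_club_graph().number_of_edges()
```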
arXiv Detail & Related papers (2023-03-15T22:14:28Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- Collaborative likelihood-ratio estimation over graphs [55.98760097296213]
We develop this idea in a concrete non-parametric method that we call Graph-based Relative Unconstrained Least-squares Importance Fitting (GRULSIF).
We derive convergence rates for our collaborative approach that highlight the role played by variables such as the number of available observations per node, the size of the graph, and how accurately the graph structure encodes the similarity between tasks.
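For background, the standard (non-graph) RuLSIF formulation that this line of work builds on estimates a relative density ratio by a least-squares fit; the graph coupling term below is only an illustrative assumption about how per-node estimators might be tied together, not the paper's exact objective.

```latex
% RuLSIF background (standard, non-graph form), plus an illustrative coupling term.
% The coupling term is an assumption for exposition, not the paper's exact objective.
r_\alpha(x) \;=\; \frac{p(x)}{\alpha\, p(x) + (1-\alpha)\, q(x)},
\qquad
\min_{g}\; \tfrac{1}{2}\,\mathbb{E}_{x \sim \alpha p + (1-\alpha) q}\!\left[g(x)^2\right]
\;-\; \mathbb{E}_{x \sim p}\!\left[g(x)\right].
% Illustrative collaborative extension: one estimator g_v per node v, with
% neighbors on the graph encouraged to have similar parameters \theta_v.
\min_{\{g_v\}} \; \sum_{v \in V} \mathcal{L}_{\mathrm{RuLSIF}}(g_v)
\;+\; \lambda \sum_{(u,v) \in E} \lVert \theta_u - \theta_v \rVert^2 .
```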
arXiv Detail & Related papers (2022-05-28T15:37:03Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object, and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
- Which way? Direction-Aware Attributed Graph Embedding [2.429993132301275]
Graph embedding algorithms are used to efficiently represent a graph in a continuous vector space.
One aspect that is often overlooked is whether the graph is directed or not.
This study presents a novel text-enriched, direction-aware algorithm called DIAGRAM.
arXiv Detail & Related papers (2020-01-30T13:08:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.