Entropy Neural Estimation for Graph Contrastive Learning
- URL: http://arxiv.org/abs/2307.13944v1
- Date: Wed, 26 Jul 2023 03:55:08 GMT
- Title: Entropy Neural Estimation for Graph Contrastive Learning
- Authors: Yixuan Ma, Xiaolin Zhang, Peng Zhang, Kun Zhan
- Abstract summary: Contrastive learning on graphs aims at extracting distinguishable high-level representations of nodes.
We propose a simple yet effective subset sampling strategy to contrast pairwise representations between views of a dataset.
We conduct extensive experiments on seven graph benchmarks, and the proposed approach achieves competitive performance.
- Score: 9.032721248598088
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contrastive learning on graphs aims at extracting distinguishable high-level
representations of nodes. In this paper, we theoretically illustrate that the
entropy of a dataset can be approximated by maximizing the lower bound of the
mutual information across different views of a graph, i.e., entropy is estimated
by a neural network. Based on this finding, we propose a simple yet effective
subset sampling strategy to contrast pairwise representations between views of
a dataset. In particular, we randomly sample nodes and edges from a given graph
to build the input subset for a view. Two views are fed into a parameter-shared
Siamese network to extract the high-dimensional embeddings and estimate the
information entropy of the entire graph. For the learning process, we propose
to optimize the network using two objectives simultaneously. Concretely, the
input of the contrastive loss function consists of positive and negative pairs.
Our pair-selection strategy differs from previous works: we present a novel
strategy that enhances the representation ability of the graph encoder by
selecting nodes based on cross-view similarities. Guided by cross-view
similarity scores, we enrich the diversity of the positive and negative pairs
by selecting highly similar samples as positives and markedly dissimilar
samples as negatives.
We also introduce a cross-view consistency constraint on the representations
generated from the different views. This objective ensures that the learned
representations are consistent across views from the perspective of the entire
graph. We conduct extensive experiments on seven graph benchmarks, and the
proposed approach achieves competitive performance compared to the current
state-of-the-art methods. The source code will be publicly released once this
paper is accepted.
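Since the official code is not yet released, the pipeline described in the abstract — sampling a node/edge subset per view, encoding both views with a parameter-shared encoder, selecting positive and negative pairs by cross-view similarity, and combining a contrastive loss with a cross-view consistency term — can be sketched roughly as follows. All function names, shapes, and hyperparameters here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_view(X, A, node_keep=0.8, edge_keep=0.8):
    """Randomly sample nodes and edges to build the input subset for one view."""
    keep = rng.random(X.shape[0]) < node_keep
    Xv = X * keep[:, None]                      # zero out dropped node features
    Av = A * (rng.random(A.shape) < edge_keep)  # drop a random subset of edges
    return Xv, Av

def encode(X, A, W):
    """One-layer mean-aggregation encoder; W is shared across views (Siamese)."""
    H = ((A / (A.sum(1, keepdims=True) + 1e-8)) @ X) @ W
    return H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-8)

def select_pairs(z1, z2, k=2):
    """Positives = most similar cross-view nodes; negatives = least similar."""
    order = np.argsort(z1 @ z2.T, axis=1)
    return order[:, -k:], order[:, :k]

# toy graph: 6 nodes, 4 features, random symmetric adjacency
n = 6
X = rng.standard_normal((n, 4))
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.maximum(A, A.T)                          # symmetrize
W = rng.standard_normal((4, 3))                 # shared encoder weights

z1 = encode(*sample_view(X, A), W)
z2 = encode(*sample_view(X, A), W)

pos_idx, neg_idx = select_pairs(z1, z2)
s = z1 @ z2.T / 0.5                             # temperature-scaled similarities
rows = np.arange(n)[:, None]
pos = np.exp(s[rows, pos_idx]).sum(1)
neg = np.exp(s[rows, neg_idx]).sum(1)
loss = -np.mean(np.log(pos / (pos + neg)))      # InfoNCE-style lower bound
consistency = np.mean((z1 - z2) ** 2)           # cross-view consistency term
total = loss + consistency
```

Optimizing `total` over `W` (here frozen for brevity) would jointly tighten the mutual-information lower bound and enforce cross-view agreement, mirroring the two objectives above.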
Related papers
- A Simple and Scalable Graph Neural Network for Large Directed Graphs [11.792826520370774]
We investigate various combinations of node representations and edge direction awareness within an input graph.
In response, we propose a simple yet holistic classification method A2DUG.
We demonstrate that A2DUG performs stably well on various datasets and improves accuracy by up to 11.29 compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-06-14T06:24:58Z) - Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph
Matching [68.35685422301613]
We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing the node correspondence-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of MatchExplainer, which outperforms all state-of-the-art parametric baselines by significant margins.
arXiv Detail & Related papers (2023-01-07T05:14:45Z) - Line Graph Contrastive Learning for Link Prediction [4.876567687745239]
We propose a Line Graph Contrastive Learning (LGCL) method to obtain multiview information.
With experiments on six public datasets, LGCL outperforms current benchmarks on link prediction tasks.
arXiv Detail & Related papers (2022-10-25T06:57:00Z) - Generative Subgraph Contrast for Self-Supervised Graph Representation
Learning [16.374143635724327]
We propose a novel adaptive subgraph generation based contrastive learning framework for efficient and robust self-supervised graph representation learning.
It aims to generate contrastive samples by capturing the intrinsic structures of the graph and distinguish the samples based on the features and structures of subgraphs simultaneously.
arXiv Detail & Related papers (2022-07-25T09:08:46Z) - CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph
Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z) - Collaborative likelihood-ratio estimation over graphs [55.98760097296213]
We develop this idea in a concrete non-parametric method that we call Graph-based Relative Unconstrained Least-squares Importance Fitting (GRULSIF)
We derive convergence rates for our collaborative approach that highlight the role played by variables such as the number of available observations per node, the size of the graph, and how accurately the graph structure encodes the similarity between tasks.
arXiv Detail & Related papers (2022-05-28T15:37:03Z) - Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
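The distributional idea in this summary — embedding each node as a distribution rather than a deterministic vector — can be sketched with a per-node Gaussian and the standard reparameterization trick. Shapes and names are illustrative assumptions, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch: each node keeps a Gaussian (mean, log-variance) in the
# latent space, and embeddings are drawn stochastically from it.
n_nodes, dim = 4, 3
mu = rng.standard_normal((n_nodes, dim))        # per-node means
log_var = rng.standard_normal((n_nodes, dim))   # per-node log-variances

def sample_embeddings(mu, log_var):
    """Reparameterized draw: z = mu + sigma * eps keeps gradients usable."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

z_a = sample_embeddings(mu, log_var)            # one stochastic embedding
z_b = sample_embeddings(mu, log_var)            # a second, different draw
```

Two draws from the same node distribution differ, which is exactly what distinguishes this representation from a deterministic embedding vector.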
arXiv Detail & Related papers (2021-12-15T01:45:32Z) - Group Contrastive Self-Supervised Learning on Graphs [101.45974132613293]
We study self-supervised learning on graphs using contrastive methods.
We argue that contrasting graphs in multiple subspaces enables graph encoders to capture more abundant characteristics.
arXiv Detail & Related papers (2021-07-20T22:09:21Z) - Self-Supervised Graph Learning with Proximity-based Views and Channel
Contrast [4.761137180081091]
Graph neural networks (GNNs) use neighborhood aggregation as a core component that results in feature smoothing among nodes in proximity.
To tackle this problem, we strengthen the graph with two additional graph views, in which nodes are directly linked to those with the most similar features or local structures.
We propose a method that aims to maximize the agreement between representations across generated views and the original graph.
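One such additional view, a feature-similarity graph linking each node to its most similar peers, can be sketched as below. The construction and parameters are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def knn_feature_view(X, k=1):
    """Build an extra graph view linking each node to its k most
    feature-similar neighbors by cosine similarity (no self-loops)."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-8)
    sim = Xn @ Xn.T
    np.fill_diagonal(sim, -np.inf)              # forbid self-links
    nbrs = np.argsort(sim, axis=1)[:, -k:]      # top-k per node
    A = np.zeros_like(sim)
    rows = np.repeat(np.arange(X.shape[0]), k)
    A[rows, nbrs.ravel()] = 1.0
    return np.maximum(A, A.T)                   # symmetrize

# four nodes: two near (1, 0), two near (0, 1)
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
A_feat = knn_feature_view(X, k=1)
```

Here the view links the two (1, 0)-like nodes to each other and likewise the two (0, 1)-like nodes, giving the encoder proximity edges the original topology may lack.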
arXiv Detail & Related papers (2021-06-07T15:38:36Z) - A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.