An unsupervised cluster-level based method for learning node
representations of heterogeneous graphs in scientific papers
- URL: http://arxiv.org/abs/2203.16751v1
- Date: Thu, 31 Mar 2022 02:13:39 GMT
- Title: An unsupervised cluster-level based method for learning node
representations of heterogeneous graphs in scientific papers
- Authors: Jie Song and Meiyu Liang and Zhe Xue and Junping Du and Kou Feifei
- Abstract summary: This paper proposes an unsupervised cluster-level scientific paper heterogeneous graph node representation learning method (UCHL).
Based on the heterogeneous graph representation, the method performs link prediction on the entire heterogeneous graph to obtain the relationships represented by the edges between nodes, that is, the relationships between papers.
- Score: 16.019656763017004
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning knowledge representations of scientific paper data is an open
problem, and learning the representations of paper nodes in a scientific paper
heterogeneous network is central to solving it. This paper proposes an
unsupervised cluster-level scientific paper heterogeneous graph node
representation learning method (UCHL), aiming at obtaining the representations
of nodes (authors, institutions, papers, etc.) in the heterogeneous graph of
scientific papers. Based on the heterogeneous graph representation, link
prediction is performed on the entire heterogeneous graph to obtain the
relationships represented by the edges between nodes, that is, the
relationships between papers. Experimental results show that the proposed
method achieves excellent performance on multiple evaluation metrics on real
scientific paper datasets.
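The abstract does not detail the UCHL encoder itself, but the downstream step it describes, link prediction between paper nodes from learned representations, can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the randomly initialised embedding table stands in for UCHL's learned node representations, and the dot-product scorer and toy edge lists are generic choices, not the paper's implementation.

```python
# Hedged sketch of paper-paper link prediction from node representations.
# The embedding table is a stand-in for representations learned by UCHL;
# the dot-product scorer and toy edges are illustrative assumptions.
import torch
import torch.nn.functional as F

num_papers, dim = 100, 64
paper_emb = torch.nn.Embedding(num_papers, dim)

# Toy observed paper-paper edges (positives) and randomly sampled negatives.
pos_src = torch.randint(0, num_papers, (256,))
pos_dst = torch.randint(0, num_papers, (256,))
neg_dst = torch.randint(0, num_papers, (256,))

def score(src, dst):
    # Dot-product scorer: a higher score means the two papers are more likely linked.
    return (paper_emb(src) * paper_emb(dst)).sum(dim=-1)

optimizer = torch.optim.Adam(paper_emb.parameters(), lr=1e-2)
for step in range(200):
    optimizer.zero_grad()
    logits = torch.cat([score(pos_src, pos_dst), score(pos_src, neg_dst)])
    labels = torch.cat([torch.ones(256), torch.zeros(256)])
    loss = F.binary_cross_entropy_with_logits(logits, labels)
    loss.backward()
    optimizer.step()

# After training, score(i, j) can be used to rank candidate paper-paper links.
```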
Related papers
- Leveraging Invariant Principle for Heterophilic Graph Structure Distribution Shifts [42.77503881972965]
Heterophilic Graph Neural Networks (HGNNs) have shown promising results for semi-supervised learning tasks on graphs.
How to learn invariant node representations on heterophilic graphs that can handle such structure differences or distribution shifts remains unexplored.
We propose HEI, a framework capable of generating invariant node representations through incorporating heterophily information.
arXiv Detail & Related papers (2024-08-18T14:10:34Z)
- Contrastive Hierarchical Discourse Graph for Scientific Document Summarization [14.930704950433324]
CHANGES is a contrastive hierarchical graph neural network for extractive scientific paper summarization.
We also propose a graph contrastive learning module to learn global theme-aware sentence representations.
arXiv Detail & Related papers (2023-05-31T20:54:43Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Graph Contrastive Learning under Heterophily via Graph Filters [51.46061703680498]
Graph contrastive learning (CL) methods learn node representations in a self-supervised manner by maximizing the similarity between augmented node representations obtained via a GNN-based encoder; a generic sketch of this setup appears after this list.
In this work, we propose an effective graph CL method, namely HLCL, for learning graph representations under heterophily.
Our extensive experiments show that HLCL outperforms state-of-the-art graph CL methods on benchmark datasets with heterophily, as well as large-scale real-world graphs, by up to 7%, and outperforms graph supervised learning methods on datasets with heterophily by up to 10%.
arXiv Detail & Related papers (2023-03-11T08:32:39Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to capture complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN) that explore the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Neural Topic Modeling by Incorporating Document Relationship Graph [18.692100955163713]
Graph Topic Model (GTM) is a GNN-based neural topic model that represents a corpus as a document relationship graph.
Documents and words in the corpus become nodes in the graph and are connected based on document-word co-occurrences (a toy construction of such a graph is sketched after this list).
arXiv Detail & Related papers (2020-09-29T12:45:55Z)
- Semi-Supervised Node Classification by Graph Convolutional Networks and Extracted Side Information [18.07347677181108]
This paper revisits the node classification task in a semi-supervised scenario using graph convolutional networks (GCNs).
The goal is to benefit from the flow of information that circulates around the revealed node labels.
arXiv Detail & Related papers (2020-09-29T02:38:58Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
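The contrastive recipe summarized in the HLCL entry above (two augmented views of a graph encoded by a shared GNN-style encoder, with representations of the same node pulled together across views) can be sketched generically as follows. The edge-dropping augmentation, single-layer encoder, and InfoNCE-style loss are common, assumed choices for illustration; they are not HLCL's specific graph-filter design.

```python
# Generic graph contrastive learning sketch (illustrative, not HLCL itself).
import torch
import torch.nn.functional as F

num_nodes, in_dim, out_dim = 50, 16, 32
features = torch.randn(num_nodes, in_dim)
adj = (torch.rand(num_nodes, num_nodes) < 0.1).float()
adj = ((adj + adj.t()) > 0).float()   # make the toy graph undirected
adj.fill_diagonal_(1.0)               # add self-loops

def normalize(a):
    # Symmetric normalization D^{-1/2} A D^{-1/2} used by GCN-style encoders.
    d_inv_sqrt = torch.diag(a.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a @ d_inv_sqrt

def augment(a, drop_prob=0.2):
    # Simple augmentation: randomly drop edges, keeping self-loops.
    out = a * (torch.rand_like(a) > drop_prob).float()
    out.fill_diagonal_(1.0)
    return out

def info_nce(z1, z2, temperature=0.5):
    # For each node, its representation in the other view is the positive;
    # all other nodes in that view serve as negatives.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    return F.cross_entropy(logits, torch.arange(z1.size(0)))

encoder = torch.nn.Linear(in_dim, out_dim, bias=False)   # one-layer GCN-style encoder
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-2)
for step in range(100):
    optimizer.zero_grad()
    z1 = normalize(augment(adj)) @ encoder(features)      # view 1
    z2 = normalize(augment(adj)) @ encoder(features)      # view 2
    loss = info_nce(z1, z2)
    loss.backward()
    optimizer.step()
```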
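Likewise, the document relationship graph described in the GTM entry, where documents and words become nodes connected by co-occurrence edges, can be built in a few lines. The use of networkx and the tiny two-document corpus are assumptions for illustration; GTM's GNN-based topic model itself is not reproduced.

```python
# Toy document-word co-occurrence graph (illustrative data and library choice).
import networkx as nx

corpus = {
    "doc1": "graph neural networks learn node representations",
    "doc2": "topic models represent documents",
}

g = nx.Graph()
for doc_id, text in corpus.items():
    g.add_node(doc_id, node_type="document")
    for word in set(text.split()):
        g.add_node(word, node_type="word")
        # Edge weight = number of times the word occurs in the document.
        g.add_edge(doc_id, word, weight=text.split().count(word))

print(g.number_of_nodes(), g.number_of_edges())
print(list(g.neighbors("doc1")))
```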
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.