Link Prediction on Latent Heterogeneous Graphs
- URL: http://arxiv.org/abs/2302.10432v1
- Date: Tue, 21 Feb 2023 04:09:51 GMT
- Title: Link Prediction on Latent Heterogeneous Graphs
- Authors: Trung-Kien Nguyen, Zemin Liu, Yuan Fang
- Abstract summary: We study the challenging and unexplored problem of link prediction on a latent heterogeneous graph (LHG).
We propose a model named LHGNN, based on the novel idea of semantic embedding at node and path levels, to capture latent semantics on and between nodes.
We conduct extensive experiments on four benchmark datasets, and demonstrate the superior performance of LHGNN.
- Score: 18.110053023118294
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: On graph data, the multitude of node or edge types gives rise to
heterogeneous information networks (HINs). To preserve the heterogeneous
semantics on HINs, the rich node/edge types become a cornerstone of HIN
representation learning. However, in real-world scenarios, type information is
often noisy, missing or inaccessible. Assuming no type information is given, we
define a so-called latent heterogeneous graph (LHG), which carries latent
heterogeneous semantics as the node/edge types cannot be observed. In this
paper, we study the challenging and unexplored problem of link prediction on an
LHG. As existing approaches depend heavily on type-based information, they are
suboptimal or even inapplicable on LHGs. To address the absence of type
information, we propose a model named LHGNN, based on the novel idea of
semantic embedding at node and path levels, to capture latent semantics on and
between nodes. We further design a personalization function to modulate the
heterogeneous contexts conditioned on their latent semantics w.r.t. the target
node, to enable finer-grained aggregation. Finally, we conduct extensive
experiments on four benchmark datasets, and demonstrate the superior
performance of LHGNN.
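The abstract describes a personalization function that modulates heterogeneous context embeddings conditioned on their latent semantics with respect to the target node. As a rough illustrative sketch only (the gating form, names, and mean aggregation below are our own assumptions, not the paper's exact formulation), a target-conditioned modulation could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def personalize_and_aggregate(target, contexts, W, b):
    """Modulate each context embedding conditioned on the target node, then
    aggregate: a FiLM-style gating sketch (not the paper's exact operator)."""
    gates = []
    for c in contexts:
        z = np.concatenate([target, c])            # condition on target + context
        gate = 1.0 / (1.0 + np.exp(-(W @ z + b)))  # sigmoid gate in [0, 1]
        gates.append(gate)
    gates = np.stack(gates)                        # (k, d)
    modulated = gates * contexts                   # element-wise modulation
    return modulated.mean(axis=0)                  # finer-grained aggregation

d, k = 8, 5
target = rng.normal(size=d)
contexts = rng.normal(size=(k, d))
W = rng.normal(size=(d, 2 * d)) * 0.1              # hypothetical gate weights
b = np.zeros(d)
h = personalize_and_aggregate(target, contexts, W, b)
print(h.shape)  # (8,)
```

The gate depends on both the target and each context, so the same context node can contribute differently to different targets, which is the intuition behind "finer-grained aggregation."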
Related papers
- DREAM: Dual-Standard Semantic Homogeneity with Dynamic Optimization for Graph Learning with Label Noise [53.55187452152358]
This paper proposes a novel method, Dual-Standard Semantic Homogeneity with Dynamic Optimization (DREAM), for reliable, relation-informed optimization on graphs with label noise.
Specifically, we design a relation-informed dynamic optimization framework that iteratively reevaluates the reliability of each labeled node in the graph.
arXiv Detail & Related papers (2026-01-24T12:54:18Z)
- Beyond Fixed Depth: Adaptive Graph Neural Networks for Node Classification Under Varying Homophily [10.0426843232642]
We develop a theoretical framework that links local structural and label characteristics to information propagation dynamics.
We propose a novel adaptive-depth GNN architecture that dynamically selects node-specific aggregation depths.
Our method seamlessly adapts to both homophilic and heterophilic patterns within a unified model.
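The abstract above only states that aggregation depth is selected per node. A minimal sketch of that idea, with the depth-selection rule left as a given input (the actual paper's selection mechanism is not described here and everything below is our own illustrative assumption), is:

```python
import numpy as np

def adaptive_depth_readout(A_hat, X, depths):
    """Read out each node's representation after its own number of
    propagation steps, instead of one fixed depth for all nodes (sketch)."""
    K = depths.max()
    H = [X]
    for _ in range(K):
        H.append(A_hat @ H[-1])        # one more hop of neighborhood averaging
    H = np.stack(H)                    # (K + 1, n, d)
    n = X.shape[0]
    return H[depths, np.arange(n)]     # node i taken at depth depths[i]

# toy 3-node path graph, row-normalized adjacency with self-loops
A = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
A_hat = A / A.sum(1, keepdims=True)
X = np.eye(3)
depths = np.array([0, 1, 2])           # hypothetical per-node depths
Z = adaptive_depth_readout(A_hat, X, depths)
print(Z.shape)  # (3, 3)
```

A node assigned depth 0 keeps its raw features, which is the sensible choice in strongly heterophilic neighborhoods where aggregation mixes in conflicting labels.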
arXiv Detail & Related papers (2025-11-10T01:37:51Z)
- Discrete Diffusion-Based Model-Level Explanation of Heterogeneous GNNs with Node Features [0.25782420501870296]
We present DiGNNExplainer, a model-level explanation approach that synthesizes heterogeneous graphs with realistic node features.
We evaluate our approach on multiple datasets and show that DiGNNExplainer produces explanations that are realistic and faithful to the model's decision-making.
arXiv Detail & Related papers (2025-08-11T20:33:10Z)
- Multi-Granular Attention based Heterogeneous Hypergraph Neural Network [5.580244361093485]
Heterogeneous graph neural networks (HeteGNNs) have demonstrated strong abilities to learn node representations.
This paper proposes MGA-HHN, a Multi-Granular Attention based Heterogeneous Hypergraph Neural Network for representation learning.
arXiv Detail & Related papers (2025-05-07T11:42:00Z)
- Leveraging Invariant Principle for Heterophilic Graph Structure Distribution Shifts [42.77503881972965]
Heterophilic Graph Neural Networks (HGNNs) have shown promising results for semi-supervised learning tasks on graphs.
We propose a framework capable of generating invariant node representations through incorporating heterophily information.
Our proposed method can achieve guaranteed performance under heterophilic graph structure distribution shifts.
arXiv Detail & Related papers (2024-08-18T14:10:34Z)
- Learn from Heterophily: Heterophilous Information-enhanced Graph Neural Network [4.078409998614025]
Under heterophily, where nodes with different labels tend to be connected based on semantic meanings, Graph Neural Networks (GNNs) often exhibit suboptimal performance.
We propose and demonstrate that the valuable semantic information inherent in heterophily can be utilized effectively in graph learning.
We propose HiGNN, an innovative approach that constructs an additional graph structure integrating heterophilous information by leveraging node distributions.
arXiv Detail & Related papers (2024-03-26T03:29:42Z)
- Evolving Computation Graphs [20.094508902123778]
Graph neural networks (GNNs) have demonstrated success in modeling relational data, especially for data that exhibits homophily.
We propose Evolving Computation Graphs (ECGs), a novel method for enhancing GNNs on heterophilic datasets.
arXiv Detail & Related papers (2023-06-22T14:58:18Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Semantic-aware Node Synthesis for Imbalanced Heterogeneous Information Networks [51.55932524129814]
We present the first method for the semantic imbalance problem in imbalanced HINs, named Semantic-aware Node Synthesis (SNS).
SNS adaptively selects the heterogeneous neighbor nodes and augments the network with synthetic nodes while preserving the minority semantics.
We also introduce two regularization approaches for HGNNs that constrain the representation of synthetic nodes from both semantic and class perspectives.
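The abstract sketches synthesizing minority-class nodes from adaptively selected heterogeneous neighbors. A toy, SMOTE-like interpolation under that idea (the similarity-weighted neighbor choice and linear interpolation below are our own illustrative assumptions, not the actual SNS procedure) could look like:

```python
import numpy as np

rng = np.random.default_rng(1)

def synthesize_node(x_minor, neighbor_feats, sims):
    """Create a synthetic node between a minority node and one of its
    heterogeneous neighbors, picked by semantic similarity (sketch only)."""
    probs = np.exp(sims) / np.exp(sims).sum()      # softmax over similarities
    j = rng.choice(len(neighbor_feats), p=probs)   # adaptive neighbor choice
    lam = rng.uniform()                            # interpolation coefficient
    return x_minor + lam * (neighbor_feats[j] - x_minor)

x = rng.normal(size=4)                             # a minority-class node
neigh = rng.normal(size=(3, 4))                    # its neighbor features
sims = np.array([0.9, 0.1, 0.5])                   # toy semantic similarities
x_syn = synthesize_node(x, neigh, sims)
print(x_syn.shape)  # (4,)
```

Favoring semantically similar neighbors keeps the synthetic node inside the minority class's feature region, which is what "preserving the minority semantics" requires.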
arXiv Detail & Related papers (2023-02-27T00:21:43Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
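The final fusion stage above combines embeddings aggregated along different meta-paths. A minimal attention-based sketch of such a fusion (the query-based scoring is a generic assumption; the paper's exact aggregator is not reproduced here) is:

```python
import numpy as np

def metapath_fusion(H_paths, q):
    """Fuse per-meta-path node embeddings with a learned attention query q:
    a sketch of the final fusion stage only, not SHGNN's exact aggregator."""
    # score each meta-path by the mean attention of its node embeddings
    scores = np.array([np.tanh(H @ q).mean() for H in H_paths])
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over meta-paths
    return sum(a * H for a, H in zip(alpha, H_paths))

rng = np.random.default_rng(2)
n, d, P = 6, 4, 3                       # nodes, embedding dim, meta-paths
H_paths = [rng.normal(size=(n, d)) for _ in range(P)]
q = rng.normal(size=d)                  # hypothetical attention query
Z = metapath_fusion(H_paths, q)
print(Z.shape)  # (6, 4)
```

Because the weights are shared across nodes, this variant learns which meta-path matters globally; per-node weights would be the finer-grained alternative.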
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs the edges may be directed, and that whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- HMSG: Heterogeneous Graph Neural Network based on Metapath Subgraph Learning [2.096172374930129]
We propose a new heterogeneous graph neural network model named HMSG.
We decompose the heterogeneous graph into multiple subgraphs.
Each subgraph associates specific semantic and structural information.
Through a type-specific attribute transformation, node attributes can also be transferred among different types of nodes.
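A type-specific attribute transformation, as described above, typically means projecting each node type's raw attributes into one shared space with its own weight matrix. A minimal sketch (the type names and dimensions are invented for illustration) is:

```python
import numpy as np

def project_by_type(X, node_types, W_by_type):
    """Map every node's attributes into a shared space via a per-type linear
    transformation, so features transfer across node types (sketch)."""
    out_dim = next(iter(W_by_type.values())).shape[1]
    Z = np.empty((X.shape[0], out_dim))
    for i, (x, t) in enumerate(zip(X, node_types)):
        Z[i] = x @ W_by_type[t]        # each type uses its own projection
    return Z

rng = np.random.default_rng(3)
X = rng.normal(size=(5, 6))                        # raw attributes, dim 6
node_types = ["author", "paper", "paper", "venue", "author"]
W_by_type = {t: rng.normal(size=(6, 4))            # hypothetical per-type weights
             for t in ["author", "paper", "venue"]}
Z = project_by_type(X, node_types, W_by_type)
print(Z.shape)  # (5, 4)
```

In practice each type may also have a different raw attribute dimension; the per-type matrices then differ in their input shape while sharing the output shape.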
arXiv Detail & Related papers (2021-09-07T05:02:59Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
A majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Adaptive Universal Generalized PageRank Graph Neural Network [36.850433364139924]
Graph neural networks (GNNs) are designed to exploit both node features and topological information as sources of evidence, but they do not optimally trade off their utility.
We introduce a new Generalized PageRank (GPR) GNN architecture that adaptively learns the GPR weights.
GPR-GNN offers significant performance improvement compared to existing techniques on both synthetic and benchmark data.
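Generalized PageRank propagation combines the k-step propagated features with weights gamma_k, which GPR-GNN learns jointly with the model; a fixed decaying gamma recovers personalized-PageRank-style propagation. A small sketch of the propagation step (with hand-set weights standing in for the learned ones):

```python
import numpy as np

def gpr_propagate(A_hat, X, gamma):
    """Generalized PageRank propagation: Z = sum_k gamma[k] * A_hat^k X.
    In GPR-GNN the weights gamma are learned; here they are fixed."""
    H, Z = X, gamma[0] * X
    for g in gamma[1:]:
        H = A_hat @ H                   # one more propagation step
        Z = Z + g * H                   # weighted accumulation
    return Z

# toy symmetrically normalized adjacency of a triangle with self-loops
A = np.ones((3, 3))
deg = A.sum(1)
A_hat = A / np.sqrt(np.outer(deg, deg))
X = np.eye(3)
gamma = np.array([0.5, 0.25, 0.125, 0.0625])  # decaying, PPR-style weights
Z = gpr_propagate(A_hat, X, gamma)
print(Z.shape)  # (3, 3)
```

Allowing the gamma_k to be learned, and even to take negative values, is what lets GPR-GNN act as a high-pass filter on heterophilic graphs while behaving like PPR smoothing on homophilic ones.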
arXiv Detail & Related papers (2020-06-14T19:27:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.