Relation-aware Graph Attention Model With Adaptive Self-adversarial
Training
- URL: http://arxiv.org/abs/2102.07186v1
- Date: Sun, 14 Feb 2021 16:11:56 GMT
- Title: Relation-aware Graph Attention Model With Adaptive Self-adversarial
Training
- Authors: Xiao Qin, Nasrullah Sheikh, Berthold Reinwald, Lingfei Wu
- Abstract summary: This paper describes an end-to-end solution for the relationship prediction task in heterogeneous, multi-relational graphs.
We particularly address two building blocks in the pipeline, namely heterogeneous graph representation learning and negative sampling.
We introduce a parameter-free negative sampling technique -- adaptive self-adversarial (ASA) negative sampling.
- Score: 29.240686573485718
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper describes an end-to-end solution for the relationship prediction
task in heterogeneous, multi-relational graphs. We particularly address two
building blocks in the pipeline, namely heterogeneous graph representation
learning and negative sampling. Existing message passing-based graph neural
networks use edges either for graph traversal and/or selection of message
encoding functions. Ignoring the edge semantics could have severe repercussions
on the quality of embeddings, especially when dealing with two nodes having
multiple relations. Furthermore, the expressivity of the learned representation
depends on the quality of negative samples used during training. Although
existing hard negative sampling techniques can identify challenging negative
relationships for optimization, new techniques are required to control false
negatives during training as false negatives could corrupt the learning
process. To address these issues, first, we propose RelGNN -- a message
passing-based heterogeneous graph attention model. In particular, RelGNN
generates the states of different relations and leverages them along with the
node states to weigh the messages. RelGNN also adopts a self-attention
mechanism to balance the importance of attribute features and topological
features for generating the final entity embeddings. Second, we introduce a
parameter-free negative sampling technique -- adaptive self-adversarial (ASA)
negative sampling. ASA reduces the false-negative rate by leveraging positive
relationships to effectively guide the identification of true negative samples.
Our experimental evaluation demonstrates that RelGNN optimized by ASA for
relationship prediction improves state-of-the-art performance across
established benchmarks as well as on a real industrial dataset.
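The core idea behind ASA, as described in the abstract, is to use the score of the positive relationship as a ceiling when picking hard negatives, so that candidates scoring above the positive (likely false negatives) are skipped. A minimal sketch of that idea follows; the function names and the fallback behavior are illustrative assumptions, not the authors' implementation.

```python
import random

def asa_negative(score, pos_triple, candidate_tails, margin=0.0):
    """Adaptive self-adversarial negative selection (sketch).

    Rather than taking the hardest negative outright, the positive
    triple's score acts as a ceiling: corrupted triples scoring above
    it are treated as probable false negatives and discarded.
    """
    h, r, t = pos_triple
    pos_score = score(h, r, t)
    # Score each corrupted triple (h, r, t').
    scored = [(score(h, r, c), c) for c in candidate_tails if c != t]
    # Keep only candidates strictly below the positive score minus a margin.
    safe = [(s, c) for s, c in scored if s < pos_score - margin]
    if not safe:
        return random.choice(candidate_tails)  # fall back to uniform sampling
    # Return the hardest "safe" negative: the highest-scoring survivor.
    return max(safe)[1]
```

With a toy scoring function, the candidate scoring just below the positive is selected, while a candidate scoring above the positive is rejected as a likely false negative.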
Related papers
- Self-Supervised Conditional Distribution Learning on Graphs [15.730933577970687]
We present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
This alignment effectively reduces the risk of disrupting intrinsic semantic information through graph-structured data augmentation.
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- Curriculum Negative Mining For Temporal Networks [33.70909189731187]
Temporal networks are effective in capturing the evolving interactions of networks over time.
CurNM is a model-aware curriculum learning framework that adaptively adjusts the difficulty of negative samples.
Our method outperforms baseline methods by a significant margin.
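The curriculum idea in the summary above — adaptively adjusting negative-sample difficulty as training progresses — can be sketched as a sampling window that slides from the easiest toward the hardest candidates. This is a hypothetical illustration of curriculum scheduling, not CurNM's actual algorithm.

```python
def curriculum_negatives(scored_candidates, difficulty, k=1):
    """Pick k negatives whose hardness matches the curriculum stage.

    `scored_candidates` is a list of (model_score, node) pairs; a higher
    score means a harder negative. `difficulty` in [0, 1] grows over
    training, moving the window from easiest toward hardest candidates.
    """
    ranked = sorted(scored_candidates)  # easiest (lowest score) first
    n = len(ranked)
    # Centre a window of size k at the quantile given by `difficulty`.
    centre = int(difficulty * (n - 1))
    lo = max(0, min(centre - k // 2, n - k))
    return [node for _, node in ranked[lo:lo + k]]
```

Early in training (`difficulty` near 0) the sampler returns easy negatives; late in training (`difficulty` near 1) it returns the hardest ones.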
arXiv Detail & Related papers (2024-07-24T07:55:49Z)
- Diffusion-based Negative Sampling on Graphs for Link Prediction [8.691564173331924]
Link prediction is a fundamental task for graph analysis with important applications on the Web, such as social network analysis and recommendation systems.
We propose a novel multi-level negative sampling strategy that enables negative node generation with flexible and controllable "hardness" levels from the latent space.
Our method, called Conditional Diffusion-based Multi-level Negative Sampling (DMNS), leverages the Markov chain property of diffusion models to generate negative nodes in multiple levels of variable hardness.
arXiv Detail & Related papers (2024-03-25T23:07:31Z)
- Efficient Link Prediction via GNN Layers Induced by Negative Sampling [92.05291395292537]
Graph neural networks (GNNs) for link prediction can loosely be divided into two broad categories.
First, node-wise architectures pre-compute individual embeddings for each node that are later combined by a simple decoder to make predictions.
Second, edge-wise methods rely on the formation of edge-specific subgraph embeddings to enrich the representation of pair-wise relationships.
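The node-wise category described above can be illustrated with a toy decoder: embeddings are computed once per node, and any pair is then scored with a cheap combination function. The dot-product decoder here is a common textbook choice, used purely as an assumption for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def node_wise_link_score(emb, u, v):
    """Node-wise link prediction (sketch): embeddings in `emb` are
    precomputed once per node, then a simple decoder -- here a dot
    product squashed through a sigmoid -- scores any pair on demand."""
    return sigmoid(sum(a * b for a, b in zip(emb[u], emb[v])))
```

Because the expensive GNN pass happens only once per node, this style scales well when many candidate pairs must be scored; edge-wise methods trade that efficiency for richer pair-specific representations.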
arXiv Detail & Related papers (2023-10-14T07:02:54Z)
- Relation-Aware Network with Attention-Based Loss for Few-Shot Knowledge Graph Completion [9.181270251524866]
Current approaches randomly select one negative sample for each reference entity pair to minimize a margin-based ranking loss.
We propose RANA, a novel Relation-Aware Network with Attention-Based Loss framework.
Experiments demonstrate that RANA outperforms the state-of-the-art models on two benchmark datasets.
arXiv Detail & Related papers (2023-06-15T21:41:43Z)
- Energy-based Out-of-Distribution Detection for Graph Neural Networks [76.0242218180483]
We propose a simple, powerful and efficient OOD detection model for GNN-based learning on graphs, which we call GNNSafe.
GNNSafe achieves up to 17.0% AUROC improvement over the state of the art and can serve as a simple yet strong baseline in this under-developed area.
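The generic energy-based OOD score underlying this line of work can be sketched in a few lines: the negative log-sum-exp of a classifier's logits, where lower energy indicates in-distribution inputs. This is the standard energy score, not GNNSafe's exact graph-aware formulation.

```python
import math

def energy_score(logits, T=1.0):
    """Energy score E(x) = -T * logsumexp(f(x) / T) over class logits.

    Confident (peaked) logits yield low energy, suggesting the input is
    in-distribution; flat logits yield higher energy, flagging possible
    OOD inputs. The max-subtraction keeps the log-sum-exp stable."""
    m = max(l / T for l in logits)
    return -T * (m + math.log(sum(math.exp(l / T - m) for l in logits)))
```

A confidently classified node (one dominant logit) receives much lower energy than a node with uniform logits, so thresholding the energy separates the two regimes.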
arXiv Detail & Related papers (2023-02-06T16:38:43Z)
- STERLING: Synergistic Representation Learning on Bipartite Graphs [78.86064828220613]
A fundamental challenge of bipartite graph representation learning is how to extract node embeddings.
Most recent bipartite graph SSL methods are based on contrastive learning which learns embeddings by discriminating positive and negative node pairs.
We introduce a novel synergistic representation learning model (STERLING) to learn node embeddings without negative node pairs.
arXiv Detail & Related papers (2023-01-25T03:21:42Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Structure Aware Negative Sampling in Knowledge Graphs [18.885368822313254]
A crucial aspect of contrastive learning approaches is the choice of corruption distribution that generates hard negative samples.
We propose Structure Aware Negative Sampling (SANS), an inexpensive negative sampling strategy that utilizes the rich graph structure by selecting negative samples from a node's k-hop neighborhood.
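The k-hop candidate pool described above can be sketched with a plain breadth-first search: nodes reachable within k hops that are not already linked to the anchor make structurally close, hence hard, negatives. The adjacency-dict representation is an assumption for illustration, not SANS's implementation.

```python
from collections import deque

def k_hop_negatives(adj, node, k, positives):
    """SANS-style candidate pool (sketch): nodes within k hops of `node`
    that are NOT among its known positives serve as hard, structure-aware
    negatives. `adj` is an adjacency dict {node: set(neighbours)}."""
    seen, frontier = {node}, deque([(node, 0)])
    reachable = set()
    while frontier:
        cur, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand beyond k hops
        for nb in adj.get(cur, ()):
            if nb not in seen:
                seen.add(nb)
                reachable.add(nb)
                frontier.append((nb, depth + 1))
    return reachable - positives - {node}
```

On a path graph a-b-c-d, the 2-hop neighbourhood of `a` contains b and c; removing the known positive b leaves c as the hard negative.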
arXiv Detail & Related papers (2020-09-23T19:57:00Z)
- Understanding Negative Sampling in Graph Representation Learning [87.35038268508414]
We show that negative sampling is as important as positive sampling in determining the optimization objective and the resulting variance.
We propose Markov chain Monte Carlo negative sampling (MCNS), which approximates the positive distribution with self-contrast approximation and accelerates negative sampling via Metropolis-Hastings.
We evaluate our method on 5 datasets that cover extensive downstream graph learning tasks, including link prediction, node classification and personalized recommendation.
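The Metropolis-Hastings acceleration mentioned above can be sketched as a short chain over candidate negatives: with a symmetric proposal, each move is accepted with the ratio of unnormalised weights, so the chain converges to sampling proportional to those weights without computing a partition function. This generic MH walk is an illustrative assumption, not MCNS's full procedure.

```python
import random

def mh_negative_chain(nodes, weight, steps=100, seed=0):
    """Metropolis-Hastings walk over candidate negatives (sketch).

    Proposals are drawn uniformly (symmetric), so the acceptance
    probability reduces to min(1, w(proposal) / w(current)); the chain's
    stationary distribution samples negatives proportional to `weight`."""
    rng = random.Random(seed)
    current = rng.choice(nodes)
    samples = []
    for _ in range(steps):
        proposal = rng.choice(nodes)
        if rng.random() < min(1.0, weight(proposal) / weight(current)):
            current = proposal
        samples.append(current)
    return samples
```

With two candidates weighted 1 and 3, roughly three quarters of the drawn samples land on the heavier candidate, matching the target distribution without any normalisation.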
arXiv Detail & Related papers (2020-05-20T06:25:21Z)
- Reinforced Negative Sampling over Knowledge Graph for Recommendation [106.07209348727564]
We develop a new negative sampling model, Knowledge Graph Policy Network (kgPolicy), which works as a reinforcement learning agent to explore high-quality negatives.
kgPolicy navigates from the target positive interaction, adaptively receives knowledge-aware negative signals, and ultimately yields a potential negative item to train the recommender.
arXiv Detail & Related papers (2020-03-12T12:44:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.