Knowledge Graph Self-Supervised Rationalization for Recommendation
- URL: http://arxiv.org/abs/2307.02759v1
- Date: Thu, 6 Jul 2023 03:44:40 GMT
- Title: Knowledge Graph Self-Supervised Rationalization for Recommendation
- Authors: Yuhao Yang, Chao Huang, Lianghao Xia, Chunzhen Huang
- Abstract summary: We introduce a new self-supervised rationalization method, called KGRec, for knowledge-aware recommender systems.
We propose an attentive knowledge rationalization mechanism that generates rational scores for knowledge triplets.
Experiments on three real-world datasets demonstrate that KGRec outperforms state-of-the-art methods.
- Score: 9.591334756455968
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce a new self-supervised rationalization method,
called KGRec, for knowledge-aware recommender systems. To effectively identify
informative knowledge connections, we propose an attentive knowledge
rationalization mechanism that generates rational scores for knowledge
triplets. With these scores, KGRec integrates generative and contrastive
self-supervised tasks for recommendation through rational masking. To highlight
rationales in the knowledge graph, we design a novel generative task in the
form of masking-reconstructing. By masking important knowledge with high
rational scores, KGRec is trained to rebuild and highlight useful knowledge
connections that serve as rationales. To further rationalize the effect of
collaborative interactions on knowledge graph learning, we introduce a
contrastive learning task that aligns signals from knowledge and user-item
interaction views. To keep the contrastive learning noise-resistant, edges in
both graphs that the rational scores judge to be noisy are masked. Extensive experiments
on three real-world datasets demonstrate that KGRec outperforms
state-of-the-art methods. We also provide the implementation codes for our
approach at https://github.com/HKUDS/KGRec.
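The rationalization pipeline above is concrete enough to sketch. Below is a minimal, hypothetical PyTorch sketch of the attentive rational scoring and the masking-reconstructing task, not the authors' implementation (see the linked repository for that); the additive scoring form, the mask ratio, and the names rational_scores and mask_reconstruct_loss are all illustrative assumptions.

```python
# Hedged sketch of attentive knowledge rationalization with rational masking.
# The scoring form and all names here are assumptions for illustration; the
# authors' code is at https://github.com/HKUDS/KGRec.
import torch
import torch.nn.functional as F

def rational_scores(head_emb, rel_emb, tail_emb):
    """Attention-style rational score for each (h, r, t) knowledge triplet."""
    # One plausible form: compatibility of head + relation with the tail.
    return ((head_emb + rel_emb) * tail_emb).sum(dim=-1)

def mask_reconstruct_loss(head_emb, rel_emb, tail_emb, mask_ratio=0.3):
    """Generative task: mask the highest-scored triplets, then rebuild them."""
    scores = rational_scores(head_emb, rel_emb, tail_emb)
    k = max(1, int(mask_ratio * scores.numel()))
    masked = torch.topk(scores, k).indices        # rationales = high scores
    # Train to reconstruct the masked tails from head + relation embeddings.
    pred_tail = head_emb[masked] + rel_emb[masked]
    return -F.cosine_similarity(pred_tail, tail_emb[masked], dim=-1).mean()

# Toy usage: 100 triplets with 16-dimensional embeddings.
h, r, t = (torch.randn(100, 16, requires_grad=True) for _ in range(3))
loss = mask_reconstruct_loss(h, r, t)
loss.backward()
```

The contrastive task described in the abstract would reuse the same scores in the opposite direction: edges with low rational scores are treated as potential noise and dropped before the knowledge view and the interaction view are aligned.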
Related papers
- Knowledge Graph Pruning for Recommendation [44.21660467094777]
We propose a novel approach called KGTrimmer for knowledge graph pruning tailored for recommendation.
For the collective view, we embrace the idea of collective intelligence by extracting community consensus based on abundant collaborative signals.
Next, we build an end-to-end importance-aware graph neural network, which injects filtered knowledge to enhance the distillation of valuable user-item collaborative signals.
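As a loose illustration of pruning driven by collaborative signals, the sketch below keeps only the triplets whose entities are best supported by user-item interactions; the popularity-based importance score and keep_ratio are stand-in assumptions, not the paper's community-consensus mechanism or its importance-aware GNN.

```python
# Hypothetical importance-aware pruning: keep a triplet only if its entities
# accumulate enough collaborative signal. A stand-in for KGTrimmer, not it.
from collections import Counter

def prune_kg(triplets, interactions, keep_ratio=0.5):
    """triplets: [(head, rel, tail)]; interactions: [(user, item)]."""
    item_pop = Counter(item for _, item in interactions)
    scored = [(item_pop[h] + item_pop[t], (h, r, t)) for h, r, t in triplets]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [t for _, t in scored[: int(len(scored) * keep_ratio)]]

kg = [("i1", "genre", "rock"), ("i2", "genre", "jazz"), ("i3", "era", "90s")]
clicks = [("u1", "i1"), ("u2", "i1"), ("u1", "i2")]
print(prune_kg(kg, clicks, keep_ratio=0.67))  # keeps the two best-supported triplets
```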
arXiv Detail & Related papers (2024-05-19T12:07:24Z)
- On the Sweet Spot of Contrastive Views for Knowledge-enhanced Recommendation [49.18304766331156]
We propose a new contrastive learning framework for KG-enhanced recommendation.
We construct two separate contrastive views for the knowledge graph (KG) and the user-item interaction graph (IG), and maximize their mutual information.
Extensive experimental results on three real-world datasets demonstrate the effectiveness and efficiency of our method.
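Maximizing mutual information between two views is typically realized with an InfoNCE objective; a minimal sketch follows, assuming the view encoders are given and with a hypothetical temperature of 0.2.

```python
# InfoNCE lower bound on the mutual information between the KG view and the
# interaction-graph view of the same nodes; encoder outputs are assumed given.
import torch
import torch.nn.functional as F

def infonce(z_kg, z_ig, temperature=0.2):
    """z_kg, z_ig: [n_nodes, dim] embeddings of the same nodes in two views."""
    z_kg, z_ig = F.normalize(z_kg, dim=-1), F.normalize(z_ig, dim=-1)
    logits = z_kg @ z_ig.t() / temperature   # row i scored against all nodes
    labels = torch.arange(z_kg.size(0))      # positives on the diagonal
    return F.cross_entropy(logits, labels)

loss = infonce(torch.randn(64, 32), torch.randn(64, 32))
```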
arXiv Detail & Related papers (2023-09-23T14:05:55Z)
- Knowledge Distillation via Token-level Relationship Graph [12.356770685214498]
We propose a novel method called Knowledge Distillation with Token-level Relationship Graph (TRG).
By employing TRG, the student model can effectively emulate higher-level semantic information from the teacher model.
We conduct experiments to evaluate the effectiveness of the proposed method against several state-of-the-art approaches.
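A plausible reading of token-level relational distillation: build a token-token similarity graph in both the teacher and the student and align them. The cosine graph and MSE alignment below are assumptions; the paper's exact construction may differ.

```python
# Token-level relationship-graph distillation sketch: the student mimics the
# teacher's pairwise token similarities rather than its raw features.
import torch
import torch.nn.functional as F

def relation_graph(tokens):
    """tokens: [seq_len, dim] -> [seq_len, seq_len] cosine-similarity graph."""
    t = F.normalize(tokens, dim=-1)
    return t @ t.t()

def trg_loss(student_tokens, teacher_tokens):
    # Graphs are seq_len x seq_len, so feature dimensions may differ freely.
    return F.mse_loss(relation_graph(student_tokens),
                      relation_graph(teacher_tokens))

loss = trg_loss(torch.randn(10, 48), torch.randn(10, 64))
```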
arXiv Detail & Related papers (2023-06-20T08:16:37Z)
- Similarity-weighted Construction of Contextualized Commonsense Knowledge Graphs for Knowledge-intense Argumentation Tasks [17.438104235331085]
We present a new unsupervised method for constructing Contextualized Commonsense Knowledge Graphs (CCKGs).
Our work goes beyond context-insensitive knowledge extractions by computing semantic similarity between KG triplets and textual arguments.
We demonstrate the effectiveness of CCKGs in a knowledge-insensitive argument quality rating task, outperforming strong baselines and rivaling a GPT-3 based system.
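The core step, scoring KG triplets by semantic similarity to the argument text, can be sketched as below; the bag-of-words cosine stands in for whatever sentence encoder the paper uses, and the threshold is an invented value.

```python
# Similarity-weighted triplet selection for a contextualized commonsense KG.
# The cosine over word counts is a stand-in for a real sentence encoder.
from collections import Counter
import math

def cosine(a, b):
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values()))
            * math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

def build_cckg(triplets, argument, threshold=0.2):
    verbalized = {t: " ".join(t) for t in triplets}  # naive verbalization
    return [t for t, text in verbalized.items()
            if cosine(text, argument) >= threshold]

kg = [("smoking", "causes", "cancer"), ("sun", "is_a", "star")]
print(build_cckg(kg, "smoking should be banned because it causes cancer"))
```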
arXiv Detail & Related papers (2023-05-15T09:52:36Z)
- Conditional Attention Networks for Distilling Knowledge Graphs in Recommendation [74.14009444678031]
We propose Knowledge-aware Conditional Attention Networks (KCAN) to incorporate knowledge graph into a recommender system.
We first use knowledge-aware attention propagation to obtain node representations, which capture the global semantic similarity on the user-item network and the knowledge graph.
Then, by applying a conditional attention aggregation on the subgraph, we refine the knowledge graph to obtain target-specific node representations.
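The target-specific refinement can be pictured as attention whose weights are conditioned on the query target; the dot-product scoring in this minimal sketch is an assumption.

```python
# Conditional attention aggregation sketch: neighbor weights depend on the
# target query, so the same subgraph yields target-specific representations.
import torch
import torch.nn.functional as F

def conditional_aggregate(target, neighbors):
    """target: [dim]; neighbors: [n, dim] -> target-specific node vector."""
    attn = F.softmax(neighbors @ target, dim=0)   # condition on the target
    return (attn.unsqueeze(-1) * neighbors).sum(dim=0)

node_repr = conditional_aggregate(torch.randn(16), torch.randn(5, 16))
```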
arXiv Detail & Related papers (2021-11-03T09:40:43Z)
- Distilling Holistic Knowledge with Graph Neural Networks [37.86539695906857]
Knowledge Distillation (KD) aims at transferring knowledge from a larger well-optimized teacher network to a smaller learnable student network.
Existing KD methods have mainly considered two types of knowledge, namely the individual knowledge and the relational knowledge.
We propose to distill the novel holistic knowledge based on an attributed graph constructed among instances.
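One way to read "holistic knowledge on an attributed graph among instances": aggregate each instance with its nearest neighbors in the batch before matching student to teacher. The kNN graph, mean aggregation, MSE loss, and shared feature width below are assumptions.

```python
# Holistic distillation sketch: build a kNN graph over batch instances,
# mean-aggregate each neighborhood (one GNN-style pass), match the results.
import torch
import torch.nn.functional as F

def holistic(features, k=3):
    normed = F.normalize(features, dim=-1)
    idx = (normed @ normed.t()).topk(k + 1, dim=-1).indices  # neighbors incl. self
    return features[idx].mean(dim=1)  # neighborhood-aggregated features, [n, dim]

def holistic_kd_loss(student_feats, teacher_feats):
    # Assumes student features are already projected to the teacher's width.
    return F.mse_loss(holistic(student_feats), holistic(teacher_feats))

loss = holistic_kd_loss(torch.randn(32, 64), torch.randn(32, 64))
```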
arXiv Detail & Related papers (2021-08-12T02:47:59Z)
- KompaRe: A Knowledge Graph Comparative Reasoning System [85.72488258453926]
This paper introduces comparative reasoning over knowledge graphs, which aims to infer the commonality and inconsistency among multiple clues.
We develop KompaRe, the first of its kind prototype system that provides comparative reasoning capability over large knowledge graphs.
arXiv Detail & Related papers (2020-11-06T04:57:37Z)
- All About Knowledge Graphs for Actions [82.39684757372075]
We work toward a better understanding of knowledge graphs (KGs) that can be utilized for zero-shot and few-shot action recognition.
We study three different construction mechanisms for KGs: action embeddings, action-object embeddings, and visual embeddings.
We present extensive analysis of the impact of different KGs on different experimental setups.
arXiv Detail & Related papers (2020-08-28T01:44:01Z)
- Knowledge Distillation Meets Self-Supervision [109.6400639148393]
Knowledge distillation involves extracting "dark knowledge" from a teacher network to guide the learning of a student network.
We show that the seemingly different self-supervision task can serve as a simple yet powerful solution.
By exploiting the similarity between those self-supervision signals as an auxiliary task, one can effectively transfer the hidden information from the teacher to the student.
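The "similarity between self-supervision signals" idea can be sketched as the student mimicking the teacher's cross-view similarity matrix; the linear encoders and noise-based "augmentation" below are placeholders, not the paper's setup.

```python
# Auxiliary self-supervised KD sketch: match the student's input-vs-augmented
# similarity structure to the teacher's. Encoders/augmentations are stand-ins.
import torch
import torch.nn.functional as F

def ss_similarities(encode, x, x_aug):
    z, z_aug = F.normalize(encode(x), dim=-1), F.normalize(encode(x_aug), dim=-1)
    return z @ z_aug.t()   # [batch, batch] cross-view similarity matrix

def ss_kd_loss(student, teacher, x, x_aug):
    with torch.no_grad():
        target = ss_similarities(teacher, x, x_aug)
    return F.mse_loss(ss_similarities(student, x, x_aug), target)

student, teacher = torch.nn.Linear(128, 32), torch.nn.Linear(128, 64)
x = torch.randn(16, 128)
loss = ss_kd_loss(student, teacher, x, x + 0.1 * torch.randn_like(x))
```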
arXiv Detail & Related papers (2020-06-12T12:18:52Z)
- Dynamic Knowledge embedding and tracing [18.717482292051788]
We propose a novel approach to knowledge tracing that combines techniques from matrix factorization with recent progress in recurrent neural networks (RNNs).
The proposed DynEmb framework enables the tracking of student knowledge even without concept/skill tag information.
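A guess at what "matrix factorization plus RNNs, no skill tags" might look like: factorized question embeddings feed a GRU that tracks a student state, and a dot product predicts correctness. Every dimension and the update form are invented for illustration.

```python
# Knowledge-tracing sketch: a GRU evolves a student state from (question,
# outcome) pairs; factorized question embeddings give per-question predictions.
import torch

n_questions, dim = 50, 16
q_emb = torch.nn.Embedding(n_questions, dim)   # factorized question side
gru = torch.nn.GRUCell(dim + 1, dim)           # input: question emb + outcome

state = torch.zeros(1, dim)                    # evolving student embedding
for q, correct in [(3, 1.0), (7, 0.0), (3, 1.0)]:   # one student's history
    x = torch.cat([q_emb(torch.tensor([q])), torch.tensor([[correct]])], dim=-1)
    state = gru(x, state)
p_next = torch.sigmoid((state * q_emb.weight).sum(-1))  # P(correct) per question
```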
arXiv Detail & Related papers (2020-05-18T21:56:42Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to sidestep the cumbersome curation of training instances for every new relation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
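The text-to-KG bridge can be sketched as a GAN pair: a generator maps a relation's text-description embedding into the KG embedding space, and a discriminator tells generated relation embeddings from trained ones. Architectures and dimensions below are invented for illustration.

```python
# Zero-shot relation embedding sketch: G maps text embeddings to the KG space;
# once trained, G(text) supplies embeddings for unseen relations.
import torch

text_dim, kg_dim = 300, 64
G = torch.nn.Sequential(torch.nn.Linear(text_dim, 128), torch.nn.ReLU(),
                        torch.nn.Linear(128, kg_dim))
D = torch.nn.Sequential(torch.nn.Linear(kg_dim, 64), torch.nn.ReLU(),
                        torch.nn.Linear(64, 1))

text_emb = torch.randn(8, text_dim)   # description embeddings of 8 seen relations
real_rel = torch.randn(8, kg_dim)     # their trained KG embeddings
fake_rel = G(text_emb)
bce = torch.nn.functional.binary_cross_entropy_with_logits
d_loss = (bce(D(real_rel), torch.ones(8, 1))
          + bce(D(fake_rel.detach()), torch.zeros(8, 1)))
g_loss = bce(D(fake_rel), torch.ones(8, 1))   # generator tries to fool D
```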
arXiv Detail & Related papers (2020-01-08T01:19:08Z)