Learning Relation-Specific Representations for Few-shot Knowledge Graph Completion
- URL: http://arxiv.org/abs/2203.11639v1
- Date: Tue, 22 Mar 2022 11:45:48 GMT
- Title: Learning Relation-Specific Representations for Few-shot Knowledge Graph Completion
- Authors: Yuling Li, Kui Yu, Yuhong Zhang, and Xindong Wu
- Abstract summary: We propose a Relation-Specific Context Learning (RSCL) framework, which exploits graph contexts of triples to capture semantic information of relations and entities simultaneously.
Experimental results on two public datasets demonstrate that RSCL outperforms state-of-the-art FKGC methods.
- Score: 24.880078645503417
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent years have witnessed increasing interest in few-shot knowledge graph
completion (FKGC), which aims to infer unseen query triples for a few-shot
relation using a handful of reference triples of the relation. The primary
focus of existing FKGC methods lies in learning the relation representations
that can reflect the common information shared by the query and reference
triples. To this end, these methods learn the embeddings of entities with their
direct neighbors, and use the concatenation of the entity embeddings as the
relation representations. However, the entity embeddings learned only from
direct neighborhoods may have low expressiveness when the entity has sparse
neighbors or shares a common local neighborhood with other entities. Moreover,
the embeddings of two entities are insufficient to represent the semantic
information of their relationship, especially when they have multiple
relations. To address these issues, we propose a Relation-Specific Context
Learning (RSCL) framework, which exploits graph contexts of triples to capture
the semantic information of relations and entities simultaneously.
Specifically, we first extract graph contexts for each triple, which can
provide long-term entity-relation dependencies. To model the graph contexts, we
then develop a hierarchical relation-specific learner to learn global and local
relation-specific representations for relations by capturing contextualized
information of triples and incorporating local information of entities.
Finally, we utilize the learned representations to predict the likelihood of
the query triples. Experimental results on two public datasets demonstrate that
RSCL outperforms state-of-the-art FKGC methods.
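The abstract contrasts RSCL with the common FKGC baseline in which a few-shot relation is represented by concatenating the embeddings of the head and tail entities of its reference triples. A minimal sketch of that baseline scoring scheme (not RSCL itself) is shown below; the entity names, toy embeddings, and cosine-similarity scorer are all illustrative assumptions, not details from the paper.

```python
import numpy as np

# Sketch of the baseline few-shot KGC scoring the abstract describes:
# a few-shot relation is represented by the averaged concatenation of
# head/tail entity embeddings from its reference triples, and a query
# triple is scored by cosine similarity against that representation.
# All names and the embedding table below are hypothetical toy data.

rng = np.random.default_rng(0)
dim = 8
entities = ["paris", "france", "tokyo", "japan", "berlin", "italy"]
emb = {e: rng.normal(size=dim) for e in entities}  # toy entity embeddings

def pair_repr(head, tail):
    """Concatenate head and tail entity embeddings into one vector."""
    return np.concatenate([emb[head], emb[tail]])

def relation_repr(references):
    """Average the pair representations of the reference triples."""
    return np.mean([pair_repr(h, t) for h, t in references], axis=0)

def score(query, references):
    """Cosine similarity between the query pair and the relation."""
    q, r = pair_repr(*query), relation_repr(references)
    return float(q @ r / (np.linalg.norm(q) * np.linalg.norm(r)))

refs = [("paris", "france"), ("tokyo", "japan")]  # reference triples
s = score(("berlin", "italy"), refs)              # query triple
assert -1.0 <= s <= 1.0
```

This sketch also makes the paper's criticism concrete: the relation representation depends only on the two entity embeddings, so entities with sparse or shared neighborhoods, or entity pairs linked by multiple relations, are not distinguished.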
Related papers
- OneNet: A Fine-Tuning Free Framework for Few-Shot Entity Linking via Large Language Model Prompting [49.655711022673046]
OneNet is an innovative framework that utilizes the few-shot learning capabilities of Large Language Models (LLMs) without the need for fine-tuning.
OneNet is structured around three key components prompted by LLMs: (1) an entity reduction processor that simplifies inputs by summarizing and filtering out irrelevant entities, (2) a dual-perspective entity linker that combines contextual cues and prior knowledge for precise entity linking, and (3) an entity consensus judger that employs a unique consistency algorithm to alleviate the hallucination in the entity linking reasoning.
arXiv Detail & Related papers (2024-10-10T02:45:23Z)
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
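The subword-composition idea behind BytE can be illustrated with a toy sketch: embed byte-pair-style subword units instead of whole entities and relations, and compose them, so even unseen names receive a vector and subword embeddings are reused (weight tying). The "BPE" below is a stand-in fixed-length character split and all names are made up; this is not the paper's implementation.

```python
import numpy as np

# Toy sketch of composing a triple embedding from subword units.
# Subword embeddings are created lazily and shared across all tokens
# that contain them, illustrating the weight tying the blurb mentions.
rng = np.random.default_rng(1)
dim = 4
subword_emb = {}  # shared subword embedding table

def bpe(token):
    """Stand-in for byte-pair encoding: fixed 3-char chunks."""
    return [token[i:i + 3] for i in range(0, len(token), 3)]

def embed(token):
    """Sum the embeddings of a token's subword units."""
    vecs = []
    for sub in bpe(token):
        if sub not in subword_emb:
            subword_emb[sub] = rng.normal(size=dim)
        vecs.append(subword_emb[sub])
    return np.sum(vecs, axis=0)

def triple_embedding(head, relation, tail):
    """Concatenate the composed embeddings of h, r, and t."""
    return np.concatenate([embed(head), embed(relation), embed(tail)])

v = triple_embedding("paris", "capital_of", "france")
assert v.shape == (3 * dim,)
```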
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
- Mutually Guided Few-shot Learning for Relational Triple Extraction [10.539566491939844]
We propose a Mutually Guided Few-shot learning framework for Triple Extraction (MG-FTE).
Our method consists of an entity-guided relation-decoder to classify relations and a proto-decoder to extract entities.
Our method outperforms many state-of-the-art methods by 12.6 F1 points on FewRel 1.0 (single domain) and 20.5 F1 points on FewRel 2.0 (cross-domain).
arXiv Detail & Related papers (2023-06-23T06:15:54Z)
- More than Classification: A Unified Framework for Event Temporal Relation Extraction [61.44799147458621]
Event temporal relation extraction (ETRE) is usually formulated as a multi-label classification task.
We observe that all relations can be interpreted using the start and end time points of events.
We propose a unified event temporal relation extraction framework, which transforms temporal relations into logical expressions of time points.
arXiv Detail & Related papers (2023-05-28T02:09:08Z)
- Proton: Probing Schema Linking Information from Pre-trained Language Models for Text-to-SQL Parsing [66.55478402233399]
We propose a framework to elicit relational structures via a probing procedure based on the Poincaré distance metric.
Compared with commonly-used rule-based methods for schema linking, we found that probing relations can robustly capture semantic correspondences.
Our framework sets new state-of-the-art performance on three benchmarks.
arXiv Detail & Related papers (2022-06-28T14:05:25Z)
- RAGA: Relation-aware Graph Attention Networks for Global Entity Alignment [14.287681294725438]
We propose a novel framework based on Relation-aware Graph Attention Networks to capture the interactions between entities and relations.
Our framework adopts the self-attention mechanism to spread entity information to the relations and then aggregate relation information back to entities.
arXiv Detail & Related papers (2021-03-01T06:30:51Z)
- Context-Enhanced Entity and Relation Embedding for Knowledge Graph Completion [2.580765958706854]
We propose a model named AggrE, which conducts efficient aggregations on entity context and relation context in multi-hops.
Experiment results show that AggrE is competitive to existing models.
arXiv Detail & Related papers (2020-12-13T09:20:42Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Relation of the Relations: A New Paradigm of the Relation Extraction Problem [52.21210549224131]
We propose a new paradigm of Relation Extraction (RE) that considers as a whole the predictions of all relations in the same context.
We develop a data-driven approach that does not require hand-crafted rules but learns by itself the relation of relations (RoR) using Graph Neural Networks and a relation matrix transformer.
Experiments show that our model outperforms the state-of-the-art approaches by +1.12% on the ACE05 dataset and +2.55% on SemEval 2018 Task 7.2.
arXiv Detail & Related papers (2020-06-05T22:25:27Z)
- On Embeddings in Relational Databases [11.52782249184251]
We address the problem of learning a distributed representation of entities in a relational database using a low-dimensional embedding.
Recent embedding methods take a naive approach: they fully denormalize the database by materializing the full join of all tables and representing the result as a knowledge graph.
In this paper we demonstrate a better methodology for learning representations by exploiting the underlying semantics of columns in a table, the relation joins, and the latent inter-row relationships.
arXiv Detail & Related papers (2020-05-13T17:21:27Z)
- Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
arXiv Detail & Related papers (2020-02-17T03:33:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.