AsyncET: Asynchronous Learning for Knowledge Graph Entity Typing with
Auxiliary Relations
- URL: http://arxiv.org/abs/2308.16055v1
- Date: Wed, 30 Aug 2023 14:24:16 GMT
- Title: AsyncET: Asynchronous Learning for Knowledge Graph Entity Typing with
Auxiliary Relations
- Authors: Yun-Cheng Wang, Xiou Ge, Bin Wang, C.-C. Jay Kuo
- Abstract summary: We improve the expressiveness of knowledge graph embedding (KGE) methods by introducing multiple auxiliary relations.
Similar entity types are grouped to reduce the number of auxiliary relations and improve their capability to model entity-type patterns with different granularities.
Experiments are conducted on two commonly used KGET datasets to show that the performance of KGE methods on the KGET task can be substantially improved.
- Score: 42.16033541753744
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Knowledge graph entity typing (KGET) is a task to predict the missing entity
types in knowledge graphs (KG). Previously, KG embedding (KGE) methods tried to
solve the KGET task by introducing an auxiliary relation, 'hasType', to model
the relationship between entities and their types. However, a single auxiliary
relation has limited expressiveness for diverse entity-type patterns. We
improve the expressiveness of KGE methods by introducing multiple auxiliary
relations in this work. Similar entity types are grouped to reduce the number
of auxiliary relations and improve their capability to model entity-type
patterns with different granularities. With the presence of multiple auxiliary
relations, we propose a method adopting an Asynchronous learning scheme for
Entity Typing, named AsyncET, which updates the entity and type embeddings
alternately to keep the learned entity embeddings up-to-date and informative
for entity type prediction. Experiments are conducted on two commonly used KGET
datasets to show that the performance of KGE methods on the KGET task can be
substantially improved by the proposed multiple auxiliary relations and
asynchronous embedding learning. Furthermore, our method has a significant
advantage over state-of-the-art methods in model sizes and time complexity.
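The two ingredients described in the abstract, entity-type pairs cast as triples under multiple auxiliary "hasType" relations and alternating (asynchronous) updates of entity and type embeddings, can be pictured with a short sketch. The snippet below is a minimal illustration only, not the authors' released code: the TransE-style distance score, the margin value, and the way the auxiliary-relation index is supplied are assumptions, and the type-grouping step that assigns each type to an auxiliary relation is not shown.

```python
# Minimal sketch (assumptions noted above): entity-type pairs become triples
# (entity, hasType_k, type), and entity/type embeddings are updated in
# alternating phases so entity embeddings stay current for type prediction.
import torch
import torch.nn as nn

class AuxRelKGE(nn.Module):
    """Embeddings for entities, types, and the grouped auxiliary relations hasType_k."""
    def __init__(self, n_entities, n_types, n_aux_relations, dim=64):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.typ = nn.Embedding(n_types, dim)
        self.aux = nn.Embedding(n_aux_relations, dim)  # one vector per auxiliary relation

    def score(self, e, k, t):
        # TransE-style plausibility (assumed here): smaller distance = more plausible.
        return -(self.ent(e) + self.aux(k) - self.typ(t)).norm(p=1, dim=-1)

def asynchronous_step(model, batch, optimizer, phase):
    """One phase of the alternating scheme: update entity or type embeddings only.
    The auxiliary-relation vectors stay trainable in both phases (an assumption)."""
    e, k, t, t_neg = batch  # positive (entity, hasType_k, type) plus a corrupted type
    model.ent.weight.requires_grad_(phase == "entity")
    model.typ.weight.requires_grad_(phase == "type")
    margin_loss = torch.relu(1.0 + model.score(e, k, t_neg) - model.score(e, k, t)).mean()
    optimizer.zero_grad()
    margin_loss.backward()
    optimizer.step()
    return margin_loss.item()

# Toy usage: alternate phases between epochs.
model = AuxRelKGE(n_entities=5, n_types=4, n_aux_relations=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
batch = (torch.tensor([0, 1]), torch.tensor([0, 1]),   # entity ids, auxiliary-relation ids
         torch.tensor([2, 3]), torch.tensor([1, 0]))   # true type ids, corrupted type ids
for epoch in range(4):
    asynchronous_step(model, batch, opt, phase="entity" if epoch % 2 == 0 else "type")
```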
Related papers
- Knowledge Graph Embeddings: A Comprehensive Survey on Capturing Relation Properties [5.651919225343915]
Knowledge Graph Embedding (KGE) techniques play a pivotal role in transforming symbolic Knowledge Graphs into numerical representations.
This paper addresses the complex mapping properties inherent in relations, such as one-to-one, one-to-many, many-to-one, and many-to-many mappings.
We explore innovative ideas such as integrating multimodal information into KGE, enhancing relation pattern modeling with rules, and developing models to capture relation characteristics in dynamic KGE settings.
arXiv Detail & Related papers (2024-10-16T08:54:52Z)
- A Comprehensive Study on Knowledge Graph Embedding over Relational Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z)
- Polar Ducks and Where to Find Them: Enhancing Entity Linking with Duck Typing and Polar Box Embeddings [11.164501311947124]
DUCK is an approach to infusing structural information in the space of entity representations.
We define the type of an entity based on the relations that it has with other entities in a knowledge graph.
We optimize the model to cluster entities of similar type by placing them inside the boxes corresponding to their relations.
arXiv Detail & Related papers (2023-05-19T22:42:16Z)
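As a rough illustration of the box idea in the DUCK entry above (purely generic: DUCK's polar boxes, training objective, and data handling are not reproduced here), a relation can be given an axis-aligned box and entities holding that relation pushed inside it:

```python
# Generic axis-aligned box containment penalty, for illustration only.
import torch

def box_containment_penalty(entity_vec, box_low, box_high):
    """Zero when the entity point lies inside the relation's box, positive otherwise."""
    below = torch.clamp(box_low - entity_vec, min=0.0)   # distance below the lower corner
    above = torch.clamp(entity_vec - box_high, min=0.0)  # distance above the upper corner
    return (below + above).sum(dim=-1)

entity = torch.tensor([0.2, 0.7, 0.1])
box_low = torch.tensor([0.0, 0.5, 0.0])
box_high = torch.tensor([0.5, 1.0, 0.3])
print(box_containment_penalty(entity, box_low, box_high))  # tensor(0.) -> inside the box
```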
- Prototype-based Embedding Network for Scene Graph Generation [105.97836135784794]
Current Scene Graph Generation (SGG) methods explore contextual information to predict relationships among entity pairs.
Due to the diverse visual appearance of numerous possible subject-object combinations, there is a large intra-class variation within each predicate category.
Prototype-based Embedding Network (PE-Net) models entities/predicates with prototype-aligned compact and distinctive representations.
Prototype-guided Learning (PL) is introduced to help PE-Net efficiently learn such entity-predicate matching, and Prototype Regularization (PR) is devised to relieve ambiguous entity-predicate matching.
arXiv Detail & Related papers (2023-03-13T13:30:59Z)
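The prototype matching mentioned in the PE-Net entry above can be pictured with a generic nearest-prototype scorer; this is an illustration only, and PE-Net's actual entity/predicate encoders, Prototype-guided Learning loss, and Prototype Regularization are not reproduced:

```python
# Generic prototype matching: score a feature vector against one learned prototype
# per predicate class via cosine similarity (illustrative, not PE-Net itself).
import torch
import torch.nn.functional as F

def prototype_logits(features, prototypes):
    """features: (batch, dim); prototypes: (n_classes, dim) -> cosine-similarity logits."""
    return F.normalize(features, dim=-1) @ F.normalize(prototypes, dim=-1).T

features = torch.randn(4, 16)      # stand-in for fused subject-object features
prototypes = torch.randn(5, 16)    # one prototype per predicate class
predicted = prototype_logits(features, prototypes).argmax(dim=-1)
print(predicted)                   # predicted predicate class per entity pair
```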
- Entity Type Prediction Leveraging Graph Walks and Entity Descriptions [4.147346416230273]
GRAND is a novel approach for entity typing that leverages different graph walk strategies in RDF2vec together with textual entity descriptions.
The proposed approach outperforms the baseline approaches on the benchmark datasets DBpedia and FIGER for entity typing in KGs for both fine-grained and coarse-grained classes.
arXiv Detail & Related papers (2022-07-28T13:56:55Z)
- Document-Level Relation Extraction with Reconstruction [28.593318203728963]
We propose a novel encoder-classifier-reconstructor model for document-level relation extraction (DocRE).
The reconstructor recovers the ground-truth path dependencies from the graph representation, ensuring that the proposed DocRE model pays more attention to encoding entity pairs that hold relationships during training.
Experimental results on a large-scale DocRE dataset show that the proposed model can significantly improve the accuracy of relation extraction on a strong heterogeneous graph-based baseline.
arXiv Detail & Related papers (2020-12-21T14:29:31Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- AutoETER: Automated Entity Type Representation for Knowledge Graph Embedding [40.900070190077024]
We develop a novel Knowledge Graph Embedding (KGE) framework with Automated Entity TypE Representation (AutoETER).
Our approach can model and infer all relation patterns and complex relations.
Experiments on four datasets demonstrate the superior performance of our model compared to state-of-the-art baselines on link prediction tasks.
arXiv Detail & Related papers (2020-09-25T04:27:35Z)
- AutoRC: Improving BERT Based Relation Classification Models via Architecture Search [50.349407334562045]
BERT-based relation classification (RC) models have achieved significant improvements over traditional deep learning models.
However, no consensus has been reached on the optimal architecture.
We design a comprehensive search space for BERT based RC models and employ neural architecture search (NAS) method to automatically discover the design choices.
arXiv Detail & Related papers (2020-09-22T16:55:49Z)
- Interpretable Entity Representations through Large-Scale Typing [61.4277527871572]
We present an approach to creating entity representations that are human readable and achieve high performance out of the box.
Our representations are vectors whose values correspond to posterior probabilities over fine-grained entity types.
We show that it is possible to reduce the size of our type set in a learning-based way for particular domains.
arXiv Detail & Related papers (2020-04-30T23:58:03Z)
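A minimal sketch of the general recipe in the last entry: the tiny type vocabulary, the 300-dimensional mention features, and the encoder layers below are made up for illustration, and the paper's own typing model and type inventory are not reproduced.

```python
# Hedged sketch: a mention is encoded, a linear layer over a fine-grained type
# vocabulary yields independent posterior probabilities, and that probability
# vector itself serves as an interpretable entity representation.
import torch
import torch.nn as nn

TYPE_VOCAB = ["person", "athlete", "organization", "sports_team"]  # illustrative types

class TypeProbabilityEncoder(nn.Module):
    def __init__(self, hidden_dim=32, n_types=len(TYPE_VOCAB)):
        super().__init__()
        self.mention_encoder = nn.Sequential(nn.Linear(300, hidden_dim), nn.ReLU())
        self.type_scorer = nn.Linear(hidden_dim, n_types)

    def forward(self, mention_features):
        # Each output dimension is P(type | mention); the whole vector is the representation.
        return torch.sigmoid(self.type_scorer(self.mention_encoder(mention_features)))

encoder = TypeProbabilityEncoder()
mention = torch.randn(1, 300)       # stand-in for real mention features
entity_repr = encoder(mention)      # shape: (1, len(TYPE_VOCAB))
print(dict(zip(TYPE_VOCAB, entity_repr.squeeze(0).tolist())))
```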