Polar Ducks and Where to Find Them: Enhancing Entity Linking with Duck
Typing and Polar Box Embeddings
- URL: http://arxiv.org/abs/2305.12027v2
- Date: Fri, 20 Oct 2023 13:58:55 GMT
- Title: Polar Ducks and Where to Find Them: Enhancing Entity Linking with Duck
Typing and Polar Box Embeddings
- Authors: Mattia Atzeni, Mikhail Plekhanov, Frédéric A. Dreyer, Nora
Kassner, Simone Merello, Louis Martin, Nicola Cancedda
- Abstract summary: DUCK is an approach to infusing structural information in the space of entity representations.
We define the type of an entity based on the relations that it has with other entities in a knowledge graph.
We optimize the model to cluster entities of similar type by placing them inside the boxes corresponding to their relations.
- Score: 11.164501311947124
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Entity linking methods based on dense retrieval are an efficient and widely
used solution in large-scale applications, but they fall short of the
performance of generative models, as they are sensitive to the structure of the
embedding space. In order to address this issue, this paper introduces DUCK, an
approach to infusing structural information in the space of entity
representations, using prior knowledge of entity types. Inspired by duck typing
in programming languages, we propose to define the type of an entity based on
the relations that it has with other entities in a knowledge graph. Then,
porting the concept of box embeddings to spherical polar coordinates, we
propose to represent relations as boxes on the hypersphere. We optimize the
model to cluster entities of similar type by placing them inside the boxes
corresponding to their relations. Our experiments show that our method sets new
state-of-the-art results on standard entity-disambiguation benchmarks, improves
the performance of the model by up to 7.9 F1 points, outperforms other
type-aware approaches, and matches the results of generative models with 18
times more parameters.
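The abstract's core geometric idea, representing a relation as a box in spherical polar coordinates and placing entities that hold the relation inside it, can be sketched as follows. This is a hypothetical minimal illustration, not the authors' code: the function names (`to_polar_angles`, `in_polar_box`) are invented, and DUCK trains embeddings with a differentiable objective rather than this hard containment check.

```python
import numpy as np

def to_polar_angles(x):
    """Map a unit vector in R^n to n-1 angular coordinates
    (the standard hyperspherical parameterization)."""
    angles = []
    for i in range(len(x) - 1):
        # Angle between coordinate x[i] and the norm of the remaining tail.
        r = np.linalg.norm(x[i:])
        angles.append(np.arccos(np.clip(x[i] / r, -1.0, 1.0)) if r > 0 else 0.0)
    return np.array(angles)

def in_polar_box(entity, box_min, box_max):
    """A relation 'box' is an interval per angular coordinate; an entity
    is inside iff every angle falls within its interval."""
    theta = to_polar_angles(entity / np.linalg.norm(entity))
    return bool(np.all((theta >= box_min) & (theta <= box_max)))

# Toy example: a 3-d entity embedding has 2 angular coordinates.
e = np.array([0.0, 1.0, 0.0])          # angles (pi/2, 0)
box_min = np.array([np.pi / 4, 0.0])
box_max = np.array([3 * np.pi / 4, np.pi])
print(in_polar_box(e, box_min, box_max))  # True
```

Because boxes live in angle space, containment is invariant to the vector's norm, which matches the intuition of clustering directions of entity embeddings on the hypersphere.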
Related papers
- Entity Disambiguation via Fusion Entity Decoding [68.77265315142296]
We propose an encoder-decoder model to disambiguate entities with more detailed entity descriptions.
We observe +1.5% improvements in end-to-end entity linking in the GERBIL benchmark compared with EntQA.
arXiv Detail & Related papers (2024-04-02T04:27:54Z)
- Coherent Entity Disambiguation via Modeling Topic and Categorical Dependency [87.16283281290053]
Previous entity disambiguation (ED) methods adopt a discriminative paradigm, where prediction is made based on matching scores between mention context and candidate entities.
We propose CoherentED, an ED system equipped with novel designs aimed at enhancing the coherence of entity predictions.
We achieve new state-of-the-art results on popular ED benchmarks, with an average improvement of 1.3 F1 points.
arXiv Detail & Related papers (2023-11-06T16:40:13Z)
- AsyncET: Asynchronous Learning for Knowledge Graph Entity Typing with Auxiliary Relations [42.16033541753744]
We improve the expressiveness of knowledge graph embedding (KGE) methods by introducing multiple auxiliary relations.
Similar entity types are grouped to reduce the number of auxiliary relations and improve their capability to model entity-type patterns with different granularities.
Experiments are conducted on two commonly used KGET datasets to show that the performance of KGE methods on the KGET task can be substantially improved.
arXiv Detail & Related papers (2023-08-30T14:24:16Z)
- ConE: Cone Embeddings for Multi-Hop Reasoning over Knowledge Graphs [73.86041481470261]
Cone Embeddings (ConE) is the first geometry-based query embedding model that can handle conjunction, disjunction, and negation.
ConE significantly outperforms existing state-of-the-art methods on benchmark datasets.
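The cone idea can be illustrated in two dimensions: a "cone" is an angular sector, and the complement of a sector is again a sector, which is what lets the model handle negation. This is an illustrative sketch under my own simplifications (2-d only, hard membership, invented class name `Cone`), not the ConE model itself.

```python
import numpy as np

def angle_of(v):
    """Angle of a 2-d vector in [0, 2*pi)."""
    return np.arctan2(v[1], v[0]) % (2 * np.pi)

class Cone:
    """2-d sector: a center angle and a half-aperture, both in radians."""
    def __init__(self, center, half_aperture):
        self.center = center % (2 * np.pi)
        self.half = half_aperture

    def contains(self, v):
        # Circular distance between the vector's angle and the center.
        d = np.abs((angle_of(v) - self.center + np.pi) % (2 * np.pi) - np.pi)
        return bool(d <= self.half)

    def negation(self):
        """Complement sector: the opposite center with the remaining aperture.
        Closure under complement is what makes sectors suit NOT queries."""
        return Cone(self.center + np.pi, np.pi - self.half)

c = Cone(center=0.0, half_aperture=np.pi / 4)
print(c.contains(np.array([1.0, 0.1])))              # True: near angle 0
print(c.negation().contains(np.array([-1.0, 0.0])))  # True: opposite side
```

Conjunction can then be modeled as sector intersection and disjunction as a union of sectors, giving geometric counterparts to all three logical operators the summary mentions.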
arXiv Detail & Related papers (2021-10-26T14:04:02Z)
- Entity Linking and Discovery via Arborescence-based Supervised Clustering [35.93568319872986]
We present novel training and inference procedures that fully utilize mention-to-mention affinities.
We show that this method gracefully extends to entity discovery.
We evaluate our approach on the Zero-Shot Entity Linking dataset and MedMentions, the largest publicly available biomedical dataset.
arXiv Detail & Related papers (2021-09-02T23:05:58Z)
- Modeling Fine-Grained Entity Types with Box Embeddings [32.85605894725522]
We study the ability of box embeddings to represent hierarchies of fine-grained entity type labels.
We compare our approach with a strong vector-based typing model, and observe state-of-the-art performance on several entity typing benchmarks.
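How box embeddings can represent a type hierarchy is easy to see with plain axis-aligned boxes: a supertype box geometrically contains its subtype boxes. The sketch below is a generic illustration of that idea, not the paper's model; the class name `Box` and both methods are assumptions for the example.

```python
import numpy as np

class Box:
    """Axis-aligned box: a type label as one interval per dimension."""
    def __init__(self, low, high):
        self.low = np.asarray(low, dtype=float)
        self.high = np.asarray(high, dtype=float)

    def contains(self, other):
        """Hierarchy as geometry: 'animal' entails 'dog' iff the 'dog'
        box lies entirely inside the 'animal' box."""
        return bool(np.all(self.low <= other.low) and np.all(other.high <= self.high))

    def intersection_volume(self, other):
        """Soft overlap: product of per-dimension interval overlaps,
        the quantity typically used in box-embedding training losses."""
        lo = np.maximum(self.low, other.low)
        hi = np.minimum(self.high, other.high)
        return float(np.prod(np.clip(hi - lo, 0.0, None)))

animal = Box([0.0, 0.0], [1.0, 1.0])
dog = Box([0.2, 0.1], [0.6, 0.5])
print(animal.contains(dog))             # True
print(animal.intersection_volume(dog))  # 0.16: the full area of the dog box
```

In trained models the containment is made soft and differentiable, but the geometric reading of subsumption is the same.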
arXiv Detail & Related papers (2021-01-02T00:59:10Z)
- Exploring and Evaluating Attributes, Values, and Structures for Entity Alignment [100.19568734815732]
Entity alignment (EA) aims at building a unified Knowledge Graph (KG) of rich content by linking the equivalent entities from various KGs.
Attribute triples can also provide a crucial alignment signal but have not been well explored yet.
We propose to utilize an attributed value encoder and partition the KG into subgraphs to model the various types of attribute triples efficiently.
arXiv Detail & Related papers (2020-10-07T08:03:58Z)
- AutoRC: Improving BERT Based Relation Classification Models via Architecture Search [50.349407334562045]
BERT based relation classification (RC) models have achieved significant improvements over the traditional deep learning models.
No consensus has been reached on the optimal architecture.
We design a comprehensive search space for BERT based RC models and employ neural architecture search (NAS) method to automatically discover the design choices.
arXiv Detail & Related papers (2020-09-22T16:55:49Z)
- Interpretable Entity Representations through Large-Scale Typing [61.4277527871572]
We present an approach to creating entity representations that are human readable and achieve high performance out of the box.
Our representations are vectors whose values correspond to posterior probabilities over fine-grained entity types.
We show that it is possible to reduce the size of our type set in a learning-based way for particular domains.
arXiv Detail & Related papers (2020-04-30T23:58:03Z)
- Improving Entity Linking by Modeling Latent Entity Type Information [25.33342677359822]
We propose to inject latent entity type information into the entity embeddings based on pre-trained BERT.
In addition, we integrate a BERT-based entity similarity score into the local context model of a state-of-the-art model to better capture latent entity type information.
Our model significantly outperforms the state-of-the-art entity linking models on the standard benchmark (AIDA-CoNLL).
arXiv Detail & Related papers (2020-01-06T09:18:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.