EntQA: Entity Linking as Question Answering
- URL: http://arxiv.org/abs/2110.02369v1
- Date: Tue, 5 Oct 2021 21:39:57 GMT
- Title: EntQA: Entity Linking as Question Answering
- Authors: Wenzheng Zhang, Wenyue Hua, Karl Stratos
- Abstract summary: We present EntQA, which stands for Entity linking as Question Answering.
Our approach combines progress in entity linking with that in open-domain question answering.
Unlike in previous works, we do not rely on a mention-candidates dictionary or large-scale weak supervision.
- Score: 18.39360849304263
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A conventional approach to entity linking is to first find mentions in a
given document and then infer their underlying entities in the knowledge base.
A well-known limitation of this approach is that it requires finding mentions
without knowing their entities, which is unnatural and difficult. We present a
new model that does not suffer from this limitation called EntQA, which stands
for Entity linking as Question Answering. EntQA first proposes candidate
entities with a fast retrieval module, and then scrutinizes the document to
find mentions of each candidate with a powerful reader module. Our approach
combines progress in entity linking with that in open-domain question answering
and capitalizes on pretrained models for dense entity retrieval and reading
comprehension. Unlike in previous works, we do not rely on a mention-candidates
dictionary or large-scale weak supervision. EntQA achieves strong results on
the GERBIL benchmarking platform.
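The pipeline described in the abstract has two stages: a retriever that proposes candidate entities for the whole passage, then a reader that searches the passage for mentions of each candidate. Below is a minimal, illustrative sketch of that retrieve-then-read flow; the hashed bag-of-words encoder, the tiny entity list, and the overlap-based "reader" are placeholders standing in for the pretrained dense retriever and reading-comprehension modules the paper actually uses.
```python
# Minimal, illustrative sketch of the retrieve-then-read flow described in the
# abstract. The hash-based encoder, the tiny entity list, and the overlap-based
# "reader" below are placeholders, not the pretrained EntQA modules.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Placeholder dense encoder: hashed bag-of-words, L2-normalized."""
    vec = np.zeros(dim)
    for tok in text.lower().split():
        vec[hash(tok) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# Stage 1: fast retrieval -- propose candidate entities for the whole passage
# via inner product between the passage embedding and precomputed entity embeddings.
entities = ["Barack Obama", "Chicago", "United States Senate", "Honolulu"]
entity_matrix = np.stack([embed(e) for e in entities])

document = "Obama served in the Senate before moving to Chicago."
doc_vec = embed(document)
scores = entity_matrix @ doc_vec
top_k = np.argsort(-scores)[:3]        # keep the K highest-scoring candidates

# Stage 2: reader -- for each candidate entity, scan the document for mention
# spans (here: naive token overlap instead of a trained reading-comprehension model).
tokens = document.replace(".", "").split()
for idx in top_k:
    entity = entities[int(idx)]
    mentions = [t for t in tokens if t.lower() in entity.lower()]
    print(f"{entity:>22}  retrieval={scores[idx]:.3f}  mentions={mentions}")
```
Reversing the conventional order (entities first, mentions second) means the reader always knows which entity it is looking for, which sidesteps the difficulty the abstract highlights of finding mentions before their entities are known.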
Related papers
- Entity Disambiguation via Fusion Entity Decoding [68.77265315142296]
We propose an encoder-decoder model to disambiguate entities with more detailed entity descriptions.
We observe a +1.5% improvement in end-to-end entity linking on the GERBIL benchmark compared with EntQA.
arXiv Detail & Related papers (2024-04-02T04:27:54Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for the multi-hop KGQA task that unifies retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Open-domain Question Answering via Chain of Reasoning over Heterogeneous Knowledge [82.5582220249183]
We propose a novel open-domain question answering (ODQA) framework for answering single/multi-hop questions across heterogeneous knowledge sources.
Unlike previous methods that solely rely on the retriever for gathering all evidence in isolation, our intermediary performs a chain of reasoning over the retrieved set.
Our system achieves competitive performance on two ODQA datasets, OTT-QA and NQ, reasoning over tables and passages from Wikipedia.
arXiv Detail & Related papers (2022-10-22T03:21:32Z)
- Improving Candidate Retrieval with Entity Profile Generation for Wikidata Entity Linking [76.00737707718795]
We propose a novel candidate retrieval paradigm based on entity profiling.
We use the profile to query the indexed search engine to retrieve candidate entities.
Our approach complements the traditional approach of using a Wikipedia anchor-text dictionary.
arXiv Detail & Related papers (2022-02-27T17:38:53Z)
- Autoregressive Entity Retrieval [55.38027440347138]
Entities are at the center of how we represent and aggregate knowledge.
The ability to retrieve such entities given a query is fundamental for knowledge-intensive tasks such as entity linking and open-domain question answering.
We propose GENRE, the first system that retrieves entities by generating their unique names, left to right, token-by-token in an autoregressive fashion (a toy sketch of this constrained decoding appears after this list).
arXiv Detail & Related papers (2020-10-02T10:13:31Z)
- Tradeoffs in Sentence Selection Techniques for Open-Domain Question Answering [54.541952928070344]
We describe two groups of models for sentence selection: QA-based approaches, which run a full-fledged QA system to identify answer candidates, and retrieval-based models, which find parts of each passage specifically related to each question.
We show that very lightweight QA models can do well at this task, but retrieval-based models are faster still.
arXiv Detail & Related papers (2020-09-18T23:39:15Z)
- NeuralQA: A Usable Library for Question Answering (Contextual Query Expansion + BERT) on Large Datasets [0.6091702876917281]
NeuralQA is a library for Question Answering (QA) on large datasets.
It integrates with existing infrastructure (e.g., ElasticSearch instances and reader models trained with the HuggingFace Transformers API) and offers helpful defaults for QA subtasks.
Code and documentation for NeuralQA are available as open source on GitHub.
arXiv Detail & Related papers (2020-07-30T03:38:30Z)
- Message Passing Query Embedding [4.035753155957698]
We propose a graph neural network to encode a graph representation of a query.
We show that the model learns entity embeddings that capture the notion of entity type without explicit supervision.
arXiv Detail & Related papers (2020-02-06T17:40:01Z)
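As referenced in the Autoregressive Entity Retrieval (GENRE) entry above, the core idea is to identify an entity by generating its unique name token by token, with decoding constrained so that only valid entity names can be produced. The toy sketch below illustrates that constrained decoding with a prefix trie; the placeholder scorer and the tiny entity list are invented for illustration and are not the GENRE model.
```python
# Toy illustration of the autoregressive entity retrieval idea summarized in the
# GENRE entry above: pick an entity by generating its unique name token by token,
# with decoding constrained to a prefix trie of valid names. The scorer and the
# tiny entity list are invented placeholders, not the actual GENRE model.
from collections import defaultdict

ENTITY_NAMES = ["Barack Obama", "Barack Obama Sr.", "Chicago", "Chicago Bulls"]

def build_trie(names):
    """Map each token prefix of a valid name to the tokens allowed to follow it."""
    trie = defaultdict(set)
    for name in names:
        tokens = name.split() + ["<eos>"]
        for i in range(len(tokens)):
            trie[tuple(tokens[:i])].add(tokens[i])
    return trie

def score(token, query):
    """Placeholder autoregressive scorer: favor tokens that appear in the query."""
    if token == "<eos>":
        return 0.5                      # mild preference for stopping on a tie
    return 1.0 if token.lower().strip(".") in query.lower() else 0.0

def generate(query, trie):
    """Greedy decoding constrained to the trie, so the output is a valid name."""
    prefix = []
    while True:
        allowed = trie[tuple(prefix)]   # only continuations of a real entity name
        best = max(allowed, key=lambda t: score(t, query))
        if best == "<eos>":
            return " ".join(prefix)
        prefix.append(best)

trie = build_trie(ENTITY_NAMES)
print(generate("barack obama was born in honolulu", trie))   # -> Barack Obama
```
In the real system a pretrained sequence-to-sequence model supplies the token scores; the trie constraint is what guarantees the generated string is an exact entity name, so no separate mention-candidates dictionary is required.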
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.