Cross-Domain Neural Entity Linking
- URL: http://arxiv.org/abs/2210.15616v1
- Date: Wed, 28 Sep 2022 15:22:31 GMT
- Title: Cross-Domain Neural Entity Linking
- Authors: Hassan Soliman
- Abstract summary: We propose a Cross-Domain Neural Entity Linking framework (CDNEL)
Our objective is to have a single system that enables simultaneous linking to both the general-domain KB and the domain-specific KB.
The proposed framework uses different types of datasets for fine-tuning, resulting in different model variants of CDNEL.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Entity Linking is the task of matching a mention to an entity in a given
knowledge base (KB). It contributes to annotating the massive number of
documents on the Web to harness new facts about the matched entities. However,
existing Entity Linking systems focus on models that are typically
domain-dependent and robust only on the particular knowledge base on which they
were trained; their performance degrades when they are evaluated on documents
and knowledge bases from other domains.
Approaches based on pre-trained language models, such as Wu et al. (2020),
attempt to solve the problem using a zero-shot setup and show promise when
evaluated on a general-domain KB. Nevertheless, their performance does not
carry over to a domain-specific KB. To allow for more accurate Entity Linking
across different domains, we propose our framework: Cross-Domain Neural Entity
Linking (CDNEL). Our objective is to have a single system that enables
simultaneous linking to both the general-domain KB and the domain-specific KB.
CDNEL works by learning a joint representation space for these knowledge bases
from different domains (a minimal sketch of this idea follows the abstract). It
is evaluated using the external Entity Linking dataset (Zeshel) constructed by
Logeswaran et al. (2019) and the Reddit dataset collected by Botzer et al.
(2021), to compare the proposed method against state-of-the-art results. The
proposed framework uses different types of datasets for fine-tuning, resulting
in different model variants of CDNEL. When evaluated on four domains included
in the Zeshel dataset, these variants achieve an average precision gain of 9%.
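The joint representation space follows the bi-encoder formulation of Wu et al. (2020): mentions in context and entity descriptions are embedded by two encoders into one shared space, and linking reduces to nearest-neighbor search by dot product. The sketch below illustrates that setup; the toy EmbeddingBag encoders, dimensions, and random data are illustrative stand-ins, not details from the paper.

```python
# Hedged sketch of the bi-encoder entity linking setup CDNEL builds on
# (Wu et al., 2020). Encoders, sizes, and data are illustrative assumptions.
import torch
import torch.nn.functional as F
from torch import nn

class BiEncoder(nn.Module):
    """Embeds mentions (in context) and entity descriptions into a shared
    space; linking a mention is nearest-neighbor search over entities."""

    def __init__(self, vocab_size=30522, dim=256):
        super().__init__()
        # Stand-ins for two pre-trained transformer encoders.
        self.mention_encoder = nn.EmbeddingBag(vocab_size, dim)
        self.entity_encoder = nn.EmbeddingBag(vocab_size, dim)

    def score(self, mention_ids, entity_ids):
        m = F.normalize(self.mention_encoder(mention_ids), dim=-1)
        e = F.normalize(self.entity_encoder(entity_ids), dim=-1)
        # One score per (mention, candidate entity) pair; entities from the
        # general-domain and domain-specific KBs share this space.
        return m @ e.T

model = BiEncoder()
mentions = torch.randint(0, 30522, (2, 16))  # 2 mentions, 16 tokens each
entities = torch.randint(0, 30522, (5, 32))  # 5 candidate entity descriptions
scores = model.score(mentions, entities)     # shape (2, 5)
predicted = scores.argmax(dim=-1)            # linked entity index per mention
```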
Related papers
- Heterogeneous Graph-based Framework with Disentangled Representations Learning for Multi-target Cross Domain Recommendation [7.247438542823219]
CDR (Cross-Domain Recommendation) is a critical solution to the data sparsity problem in recommender systems.
We present HGDR (Heterogeneous Graph-based Framework with Disentangled Representations Learning), an end-to-end heterogeneous network architecture.
Experiments on real-world datasets and online A/B tests show that our proposed model can transfer information among domains effectively.
arXiv Detail & Related papers (2024-07-01T02:27:54Z)
- TAL: Two-stream Adaptive Learning for Generalizable Person Re-identification [115.31432027711202]
We argue that both domain-specific and domain-invariant features are crucial for improving the generalization ability of re-id models.
We propose two-stream adaptive learning (TAL) to simultaneously model these two kinds of information.
Our framework can be applied to both single-source and multi-source domain generalization tasks.
arXiv Detail & Related papers (2021-11-29T01:27:42Z)
- Context-Conditional Adaptation for Recognizing Unseen Classes in Unseen Domains [48.17225008334873]
We propose a feature generative framework integrated with a COntext COnditional Adaptive (COCOA) Batch-Normalization layer; a minimal sketch of the conditional-normalization idea appears after this list.
The generated visual features better capture the underlying data distribution, enabling generalization to unseen classes and domains at test time.
We thoroughly evaluate and analyse our approach on the established large-scale benchmark DomainNet.
arXiv Detail & Related papers (2021-07-15T17:51:16Z)
- Cross-domain Contrastive Learning for Unsupervised Domain Adaptation [108.63914324182984]
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a fully-labeled source domain to a different unlabeled target domain.
We build upon contrastive self-supervised learning to align features so as to reduce the domain discrepancy between training and testing sets (see the contrastive-loss sketch after this list).
arXiv Detail & Related papers (2021-06-10T06:32:30Z)
- Inferring Latent Domains for Unsupervised Deep Domain Adaptation [54.963823285456925]
Unsupervised Domain Adaptation (UDA) refers to the problem of learning a model in a target domain where labeled data are not available.
This paper introduces a novel deep architecture which addresses the problem of UDA by automatically discovering latent domains in visual datasets.
We evaluate our approach on publicly available benchmarks, showing that it outperforms state-of-the-art domain adaptation methods.
arXiv Detail & Related papers (2021-03-25T14:33:33Z)
- Learning to Cluster under Domain Shift [20.00056591000625]
In this work we address the problem of transferring knowledge from a source to a target domain when both source and target data have no annotations.
Inspired by recent works on deep clustering, our approach leverages information from data gathered from multiple source domains.
We show that our method is able to automatically discover relevant semantic information even in the presence of few target samples.
arXiv Detail & Related papers (2020-08-11T12:03:01Z)
- Self-paced Contrastive Learning with Hybrid Memory for Domain Adaptive Object Re-ID [55.21702895051287]
Domain adaptive object re-ID aims to transfer the learned knowledge from the labeled source domain to the unlabeled target domain.
We propose a novel self-paced contrastive learning framework with hybrid memory.
Our method outperforms the state of the art on multiple domain adaptation tasks for object re-ID.
arXiv Detail & Related papers (2020-06-04T09:12:44Z)
- Zero-Resource Cross-Domain Named Entity Recognition [68.83177074227598]
Existing models for cross-domain named entity recognition rely on large unlabeled corpora or labeled NER training data in target domains.
We propose a cross-domain NER model that does not use any external resources.
arXiv Detail & Related papers (2020-02-14T09:04:18Z)
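For the COCOA entry above, here is a minimal sketch of a context-conditional batch-normalization layer, assuming the affine parameters are predicted from a per-sample context vector. The ContextConditionalBN name, layer shapes, and toy data are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of context-conditional batch normalization: the usual fixed
# BN affine parameters are replaced by ones predicted from a context vector,
# so the normalization adapts per domain/class. Shapes are assumptions.
import torch
from torch import nn

class ContextConditionalBN(nn.Module):
    def __init__(self, num_features, context_dim):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features, affine=False)
        self.gamma = nn.Linear(context_dim, num_features)  # predicted scale
        self.beta = nn.Linear(context_dim, num_features)   # predicted shift

    def forward(self, x, context):
        # Normalize, then apply the context-dependent affine transform.
        return self.gamma(context) * self.bn(x) + self.beta(context)

layer = ContextConditionalBN(num_features=64, context_dim=16)
features = torch.randn(32, 64)   # a batch of visual features
context = torch.randn(32, 16)    # per-sample context embedding
out = layer(features, context)   # same shape as features: (32, 64)
```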
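For the cross-domain contrastive learning entry above, here is a minimal InfoNCE-style sketch that pulls paired source/target features together and pushes all other pairs apart. The assumption that row i of each batch forms a positive cross-domain pair is for illustration only; the paper's actual pairing strategy may differ.

```python
# Hedged sketch of a cross-domain contrastive (InfoNCE-style) loss: features
# of matched samples from source and target domains are aligned, reducing
# the domain discrepancy. Positive pairing by row index is an assumption.
import torch
import torch.nn.functional as F

def cross_domain_infonce(source_feats, target_feats, temperature=0.1):
    s = F.normalize(source_feats, dim=-1)
    t = F.normalize(target_feats, dim=-1)
    logits = s @ t.T / temperature       # cosine similarity of every pair
    labels = torch.arange(s.size(0))     # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

loss = cross_domain_infonce(torch.randn(8, 128), torch.randn(8, 128))
print(f"contrastive alignment loss: {loss.item():.3f}")
```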