CYCLE: Cross-Year Contrastive Learning in Entity-Linking
- URL: http://arxiv.org/abs/2410.09127v1
- Date: Fri, 11 Oct 2024 09:41:54 GMT
- Title: CYCLE: Cross-Year Contrastive Learning in Entity-Linking
- Authors: Pengyu Zhang, Congfeng Cao, Klim Zaporojets, Paul Groth
- Abstract summary: We introduce CYCLE: Cross-Year Contrastive Learning for Entity-Linking.
- Score: 8.108904258003411
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graphs constantly evolve with new entities emerging, existing definitions being revised, and entity relationships changing. These changes lead to temporal degradation in entity linking models, characterized as a decline in model performance over time. To address this issue, we propose leveraging graph relationships to aggregate information from neighboring entities across different time periods. This approach enhances the ability to distinguish similar entities over time, thereby minimizing the impact of temporal degradation. We introduce \textbf{CYCLE}: \textbf{C}ross-\textbf{Y}ear \textbf{C}ontrastive \textbf{L}earning for \textbf{E}ntity-Linking. This model employs a novel graph contrastive learning method to tackle temporal performance degradation in entity linking tasks. Our contrastive learning method treats newly added graph relationships as \textit{positive} samples and newly removed ones as \textit{negative} samples. This approach helps our model effectively prevent temporal degradation, achieving a 13.90\% performance improvement over the state-of-the-art from 2023 when the time gap is one year, and a 17.79\% improvement as the gap expands to three years. Further analysis shows that CYCLE is particularly robust for low-degree entities, which are less resistant to temporal degradation due to their sparse connectivity, making them particularly suitable for our method. The code and data are made available at \url{https://github.com/pengyu-zhang/CYCLE-Cross-Year-Contrastive-Learning-in-Entity-Linking}.
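The core idea in the abstract can be illustrated with a minimal sketch (this is not the authors' code; entity names, embedding dimensions, and the temperature value are illustrative). Edges added between two knowledge-graph snapshots act as positive pairs and edges removed act as negative pairs in an InfoNCE-style contrastive loss:

```python
import math

def dot(u, v):
    """Dot-product similarity between two embedding vectors."""
    return sum(a * b for a, b in zip(u, v))

def cross_year_contrastive_loss(emb, added_edges, removed_edges, temperature=0.5):
    """InfoNCE-style loss over graph changes between two yearly snapshots.

    emb: dict mapping entity id -> embedding vector.
    added_edges: (head, tail) pairs present in the new snapshot only (positives).
    removed_edges: (head, tail) pairs present in the old snapshot only (negatives).
    """
    # Group newly removed neighbours by head entity to serve as negatives.
    negatives = {}
    for h, t in removed_edges:
        negatives.setdefault(h, []).append(t)

    loss, count = 0.0, 0
    for h, t in added_edges:
        pos = math.exp(dot(emb[h], emb[t]) / temperature)
        neg = sum(math.exp(dot(emb[h], emb[n]) / temperature)
                  for n in negatives.get(h, []))
        # Pull the newly connected pair together, push removed pairs apart.
        loss += -math.log(pos / (pos + neg))
        count += 1
    return loss / max(count, 1)

# Toy usage: entity "a" gained an edge to "b" and lost one to "c".
emb = {"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [-1.0, 0.0]}
loss = cross_year_contrastive_loss(emb, [("a", "b")], [("a", "c")])
```

Minimizing this loss increases similarity along newly added relationships while decreasing it along removed ones, which is the intuition behind using graph evolution itself as the supervision signal.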
Related papers
- TIGER: Temporally Improved Graph Entity Linker [6.111040278075022]
We introduce TIGER: a Temporally Improved Graph Entity Linker.
We enhance the learned representation, making entities
arXiv Detail & Related papers (2024-10-11T09:44:33Z) - Contrastive Learning Is Not Optimal for Quasiperiodic Time Series [4.2807943283312095]
We introduce Distilled Embedding for Almost-Periodic Time Series (DEAPS) in this paper.
DEAPS is a non-contrastive method tailored for quasiperiodic time series, such as electrocardiogram (ECG) data.
We demonstrate a notable improvement of +10% over existing SOTA methods when just a few annotated records are presented to fit a Machine Learning (ML) model.
arXiv Detail & Related papers (2024-07-24T08:02:41Z) - From Link Prediction to Forecasting: Information Loss in Batch-based Temporal Graph Learning [0.716879432974126]
We show that the suitability of common batch-oriented evaluation depends on the datasets' characteristics.
We reformulate dynamic link prediction as a link forecasting task that better accounts for temporal information present in the data.
arXiv Detail & Related papers (2024-06-07T12:45:12Z) - Relation Rectification in Diffusion Model [64.84686527988809]
We introduce a novel task termed Relation Rectification, aiming to refine the model to accurately represent a given relationship it initially fails to generate.
We propose an innovative solution utilizing a Heterogeneous Graph Convolutional Network (HGCN)
The lightweight HGCN adjusts the text embeddings generated by the text encoder, ensuring the accurate reflection of the textual relation in the embedding space.
arXiv Detail & Related papers (2024-03-29T15:54:36Z) - TempEL: Linking Dynamically Evolving and Newly Emerging Entities [50.980331847622026]
In our continuously evolving world, entities change over time and new, previously unknown entities appear.
We study how this evolutionary scenario impacts the performance on a well established entity linking (EL) task.
We introduce TempEL, an entity linking dataset that consists of time-stratified English Wikipedia snapshots from 2013 to 2022.
arXiv Detail & Related papers (2023-02-05T22:34:36Z) - Learning of Visual Relations: The Devil is in the Tails [59.737494875502215]
Visual relation learning is a long-tailed problem, due to the nature of joint reasoning about groups of objects.
In this paper, we explore an alternative hypothesis, denoted the Devil is in the Tails.
Under this hypothesis, better performance is achieved by keeping the model simple but improving its ability to cope with long-tailed distributions.
arXiv Detail & Related papers (2021-08-22T08:59:35Z) - Causal Incremental Graph Convolution for Recommender System Retraining [89.25922726558875]
Real-world recommender systems need to be regularly retrained to keep up with new data.
In this work, we consider how to efficiently retrain graph convolution network (GCN) based recommender models.
arXiv Detail & Related papers (2021-08-16T04:20:09Z) - Dynamic Hybrid Relation Network for Cross-Domain Context-Dependent
Semantic Parsing [52.24507547010127]
Cross-domain context-dependent semantic parsing is a new focus of research.
We present a dynamic graph framework that effectively models contextual utterances, tokens, database schemas, and their complicated interactions as the conversation proceeds.
The proposed framework outperforms all existing models by large margins, achieving new state-of-the-art performance on two large-scale benchmarks.
arXiv Detail & Related papers (2021-01-05T18:11:29Z) - One-shot Learning for Temporal Knowledge Graphs [49.41854171118697]
We propose a one-shot learning framework for link prediction in temporal knowledge graphs.
Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities.
Our experiments show that the proposed algorithm outperforms state-of-the-art baselines on two well-studied benchmarks.
arXiv Detail & Related papers (2020-10-23T03:24:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.