KoGNER: A Novel Framework for Knowledge Graph Distillation on Biomedical Named Entity Recognition
- URL: http://arxiv.org/abs/2503.15737v1
- Date: Wed, 19 Mar 2025 22:59:36 GMT
- Title: KoGNER: A Novel Framework for Knowledge Graph Distillation on Biomedical Named Entity Recognition
- Authors: Heming Zhang, Wenyu Li, Di Huang, Yinjie Tang, Yixin Chen, Philip Payne, Fuhai Li
- Abstract summary: Named Entity Recognition (NER) plays a crucial role in information extraction, question answering, and knowledge-based systems. Traditional deep learning-based NER models often struggle with domain-specific generalization and suffer from data sparsity issues. We introduce Knowledge Graph distilled for Named Entity Recognition (KoGNER), a novel approach that integrates Knowledge Graphs into NER models.
- Score: 19.311500689293336
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP) that plays a crucial role in information extraction, question answering, and knowledge-based systems. Traditional deep learning-based NER models often struggle with domain-specific generalization and suffer from data sparsity issues. In this work, we introduce Knowledge Graph distilled for Named Entity Recognition (KoGNER), a novel approach that integrates Knowledge Graph (KG) distillation into NER models to enhance entity recognition performance. Our framework leverages structured knowledge representations from KGs to enrich contextual embeddings, thereby improving entity classification and reducing ambiguity in entity detection. KoGNER employs a two-step process: (1) Knowledge Distillation, where external knowledge sources are distilled into a lightweight representation for seamless integration with NER models, and (2) Entity-Aware Augmentation, which feeds contextual embeddings enriched with knowledge graph information directly into a graph neural network (GNN), thereby improving the model's ability to understand and represent entity relationships. Experimental results on benchmark datasets demonstrate that KoGNER achieves state-of-the-art performance, outperforming fine-tuned NER models and LLMs by a significant margin. These findings suggest that leveraging knowledge graphs as auxiliary information can significantly improve NER accuracy, making KoGNER a promising direction for future research in knowledge-aware NLP.
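The two-step process described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the function names, dimensions, and the random-projection stand-in for distillation are assumptions, not the authors' implementation.

```python
import numpy as np

def distill_kg_embeddings(kg_embeddings, dim=4, seed=0):
    """Step 1 (Knowledge Distillation): compress high-dimensional KG entity
    vectors into a lightweight representation. A random projection is used
    here purely as an illustrative stand-in for the paper's procedure."""
    rng = np.random.default_rng(seed)
    source_dim = len(next(iter(kg_embeddings.values())))
    projection = rng.standard_normal((source_dim, dim)) / np.sqrt(source_dim)
    return {entity: vec @ projection for entity, vec in kg_embeddings.items()}

def entity_aware_augment(token_embeddings, tokens, distilled, dim=4):
    """Step 2 (Entity-Aware Augmentation): concatenate each token's
    contextual embedding with its distilled KG vector, using zeros for
    tokens without a KG entry; the result would then feed a GNN."""
    augmented = []
    for emb, tok in zip(token_embeddings, tokens):
        kg_vec = distilled.get(tok, np.zeros(dim))
        augmented.append(np.concatenate([emb, kg_vec]))
    return np.stack(augmented)
```

For a three-token sentence with 8-dimensional contextual embeddings and a 16-dimensional KG vector for one entity, the augmented matrix has shape (3, 12): tokens with a KG entry carry a non-zero knowledge slice, the rest are zero-padded.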
Related papers
- Knowledge Graphs and Pre-trained Language Models enhanced Representation Learning for Conversational Recommender Systems [58.561904356651276]
We introduce the Knowledge-Enhanced Entity Representation Learning (KERL) framework, which uses a knowledge graph and a pre-trained language model to improve the semantic understanding of entities for conversational recommender systems.
KERL achieves state-of-the-art results in both recommendation and response generation tasks.
arXiv Detail & Related papers (2023-12-18T06:41:23Z)
- Beyond Factuality: A Comprehensive Evaluation of Large Language Models as Knowledge Generators [78.63553017938911]
Large language models (LLMs) outperform information retrieval techniques for downstream knowledge-intensive tasks.
However, community concerns abound regarding the factuality and potential implications of using this uncensored knowledge.
We introduce CONNER, designed to evaluate generated knowledge from six important perspectives.
arXiv Detail & Related papers (2023-10-11T08:22:37Z)
- Recognizing Unseen Objects via Multimodal Intensive Knowledge Graph Propagation [68.13453771001522]
We propose a multimodal intensive ZSL framework that matches regions of images with corresponding semantic embeddings.
We conduct extensive experiments and evaluate our model on large-scale real-world data.
arXiv Detail & Related papers (2023-06-14T13:07:48Z)
- Prompting ChatGPT in MNER: Enhanced Multimodal Named Entity Recognition with Auxiliary Refined Knowledge [27.152813529536424]
We present PGIM -- a two-stage framework that aims to leverage ChatGPT as an implicit knowledge base.
PGIM generates auxiliary knowledge for more efficient entity prediction.
It outperforms state-of-the-art methods on two classic MNER datasets.
arXiv Detail & Related papers (2023-05-20T15:24:38Z)
- UNTER: A Unified Knowledge Interface for Enhancing Pre-trained Language Models [100.4659557650775]
We propose a UNified knowledge inTERface, UNTER, to provide a unified perspective to exploit both structured knowledge and unstructured knowledge.
With both forms of knowledge injected, UNTER gains continuous improvements on a series of knowledge-driven NLP tasks.
arXiv Detail & Related papers (2023-05-02T17:33:28Z)
- KGNN: Distributed Framework for Graph Neural Knowledge Representation [38.080926752998586]
We develop a novel framework called KGNN to take full advantage of knowledge data for representation learning in the distributed learning system.
KGNN is equipped with a GNN-based encoder and a knowledge-aware decoder, which jointly explore high-order structure and attribute information.
arXiv Detail & Related papers (2022-05-17T12:32:02Z)
- MINER: Improving Out-of-Vocabulary Named Entity Recognition from an Information Theoretic Perspective [57.19660234992812]
NER models have achieved promising performance on standard NER benchmarks.
Recent studies show that previous approaches may over-rely on entity mention information, resulting in poor performance on out-of-vocabulary (OOV) entity recognition.
We propose MINER, a novel NER learning framework, to remedy this issue from an information-theoretic perspective.
arXiv Detail & Related papers (2022-04-09T05:18:20Z)
- Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment [9.701081498310165]
Entity alignment aims at integrating heterogeneous knowledge from different knowledge graphs.
Recent studies employ embedding-based methods, first learning representations of the knowledge graphs and then performing entity alignment.
We propose a Graph Convolutional Network (GCN) model equipped with knowledge distillation for entity alignment.
arXiv Detail & Related papers (2022-01-25T02:47:14Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- KARL-Trans-NER: Knowledge Aware Representation Learning for Named Entity Recognition using Transformers [0.0]
We propose a Knowledge Aware Representation Learning (KARL) network for Named Entity Recognition (NER).
KARL is based on a Transformer that represents large knowledge bases as fact triplets, converts them into a context, and extracts the essential information within to generate contextualized triplet representations for feature augmentation.
Experimental results show that the augmentation done using KARL can considerably boost the performance of our NER system, achieving significantly better results than existing approaches on three publicly available NER datasets: CoNLL 2003, CoNLL++, and OntoNotes v5.
arXiv Detail & Related papers (2021-11-30T14:29:33Z)
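The triplet-to-context step in the KARL summary above can be illustrated with a toy helper. The function name and the rendering format are assumptions for illustration only, not the KARL-Trans-NER implementation.

```python
def triplets_to_context(triplets):
    """Render (head, relation, tail) fact triplets as a plain-text context,
    one sentence per fact, suitable for feature augmentation.
    (Toy sketch; not the authors' code.)"""
    return " ".join(f"{h} {r.replace('_', ' ')} {t}." for h, r, t in triplets)

facts = [("aspirin", "treats", "headache"), ("aspirin", "is_a", "NSAID")]
context = triplets_to_context(facts)
# context == "aspirin treats headache. aspirin is a NSAID."
```

The resulting context string could then be encoded by a Transformer alongside the input sentence to produce knowledge-augmented features.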
This list is automatically generated from the titles and abstracts of the papers on this site.