Improving Named Entity Recognition with Attentive Ensemble of Syntactic
Information
- URL: http://arxiv.org/abs/2010.15466v1
- Date: Thu, 29 Oct 2020 10:25:17 GMT
- Title: Improving Named Entity Recognition with Attentive Ensemble of Syntactic
Information
- Authors: Yuyang Nie, Yuanhe Tian, Yan Song, Xiang Ao, and Xiang Wan
- Abstract summary: Named entity recognition (NER) is highly sensitive to sentential syntactic and semantic properties.
In this paper, we improve NER by leveraging different types of syntactic information through attentive ensemble.
Experimental results on six English and Chinese benchmark datasets suggest the effectiveness of the proposed model.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Named entity recognition (NER) is highly sensitive to sentential
syntactic and semantic properties, since entities may be extracted according to
how they are used and placed in the running text. To model such properties, one
could rely on existing resources to provide helpful knowledge to the NER task;
some existing studies have proved the effectiveness of doing so, yet they are
limited in appropriately leveraging that knowledge, e.g., in distinguishing
which parts of it are important for a particular context. In this paper, we
improve NER by leveraging different types of syntactic information through an
attentive ensemble, which operates via the proposed key-value memory networks,
syntax attention, and gate mechanism for encoding, weighting, and aggregating
such syntactic information, respectively. Experimental results on six English
and Chinese benchmark datasets suggest the effectiveness of the proposed model
and show that it outperforms previous studies on all experimental datasets.
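The pipeline the abstract describes, encoding each syntax type with a key-value memory, weighting the types with syntax attention, and merging the result into the token representation with a gate, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: all learned weight matrices are omitted, the gate is simplified to an element-wise sigmoid, and the names `kv_memory` and `attentive_ensemble` are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kv_memory(query, keys, values):
    """Key-value memory: attend over syntactic keys (e.g. context words),
    then return the attention-weighted sum of syntax-label values."""
    weights = softmax(query @ keys.T)   # (n_slots,)
    return weights @ values             # (d,)

def attentive_ensemble(h, syntax_memories):
    """h: token hidden state, shape (d,).
    syntax_memories: one (keys, values) pair per syntax type
    (e.g. POS tags, dependency relations, constituents)."""
    # 1. Encoding: one key-value memory per syntax type.
    o = np.stack([kv_memory(h, k, v) for k, v in syntax_memories])  # (n_types, d)
    # 2. Syntax attention: weight the syntax types against the token state.
    a = softmax(o @ h)                  # (n_types,)
    s = a @ o                           # (d,) aggregated syntactic vector
    # 3. Gate: element-wise mix of contextual and syntactic information
    #    (illustrative; the paper's gate would use learned parameters).
    g = 1.0 / (1.0 + np.exp(-(h + s)))
    return g * h + (1.0 - g) * s

rng = np.random.default_rng(0)
d, n_slots = 8, 5
mems = [(rng.normal(size=(n_slots, d)), rng.normal(size=(n_slots, d)))
        for _ in range(3)]              # three syntax types
h = rng.normal(size=d)
out = attentive_ensemble(h, mems)       # enhanced token representation, shape (8,)
```

In practice the gated output would feed a tagging layer (e.g. a CRF) rather than being used directly.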
Related papers
- Syntax-Informed Interactive Model for Comprehensive Aspect-Based
Sentiment Analysis [0.0]
We introduce an innovative model: Syntactic Dependency Enhanced Multi-Task Interaction Architecture (SDEMTIA) for comprehensive ABSA.
Our approach innovatively exploits syntactic knowledge (dependency relations and types) using a specialized Syntactic Dependency Embedded Interactive Network (SDEIN).
We also incorporate a novel and efficient message-passing mechanism within a multi-task learning framework to bolster learning efficacy.
arXiv Detail & Related papers (2023-11-28T16:03:22Z) - An Empirical Investigation of Commonsense Self-Supervision with
Knowledge Graphs [67.23285413610243]
Self-supervision based on the information extracted from large knowledge graphs has been shown to improve the generalization of language models.
We study the effect of knowledge sampling strategies and sizes that can be used to generate synthetic data for adapting language models.
arXiv Detail & Related papers (2022-05-21T19:49:04Z) - Nested Named Entity Recognition as Holistic Structure Parsing [92.8397338250383]
This work models the full nested NEs in a sentence as a holistic structure, then we propose a holistic structure parsing algorithm to disclose the entire NEs once for all.
Experiments show that our model yields promising results on widely used benchmarks, approaching or even achieving the state of the art.
arXiv Detail & Related papers (2022-04-17T12:48:20Z) - MINER: Improving Out-of-Vocabulary Named Entity Recognition from an
Information Theoretic Perspective [57.19660234992812]
NER models have achieved promising performance on standard NER benchmarks.
Recent studies show that previous approaches may over-rely on entity mention information, resulting in poor performance on out-of-vocabulary (OOV) entity recognition.
We propose MINER, a novel NER learning framework, to remedy this issue from an information-theoretic perspective.
arXiv Detail & Related papers (2022-04-09T05:18:20Z) - KARL-Trans-NER: Knowledge Aware Representation Learning for Named Entity
Recognition using Transformers [0.0]
We propose a Knowledge Aware Representation Learning (KARL) Network for Named Entity Recognition (NER).
KARL is based on a Transformer that utilizes large knowledge bases represented as fact triplets, converts them to a context, and extracts the essential information residing inside to generate contextualized triplet representations for feature augmentation.
Experimental results show that the augmentation done using KARL can considerably boost the performance of our NER system and achieve significantly better results than existing approaches in the literature on three publicly available NER datasets, namely CoNLL 2003, CoNLL++, and OntoNotes v5.
arXiv Detail & Related papers (2021-11-30T14:29:33Z) - Empirical Study of Named Entity Recognition Performance Using
Distribution-aware Word Embedding [15.955385058787348]
We develop a distribution-aware word embedding and implement three different methods to make use of the distribution information in a NER framework.
NER performance improves when word specificity is incorporated into existing NER methods.
arXiv Detail & Related papers (2021-09-03T17:28:04Z) - Probing Linguistic Features of Sentence-Level Representations in Neural
Relation Extraction [80.38130122127882]
We introduce 14 probing tasks targeting linguistic properties relevant to neural relation extraction (RE).
We use them to study representations learned by more than 40 different encoder-architecture and linguistic-feature combinations trained on two datasets.
We find that the bias induced by the architecture and the inclusion of linguistic features are clearly expressed in the probing task performance.
arXiv Detail & Related papers (2020-04-17T09:17:40Z) - A Dependency Syntactic Knowledge Augmented Interactive Architecture for
End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.