Trigger-GNN: A Trigger-Based Graph Neural Network for Nested Named
Entity Recognition
- URL: http://arxiv.org/abs/2204.05518v1
- Date: Tue, 12 Apr 2022 04:15:39 GMT
- Title: Trigger-GNN: A Trigger-Based Graph Neural Network for Nested Named
Entity Recognition
- Authors: Yuan Sui, Fanyang Bu, Yingting Hu, Wei Yan, and Liang Zhang
- Abstract summary: We propose a trigger-based graph neural network (Trigger-GNN) to address nested NER.
It obtains the complementary annotation embeddings through entity trigger encoding and semantic matching.
It helps the model to learn and generalize more efficiently and cost-effectively.
- Score: 5.9049664765234295
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nested named entity recognition (NER) aims to identify entity
boundaries and recognize the categories of named entities in a complex
hierarchical sentence. Prior work has used character-level, word-level, or
lexicon-based models; however, such research ignores the role of
complementary annotations. In this paper, we propose a trigger-based graph
neural network (Trigger-GNN) to address nested NER. It obtains complementary
annotation embeddings through entity trigger encoding and semantic matching,
and tackles nested entities with an efficient graph message-passing
architecture, the aggregation-update mode. We posit that using entity
triggers as external annotations adds complementary supervision signals over
whole sentences, helping the model learn and generalize more efficiently and
cost-effectively. Experiments show that Trigger-GNN consistently outperforms
the baselines on four public NER datasets and effectively alleviates the
difficulty of nested NER.
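
To illustrate the aggregation-update message passing named in the abstract, here is a minimal sketch, not the authors' released implementation: it assumes a GRU-style update, a simple word-chain adjacency graph, and an already-matched trigger embedding added to the word-node states as the complementary annotation signal. The class name AggregationUpdateLayer and all dimensions are placeholders.

import torch
import torch.nn as nn


class AggregationUpdateLayer(nn.Module):
    """One round of aggregation-update message passing (illustrative only).

    Each node aggregates messages from its neighbours, given by a dense
    adjacency matrix, then updates its hidden state with a GRU cell, a
    common instantiation of the aggregation-update pattern; this is a
    sketch, not the paper's exact layer.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.message = nn.Linear(hidden_dim, hidden_dim)
        self.update = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, hidden_dim) node states
        # adj: (num_nodes, num_nodes)  1.0 where an edge exists
        msgs = adj @ self.message(h)                       # aggregation step
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return self.update(msgs / deg, h)                  # update step


# Hypothetical usage: add a matched trigger embedding to the word-node states
# (the complementary annotation signal), then run a few message-passing rounds.
num_words, hidden = 12, 64
word_h = torch.randn(num_words, hidden)    # contextual word embeddings
trigger_h = torch.randn(hidden)            # embedding of the matched trigger
node_h = word_h + trigger_h                # inject the annotation signal

adj = torch.zeros(num_words, num_words)    # placeholder word-chain graph
idx = torch.arange(num_words - 1)
adj[idx, idx + 1] = 1.0
adj[idx + 1, idx] = 1.0

layer = AggregationUpdateLayer(hidden)
for _ in range(3):                         # a few rounds of aggregation-update
    node_h = layer(node_h, adj)
print(node_h.shape)                        # torch.Size([12, 64])

In a full model, the graph would also carry additional node types, and the updated node states would typically feed a token-level decoder such as a CRF.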
Related papers
- Composited-Nested-Learning with Data Augmentation for Nested Named Entity Recognition [5.188242370198818]
Nested Named Entity Recognition (NNER) focuses on addressing overlapped entity recognition.
Data augmentation is an effective approach to address the shortage of annotated corpora.
We propose Composited-Nested-Label Classification (CNLC), in which constituents combine nested words and nested labels to model nested entities.
arXiv Detail & Related papers (2024-06-18T16:46:18Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - Gaussian Prior Reinforcement Learning for Nested Named Entity
Recognition [52.46740830977898]
We propose a novel seq2seq model named GPRL, which formulates the nested NER task as an entity triplet sequence generation process.
Experiments on three nested NER datasets demonstrate that GPRL outperforms previous nested NER models.
arXiv Detail & Related papers (2023-05-12T05:55:34Z) - Type-supervised sequence labeling based on the heterogeneous star graph
for named entity recognition [6.25916397918329]
This paper investigates representation learning on a heterogeneous star graph containing text nodes and type nodes.
The model performs type-supervised sequence labeling after updating the nodes in the graph.
Experiments on public NER datasets reveal the effectiveness of our model in extracting both flat and nested entities.
arXiv Detail & Related papers (2022-10-19T01:40:06Z) - Optimizing Bi-Encoder for Named Entity Recognition via Contrastive
Learning [80.36076044023581]
We present an efficient bi-encoder framework for named entity recognition (NER).
We frame NER as a metric learning problem that maximizes the similarity between the vector representations of an entity mention and its type.
A major challenge to this bi-encoder formulation for NER lies in separating non-entity spans from entity mentions.
arXiv Detail & Related papers (2022-08-30T23:19:04Z) - An Embarrassingly Easy but Strong Baseline for Nested Named Entity
Recognition [55.080101447586635]
We propose using a Convolutional Neural Network (CNN) to model spatial relations in the score matrix.
Our model surpasses several recently proposed methods with the same pre-trained encoders.
arXiv Detail & Related papers (2022-08-09T04:33:46Z) - Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates improved performance in modeling graph metrics that capture complex information.
arXiv Detail & Related papers (2022-04-11T18:13:19Z) - AutoTriggER: Label-Efficient and Robust Named Entity Recognition with
Auxiliary Trigger Extraction [54.20039200180071]
We present a novel framework to improve NER performance by automatically generating and leveraging "entity triggers."
Our framework leverages post-hoc explanation to generate rationales and strengthens a model's prior knowledge using an embedding technique.
AutoTriggER shows strong label-efficiency, is capable of generalizing to unseen entities, and outperforms the RoBERTa-CRF baseline by nearly 0.5 F1 points on average.
arXiv Detail & Related papers (2021-09-10T08:11:56Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Named Entity Recognition without Labelled Data: A Weak Supervision
Approach [23.05371427663683]
This paper presents a simple but powerful approach to learn NER models in the absence of labelled data through weak supervision.
The approach relies on a broad spectrum of labelling functions to automatically annotate texts from the target domain.
A sequence labelling model can finally be trained on the basis of this unified annotation.
arXiv Detail & Related papers (2020-04-30T12:29:55Z) - TriggerNER: Learning with Entity Triggers as Explanations for Named
Entity Recognition [42.984048204280676]
We introduce "entity triggers," an effective proxy of human explanations for facilitating label-efficient learning of NER models.
We crowd-sourced 14k entity triggers for two well-studied NER datasets.
Our proposed model, the Trigger Matching Network, jointly learns trigger representations and a soft-matching module with self-attention.
arXiv Detail & Related papers (2020-04-16T07:27:43Z)
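
Both the TriggerNER entry above and the semantic-matching step of Trigger-GNN rely on softly matching a sentence against entity triggers. The following is a minimal, hypothetical sketch of such a soft-matching module, assuming additive attention pooling and cosine scoring; it is not either paper's exact architecture, and the name SoftTriggerMatcher and all dimensions are placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftTriggerMatcher(nn.Module):
    """Illustrative soft trigger matching (a sketch, not either paper's code).

    Sentence tokens and trigger tokens are pooled into single vectors with
    additive self-attention; the match is scored with cosine similarity, so
    an unlabeled sentence can retrieve its closest trigger at inference time.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(dim, 1)  # additive attention scorer

    def pool(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (seq_len, dim) -> attention-weighted sum: (dim,)
        weights = torch.softmax(self.attn(tokens), dim=0)
        return (weights * tokens).sum(dim=0)

    def forward(self, sent_tokens: torch.Tensor,
                trig_tokens: torch.Tensor) -> torch.Tensor:
        return F.cosine_similarity(self.pool(sent_tokens),
                                   self.pool(trig_tokens), dim=0)


# Hypothetical usage: score how well a 3-token trigger matches a 20-token
# sentence; both tensors are stand-ins for encoder outputs (e.g., BiLSTM/BERT).
matcher = SoftTriggerMatcher(dim=64)
score = matcher(torch.randn(20, 64), torch.randn(3, 64))
print(float(score))  # a similarity in [-1, 1]

At inference time, the highest-scoring trigger's embedding can be attached to the sentence as the complementary annotation signal described in the abstracts above.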
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.