Effective Token Graph Modeling using a Novel Labeling Strategy for
Structured Sentiment Analysis
- URL: http://arxiv.org/abs/2203.10796v1
- Date: Mon, 21 Mar 2022 08:23:03 GMT
- Title: Effective Token Graph Modeling using a Novel Labeling Strategy for
Structured Sentiment Analysis
- Authors: Wenxuan Shi, Fei Li, Jingye Li, Hao Fei, Donghong Ji
- Abstract summary: The state-of-the-art model for structured sentiment analysis casts the task as a dependency parsing problem.
Label proportions for span prediction and span relation prediction are imbalanced.
Two nodes in a dependency graph cannot have multiple arcs, so some overlapped sentiment tuples cannot be recognized.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The state-of-the-art model for structured sentiment analysis casts the task
as a dependency parsing problem, which has some limitations: (1) The label
proportions for span prediction and span relation prediction are imbalanced.
(2) The span lengths of sentiment tuple components may be very large in this
task, which will further exacerbate the imbalance problem. (3) Two nodes in a
dependency graph cannot have multiple arcs, therefore some overlapped sentiment
tuples cannot be recognized. In this work, we propose niche-targeting solutions
for these issues. First, we introduce a novel labeling strategy, which contains
two sets of token pair labels, namely essential label set and whole label set.
The essential label set consists of the basic labels for this task, which are
relatively balanced and applied in the prediction layer. The whole label set
includes rich labels to help our model capture various token relations, which
are applied in the hidden layer to softly influence our model. Moreover, we
propose an effective model that collaborates well with our labeling strategy,
which is equipped with the graph attention networks to iteratively refine token
representations, and the adaptive multi-label classifier to dynamically predict
multiple relations between token pairs. We perform extensive experiments on five
benchmark datasets in four languages. Experimental results show that our model
outperforms previous SOTA models by a large margin.
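The two model components named in the abstract can be sketched as follows. This is a hypothetical minimal illustration, not the authors' code: all shapes, parameter names, and the fixed 0.5 threshold (the paper's classifier adapts its threshold) are assumptions. It shows one graph-attention step refining token representations, then a per-label bilinear scorer over token pairs that allows multiple labels per pair.

```python
import numpy as np

# Hypothetical sketch of the abstract's pipeline; names and shapes are assumed.
rng = np.random.default_rng(0)
n_tokens, d, n_labels = 5, 8, 4

H = rng.normal(size=(n_tokens, d))   # initial token representations
A = np.ones((n_tokens, n_tokens))    # token graph (fully connected here)
W = rng.normal(size=(d, d))          # attention projection weights
a = rng.normal(size=(2 * d,))        # attention scoring vector

def gat_layer(H, A, W, a):
    """One single-head GAT-style update: score each edge, softmax over
    neighbours, mix the projected neighbour representations."""
    Z = H @ W
    n = Z.shape[0]
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else 0.2 * s        # LeakyReLU
    e = np.where(A > 0, e, -np.inf)                  # mask non-edges
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)        # attention weights
    return np.tanh(alpha @ Z)                        # refined representations

H_refined = gat_layer(H, A, W, a)    # iterated in the paper; one step here

# Multi-label token-pair classifier: one bilinear map per label, sigmoid
# per label, so a pair can carry several relation labels at once.
U = rng.normal(size=(n_labels, d, d)) * 0.1
scores = np.einsum('id,ldk,jk->ijl', H_refined, U, H_refined)
probs = 1.0 / (1.0 + np.exp(-scores))
preds = probs > 0.5                  # fixed threshold; adaptive in the paper
```

Because each label gets its own sigmoid rather than competing in one softmax, two tokens can be linked by multiple relations, which is how the dependency-graph single-arc limitation from point (3) is avoided.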
Related papers
- Label Dependencies-aware Set Prediction Networks for Multi-label Text Classification [0.0]
We leverage Graph Convolutional Networks and construct an adjacency matrix based on the statistical relations between labels.
We enhance recall ability by applying the Bhattacharyya distance to the output distributions of the set prediction networks.
arXiv Detail & Related papers (2023-04-14T09:31:17Z)
- Bridging the Gap between Model Explanations in Partially Annotated Multi-label Classification [85.76130799062379]
We study how false negative labels affect the model's explanation.
We propose to boost the attribution scores of the model trained with partial labels to make its explanation resemble that of the model trained with full labels.
arXiv Detail & Related papers (2023-04-04T14:00:59Z)
- Group is better than individual: Exploiting Label Topologies and Label Relations for Joint Multiple Intent Detection and Slot Filling [39.76268402567324]
We construct a Heterogeneous Label Graph (HLG) containing two kinds of topologies.
Label correlations are leveraged to enhance semantic-label interactions.
We also propose the label-aware inter-dependent decoding mechanism to further exploit the label correlations for decoding.
arXiv Detail & Related papers (2022-10-19T08:21:43Z)
- Graph Attention Transformer Network for Multi-Label Image Classification [50.0297353509294]
We propose a general framework for multi-label image classification that can effectively mine complex inter-label relationships.
Our proposed methods can achieve state-of-the-art performance on three datasets.
arXiv Detail & Related papers (2022-03-08T12:39:05Z)
- Why Propagate Alone? Parallel Use of Labels and Features on Graphs [42.01561812621306]
Graph neural networks (GNNs) and label propagation represent two interrelated modeling strategies designed to exploit graph structure in tasks such as node property prediction.
We show that a label trick can be reduced to an interpretable, deterministic training objective composed of two factors.
arXiv Detail & Related papers (2021-10-14T07:34:11Z)
- Pack Together: Entity and Relation Extraction with Levitated Marker [61.232174424421025]
We propose a novel span representation approach, named Packed Levitated Markers, to consider the dependencies between the spans (pairs) by strategically packing the markers in the encoder.
Our experiments show that our model with packed levitated markers outperforms the sequence labeling model by 0.4%-1.9% F1 on three flat NER tasks, and beats the token concat model on six NER benchmarks.
arXiv Detail & Related papers (2021-09-13T15:38:13Z)
- GNN-XML: Graph Neural Networks for Extreme Multi-label Text Classification [23.79498916023468]
Extreme multi-label text classification (XMTC) aims to tag a text instance with the most relevant subset of labels from an extremely large label set.
GNN-XML is a scalable graph neural network framework tailored for XMTC problems.
arXiv Detail & Related papers (2020-12-10T18:18:34Z)
- A Study on the Autoregressive and non-Autoregressive Multi-label Learning [77.11075863067131]
We propose a self-attention based variational encoder model to extract the label-label and label-feature dependencies jointly.
Our model can therefore be used to predict all labels in parallel while still including both label-label and label-feature dependencies.
arXiv Detail & Related papers (2020-12-03T05:41:44Z)
- Few-shot Slot Tagging with Collapsed Dependency Transfer and Label-enhanced Task-adaptive Projection Network [61.94394163309688]
We propose a Label-enhanced Task-Adaptive Projection Network (L-TapNet) based on the state-of-the-art few-shot classification model -- TapNet.
Experimental results show that our model significantly outperforms the strongest few-shot learning baseline by 14.64 F1 points in the one-shot setting.
arXiv Detail & Related papers (2020-06-10T07:50:44Z)
- Multi-Label Text Classification using Attention-based Graph Neural Network [0.0]
A graph attention network-based model is proposed to capture the attentive dependency structure among the labels.
The proposed model achieves similar or better performance compared to the previous state-of-the-art models.
arXiv Detail & Related papers (2020-03-22T17:12:43Z)
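One concrete quantity mentioned in the list above is the Bhattacharyya distance, applied to the output distributions of set prediction networks to improve recall. The standard formula is D_B(p, q) = -ln(Σ_i √(p_i q_i)); how exactly that paper plugs it into training is not given here, so the snippet below is only an illustration of the distance itself.

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """D_B(p, q) = -ln(sum_i sqrt(p_i * q_i)).
    Zero for identical distributions; grows as p and q diverge."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    bc = np.sum(np.sqrt(p * q))   # Bhattacharyya coefficient, in (0, 1]
    return -np.log(bc)

d_same = bhattacharyya_distance([0.5, 0.5], [0.5, 0.5])  # identical -> 0
d_diff = bhattacharyya_distance([0.9, 0.1], [0.5, 0.5])  # diverging -> positive
```

Maximizing this distance between predicted distributions pushes them apart, which is the intuition behind using it to make a set of predictions more diverse.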
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.