Neural Transition System for End-to-End Opinion Role Labeling
- URL: http://arxiv.org/abs/2110.02001v1
- Date: Tue, 5 Oct 2021 12:45:59 GMT
- Title: Neural Transition System for End-to-End Opinion Role Labeling
- Authors: Shengqiong Wu and Donghong Ji
- Abstract summary: Unified opinion role labeling (ORL) aims to detect all possible opinion structures of `opinion-holder-target' in one shot, given a text.
We propose a novel solution by revisiting the transition architecture and augmenting it with a pointer network (PointNet).
The framework parses out all opinion structures in linear time, while PointNet removes the restriction on term length.
- Score: 13.444895891262844
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unified opinion role labeling (ORL) aims to detect all possible opinion
structures of `opinion-holder-target' in one shot, given a text. The existing
transition-based unified method, unfortunately, struggles with longer opinion
terms and fails to resolve the term-overlap issue. Current top performance has
been achieved by employing the span-based graph model, which however still
suffers from both high model complexity and insufficient interaction among
opinions and roles. In this work, we investigate a novel solution by revisiting
the transition architecture and augmenting it with a pointer network (PointNet).
The framework parses out all opinion structures in linear-time complexity,
while PointNet removes the restriction on term length.
To achieve the explicit opinion-role interactions, we further propose a unified
dependency-opinion graph (UDOG), co-modeling the syntactic dependency structure
and the partial opinion-role structure. We then devise a relation-centered
graph aggregator (RCGA) to encode the multi-relational UDOG, where the
resulting high-order representations are used to promote the predictions in the
vanilla transition system. Our model achieves new state-of-the-art results on
the MPQA benchmark. Analyses further demonstrate the superiority of our methods
on both efficacy and efficiency.
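As a rough illustration of the two mechanisms described in the abstract, the sketch below shows (1) a pointer-style boundary scorer that, given the start token of a term, scores every token as its end, so terms of arbitrary length are produced in a single pointing step, and (2) a relation-aware message-passing layer over a toy unified dependency-opinion graph mixing syntactic dependency arcs with partially recognized opinion-role arcs. Everything here (the class names PointerBoundary and RelationAwareLayer, the dot-product scoring, the hidden size, and the toy edge list) is an assumption made for illustration only, not the authors' implementation of PointNet, UDOG, or RCGA.

```python
# Minimal, hypothetical PyTorch sketch of the ideas in the abstract; all names,
# dimensions, and scoring functions below are illustrative assumptions, not the
# paper's actual PointNet / UDOG / RCGA implementation.
import torch
import torch.nn as nn


class PointerBoundary(nn.Module):
    """Scores every token as the end boundary of a term that starts at `start`."""

    def __init__(self, hidden: int):
        super().__init__()
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)

    def forward(self, enc: torch.Tensor, start: int) -> torch.Tensor:
        # enc: [seq_len, hidden] contextual token encodings (e.g. from a BiLSTM/BERT encoder)
        q = self.query(enc[start])                         # query: encoding of the start token
        scores = self.key(enc) @ q / enc.size(-1) ** 0.5   # [seq_len] pointer scores
        mask = torch.arange(enc.size(0)) < start           # a term cannot end before it starts
        return scores.masked_fill(mask, float("-inf"))


class RelationAwareLayer(nn.Module):
    """One message-passing step with a separate linear transform per relation type."""

    def __init__(self, hidden: int, num_relations: int):
        super().__init__()
        self.rel_transforms = nn.ModuleList(
            nn.Linear(hidden, hidden) for _ in range(num_relations)
        )
        self.self_transform = nn.Linear(hidden, hidden)

    def forward(self, h: torch.Tensor, edges: list) -> torch.Tensor:
        # h: [num_nodes, hidden]; edges: (src, dst, relation_id) triples of a toy UDOG
        incoming = [[] for _ in range(h.size(0))]
        for src, dst, rel in edges:
            incoming[dst].append(self.rel_transforms[rel](h[src]))
        out = self.self_transform(h)
        rows = [out[i] + sum(msgs, torch.zeros_like(out[i])) for i, msgs in enumerate(incoming)]
        return torch.relu(torch.stack(rows))


# Toy usage on a 6-token sentence.
enc = torch.randn(6, 64)                          # stand-in encoder output
end = PointerBoundary(64)(enc, start=2).argmax().item()
print(f"predicted term span: (2, {end})")

# Toy UDOG: relation 0 = syntactic dependency arc, relation 1 = opinion-role arc
# already produced by earlier transition steps.
edges = [(1, 0, 0), (2, 1, 0), (3, 2, 0), (0, 2, 1), (4, 3, 0), (5, 4, 0)]
print(RelationAwareLayer(hidden=64, num_relations=2)(torch.randn(6, 64), edges).shape)
```

Because each term boundary is resolved by one pointing action inside a single left-to-right transition pass, the total number of actions stays proportional to sentence length, which is the source of the linear-time claim; the relation-aware aggregation then supplies higher-order node representations that inform the transition system's predictions.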
Related papers
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Adaptive Message Passing: A General Framework to Mitigate Oversmoothing, Oversquashing, and Underreaching [23.487431014596556]
Long-range interactions are essential for the correct description of complex systems in many scientific fields.
Most deep graph networks cannot really model long-range dependencies due to intrinsic limitations of (synchronous) message passing.
This work proposes a general framework that learns to mitigate these limitations.
arXiv Detail & Related papers (2023-12-27T12:49:27Z)
- Spatial-Temporal Graph Enhanced DETR Towards Multi-Frame 3D Object Detection [54.041049052843604]
We present STEMD, a novel end-to-end framework that enhances the DETR-like paradigm for multi-frame 3D object detection.
First, to model the inter-object spatial interaction and complex temporal dependencies, we introduce the spatial-temporal graph attention network.
Finally, it poses a challenge for the network to distinguish between the positive query and other highly similar queries that are not the best match.
arXiv Detail & Related papers (2023-07-01T13:53:14Z)
- Structured Sentiment Analysis as Transition-based Dependency Parsing [0.40611352512781856]
Structured sentiment analysis aims to automatically extract people's opinions from a text in natural language.
One of the most accurate methods for performing SSA was recently proposed and consists of approaching it as a dependency parsing task.
We present the first transition-based method to address SSA as dependency parsing.
arXiv Detail & Related papers (2023-05-09T10:03:34Z)
- Semantics-Aware Dynamic Localization and Refinement for Referring Image Segmentation [102.25240608024063]
Referring image segmentation segments an image according to a language expression.
We develop an algorithm that shifts from being localization-centric to segmentation-centric.
Compared to its counterparts, our method is more versatile yet effective.
arXiv Detail & Related papers (2023-03-11T08:42:40Z)
- Understanding and Constructing Latent Modality Structures in Multi-modal Representation Learning [53.68371566336254]
We argue that the key to better performance lies in meaningful latent modality structures instead of perfect modality alignment.
Specifically, we design 1) a deep feature separation loss for intra-modality regularization; 2) a Brownian-bridge loss for inter-modality regularization; and 3) a geometric consistency loss for both intra- and inter-modality regularization.
arXiv Detail & Related papers (2023-03-10T14:38:49Z)
- DepGraph: Towards Any Structural Pruning [68.40343338847664]
We study general structural pruning of arbitrary architecture like CNNs, RNNs, GNNs and Transformers.
We propose a general and fully automatic method, Dependency Graph (DepGraph), to explicitly model the dependency between layers and comprehensively group parameters for pruning.
In this work, we extensively evaluate our method on several architectures and tasks, including ResNe(X)t, DenseNet, MobileNet and Vision transformer for images, GAT for graph, DGCNN for 3D point cloud, alongside LSTM for language, and demonstrate that, even with a
arXiv Detail & Related papers (2023-01-30T14:02:33Z)
- Truveta Mapper: A Zero-shot Ontology Alignment Framework [3.5284865194805106]
A new perspective is suggested for unsupervised Ontology Matching (OM) or Ontology Alignment (OA).
The proposed framework, Truveta Mapper (TM), leverages a multi-task sequence-to-sequence transformer model to perform alignment across multiple ontologies in a zero-shot, unified and end-to-end manner.
TM is pre-trained and fine-tuned only on publicly available corpus text and inner-ontologies data.
arXiv Detail & Related papers (2023-01-24T00:32:56Z)
- End-to-end Semantic Role Labeling with Neural Transition-based Model [25.921541005563856]
End-to-end semantic role labeling (SRL) has received increasing interest.
Recent work is mostly focused on graph-based neural models.
We present the first work of transition-based neural models for end-to-end SRL.
arXiv Detail & Related papers (2021-01-02T07:35:54Z)
- Transition-based Semantic Dependency Parsing with Pointer Networks [0.34376560669160383]
We propose a transition system that can straightforwardly produce labelled directed acyclic graphs and perform semantic dependency parsing.
We enhance our approach with deep contextualized word embeddings extracted from BERT.
The resulting system not only outperforms all existing transition-based models, but also matches the best fully-supervised accuracy to date on the SemEval 2015 Task 18 English datasets.
arXiv Detail & Related papers (2020-05-27T13:18:27Z)
- A Dependency Syntactic Knowledge Augmented Interactive Architecture for End-to-End Aspect-based Sentiment Analysis [73.74885246830611]
We propose a novel dependency syntactic knowledge augmented interactive architecture with multi-task learning for end-to-end ABSA.
This model is capable of fully exploiting the syntactic knowledge (dependency relations and types) by leveraging a well-designed Dependency Relation Embedded Graph Convolutional Network (DreGcn).
Extensive experimental results on three benchmark datasets demonstrate the effectiveness of our approach.
arXiv Detail & Related papers (2020-04-04T14:59:32Z)