Semantic Role Labeling as Syntactic Dependency Parsing
- URL: http://arxiv.org/abs/2010.11170v1
- Date: Wed, 21 Oct 2020 17:46:11 GMT
- Title: Semantic Role Labeling as Syntactic Dependency Parsing
- Authors: Tianze Shi, Igor Malioutov, Ozan İrsoy
- Abstract summary: Three common syntactic patterns account for over 98% of the PropBank-style semantic role labeling annotations.
We present a conversion scheme that packs SRL annotations into dependency tree representations through joint labels.
- Score: 19.919191146167584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We reduce the task of (span-based) PropBank-style semantic role labeling
(SRL) to syntactic dependency parsing. Our approach is motivated by our
empirical analysis that shows three common syntactic patterns account for over
98% of the SRL annotations for both English and Chinese data. Based on this
observation, we present a conversion scheme that packs SRL annotations into
dependency tree representations through joint labels that permit highly
accurate recovery back to the original format. This representation allows us to
train statistical dependency parsers to tackle SRL and achieve competitive
performance with the current state of the art. Our findings show the promise of
syntactic dependency trees in encoding semantic role relations within their
syntactic domain of locality, and point to potential further integration of
syntactic methods into semantic role labeling in the future.
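The packing-and-recovery idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual conversion scheme: it assumes the simplest of the common patterns, where an argument span coincides with a syntactic subtree, and all function names and the `&`-joined label format are hypothetical.

```python
# Illustrative sketch (NOT the paper's exact scheme): pack span-based SRL
# annotations into dependency arcs via joint labels, then recover the spans
# as subtree yields. Assumes each argument span forms a syntactic subtree.

def pack(heads, deprels, arguments):
    """heads: 1-based head index per token (0 = root).
    deprels: syntactic dependency labels, one per token.
    arguments: list of (start, end, role) spans (0-based, inclusive).
    Returns joint labels where each argument's span head carries the SRL role."""
    joint = list(deprels)
    for start, end, role in arguments:
        # The span's syntactic head: the first token whose head lies
        # outside the span (unique when the span is a subtree).
        head = next(i for i in range(start, end + 1)
                    if not (start + 1 <= heads[i] <= end + 1))
        joint[head] = f"{deprels[head]}&{role}"
    return joint

def subtree_yield(heads, root):
    """Span (start, end) of all tokens dominated by `root` (0-based)."""
    toks = {root}
    changed = True
    while changed:
        changed = False
        for i, h in enumerate(heads):
            if h - 1 in toks and i not in toks:
                toks.add(i)
                changed = True
    return min(toks), max(toks)

def unpack(heads, joint):
    """Recover (start, end, role) argument spans from joint labels."""
    args = []
    for i, lab in enumerate(joint):
        if "&" in lab:
            _, role = lab.split("&", 1)
            start, end = subtree_yield(heads, i)
            args.append((start, end, role))
    return args
```

For "The cat chased the dog" with `heads = [2, 3, 0, 5, 3]` and arguments A0 = "The cat", A1 = "the dog", packing yields labels like `nsubj&A0` and `obj&A1` on the span heads, and unpacking restores the original spans exactly; the paper's joint labels additionally handle the non-subtree patterns that make up the remaining cases.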
Related papers
- Entity-Aware Self-Attention and Contextualized GCN for Enhanced Relation Extraction in Long Sentences [5.453850739960517]
We propose a novel model, Entity-aware Self-attention Contextualized GCN (ESC-GCN), which efficiently incorporates syntactic structure of input sentences and semantic context of sequences.
Our model achieves encouraging performance as compared to existing dependency-based and sequence-based models.
arXiv Detail & Related papers (2024-09-15T10:50:51Z) - A Hybrid Approach To Aspect Based Sentiment Analysis Using Transfer Learning [3.30307212568497]
We propose a hybrid approach for Aspect Based Sentiment Analysis using transfer learning.
The approach focuses on generating weakly-supervised annotations by exploiting the strengths of both large language models (LLM) and traditional syntactic dependencies.
arXiv Detail & Related papers (2024-03-25T23:02:33Z) - mCL-NER: Cross-Lingual Named Entity Recognition via Multi-view Contrastive Learning [54.523172171533645]
Cross-lingual named entity recognition (CrossNER) faces challenges stemming from uneven performance due to the scarcity of multilingual corpora.
We propose Multi-view Contrastive Learning for Cross-lingual Named Entity Recognition (mCL-NER)
Our experiments on the XTREME benchmark, spanning 40 languages, demonstrate the superiority of mCL-NER over prior data-driven and model-based approaches.
arXiv Detail & Related papers (2023-08-17T16:02:29Z) - Semantic Role Labeling Meets Definition Modeling: Using Natural Language to Describe Predicate-Argument Structures [104.32063681736349]
We present an approach to describe predicate-argument structures using natural language definitions instead of discrete labels.
Our experiments and analyses on PropBank-style and FrameNet-style, dependency-based and span-based SRL also demonstrate that a flexible model with an interpretable output does not necessarily come at the expense of performance.
arXiv Detail & Related papers (2022-12-02T11:19:16Z) - Transition-based Semantic Role Labeling with Pointer Networks [0.40611352512781856]
We propose the first transition-based SRL approach that is capable of completely processing an input sentence in a single left-to-right pass.
Thanks to our implementation based on Pointer Networks, full SRL can be accurately and efficiently done in $O(n^2)$, achieving the best performance to date on the majority of languages from the CoNLL-2009 shared task.
arXiv Detail & Related papers (2022-05-20T08:38:44Z) - Combining Improvements for Exploiting Dependency Trees in Neural Semantic Parsing [1.0437764544103274]
In this paper, we examine three methods to incorporate such dependency information in a Transformer based semantic parsing system.
We first replace standard self-attention heads in the encoder with parent-scaled self-attention (PASCAL) heads.
We then insert a constituent attention (CA) module into the encoder, which adds an extra constraint on attention heads so that they better capture the inherent dependency structure of input sentences.
arXiv Detail & Related papers (2021-12-25T03:41:42Z) - Image Synthesis via Semantic Composition [74.68191130898805]
We present a novel approach to synthesize realistic images based on their semantic layouts.
It hypothesizes that objects with similar appearance share similar representations.
Our method establishes dependencies between regions according to their appearance correlation, yielding both spatially variant and associated representations.
arXiv Detail & Related papers (2021-09-15T02:26:07Z) - Syntax Role for Neural Semantic Role Labeling [77.5166510071142]
Semantic role labeling (SRL) is dedicated to recognizing the semantic predicate-argument structure of a sentence.
Previous studies in terms of traditional models have shown syntactic information can make remarkable contributions to SRL performance.
Recent neural SRL studies show that syntax information becomes much less important for neural semantic role labeling.
arXiv Detail & Related papers (2020-09-12T07:01:12Z) - Cross-lingual Semantic Role Labeling with Model Transfer [49.85316125365497]
Cross-lingual semantic role labeling can be achieved by model transfer with the help of universal features.
We propose an end-to-end SRL model that incorporates a variety of universal features and transfer methods.
arXiv Detail & Related papers (2020-08-24T09:37:45Z) - Unsupervised Transfer of Semantic Role Models from Verbal to Nominal Domain [65.04669567781634]
We investigate a transfer scenario where we assume role-annotated data for the source verbal domain but only unlabeled data for the target nominal domain.
Our key assumption, enabling the transfer between the two domains, is that selectional preferences of a role do not strongly depend on whether the relation is triggered by a verb or a noun.
The method substantially outperforms baselines, such as unsupervised and direct-transfer methods, on the English CoNLL-2009 dataset.
arXiv Detail & Related papers (2020-05-01T09:20:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.