The First Shared Task on Discourse Representation Structure Parsing
- URL: http://arxiv.org/abs/2005.13399v1
- Date: Wed, 27 May 2020 14:52:21 GMT
- Title: The First Shared Task on Discourse Representation Structure Parsing
- Authors: Lasha Abzianidze, Rik van Noord, Hessel Haagsma and Johan Bos
- Abstract summary: The paper presents the IWCS 2019 shared task on semantic parsing.
The goal is to produce Discourse Representation Structures (DRSs) for English sentences.
- Score: 3.8348270467895915
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The paper presents the IWCS 2019 shared task on semantic parsing where the
goal is to produce Discourse Representation Structures (DRSs) for English
sentences. DRSs originate from Discourse Representation Theory and represent
scoped meaning representations that capture the semantics of negation, modals,
quantification, and presupposition triggers. Additionally, concepts and
event-participants in DRSs are described with WordNet synsets and the thematic
roles from VerbNet. To measure similarity between two DRSs, they are
represented in a clausal form, i.e. as a set of tuples. Participant systems
were expected to produce DRSs in this clausal form. Given the rich lexical
information, explicit scope marking, the high number of variables shared among
clauses, and the highly constrained format of valid DRSs, DRS parsing is a
challenging NLP task. The results of the shared task showed improvements over
the existing state-of-the-art parser.
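The clausal form described above can be sketched in a few lines: a DRS becomes a set of tuples, and two DRSs are compared by overlap. The clauses below are illustrative (box, variable, and synset names are made up for the example), and the shared task's actual evaluation also searches for an optimal mapping between the variables of the two DRSs; this sketch assumes the variables are already aligned.

```python
def clause_f1(gold, system):
    """F1 over exactly matching clauses (tuples), assuming aligned variables."""
    gold, system = set(gold), set(system)
    matched = len(gold & system)
    precision = matched / len(system) if system else 0.0
    recall = matched / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical clausal-form DRSs for "A man walks":
gold = {
    ("b1", "REF", "x1"),
    ("b1", "man", "n.01", "x1"),          # concept as WordNet synset
    ("b1", "REF", "e1"),
    ("b1", "walk", "v.01", "e1"),
    ("b1", "Agent", "e1", "x1"),          # VerbNet thematic role
}
system = {
    ("b1", "REF", "x1"),
    ("b1", "man", "n.01", "x1"),
    ("b1", "REF", "e1"),
    ("b1", "walk", "v.02", "e1"),         # wrong WordNet sense
    ("b1", "Agent", "e1", "x1"),
}
print(clause_f1(gold, system))  # 4 of 5 clauses match on each side -> 0.8
```

Treating clauses as set elements is what makes the shared variables matter: a single wrong variable or sense choice invalidates every clause that mentions it.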
Related papers
- GDA: Generative Data Augmentation Techniques for Relation Extraction Tasks [81.51314139202152]
We propose a dedicated augmentation technique for relational texts, named GDA, which uses two complementary modules to preserve both semantic consistency and syntax structures.
Experimental results on three datasets in a low-resource setting showed that GDA brings 2.0% F1 improvements compared with no augmentation technique.
arXiv Detail & Related papers (2023-05-26T06:21:01Z)
- ParaAMR: A Large-Scale Syntactically Diverse Paraphrase Dataset by AMR Back-Translation [59.91139600152296]
ParaAMR is a large-scale syntactically diverse paraphrase dataset created by abstract meaning representation back-translation.
We show that ParaAMR can be used to improve on three NLP tasks: learning sentence embeddings, syntactically controlled paraphrase generation, and data augmentation for few-shot learning.
arXiv Detail & Related papers (2023-05-26T02:27:33Z)
- Dialogue Meaning Representation for Task-Oriented Dialogue Systems [51.91615150842267]
We propose Dialogue Meaning Representation (DMR), a flexible and easily extendable representation for task-oriented dialogue.
Our representation contains a set of nodes and edges with inheritance hierarchy to represent rich semantics for compositional semantics and task-specific concepts.
We propose two evaluation tasks to evaluate different machine learning based dialogue models, and further propose a novel coreference resolution model GNNCoref for the graph-based coreference resolution task.
arXiv Detail & Related papers (2022-04-23T04:17:55Z)
- Cross-linguistically Consistent Semantic and Syntactic Annotation of Child-directed Speech [27.657676278734534]
This paper proposes a methodology for constructing corpora of child-directed speech paired with sentential logical forms.
The approach enforces a cross-linguistically consistent representation, building on recent advances in dependency representation and semantic parsing.
arXiv Detail & Related papers (2021-09-22T18:17:06Z)
- Counterfactual Interventions Reveal the Causal Effect of Relative Clause Representations on Agreement Prediction [61.4913233397155]
We show that BERT uses information about RC spans during agreement prediction, following a linguistically plausible strategy.
We also find that counterfactual representations generated for a specific RC subtype influence number prediction in sentences with other RC subtypes, suggesting that information about RC boundaries is encoded abstractly in BERT's representations.
arXiv Detail & Related papers (2021-05-14T17:11:55Z)
- DRS at MRP 2020: Dressing up Discourse Representation Structures as Graphs [4.21235641628176]
The paper describes the procedure of dressing up DRSs as directed labeled graphs to include DRT as a new framework.
The conversion procedure was biased towards making the DRT graph framework somewhat similar to other graph-based meaning representation frameworks.
arXiv Detail & Related papers (2020-12-29T16:36:49Z)
- Pairwise Representation Learning for Event Coreference [73.10563168692667]
We develop a Pairwise Representation Learning (PairwiseRL) scheme for the event mention pairs.
Our representation supports a finer, structured representation of the text snippet to facilitate encoding events and their arguments.
We show that PairwiseRL, despite its simplicity, outperforms the prior state-of-the-art event coreference systems on both cross-document and within-document event coreference benchmarks.
arXiv Detail & Related papers (2020-10-24T06:55:52Z)
- Conversational Semantic Parsing [50.954321571100294]
Session-based properties such as co-reference resolution and context carryover are processed downstream in a pipelined system.
We release a new session-based, compositional task-oriented parsing dataset of 20k sessions consisting of 60k utterances.
We propose a new family of Seq2Seq models for session-based parsing, which achieve better or comparable performance to the current state-of-the-art on ATIS, SNIPS, TOP and DSTC2.
arXiv Detail & Related papers (2020-09-28T22:08:00Z)
- CIRCE at SemEval-2020 Task 1: Ensembling Context-Free and Context-Dependent Word Representations [0.0]
We present an ensemble model that makes predictions based on context-free and context-dependent word representations.
The key findings are that (1) context-free word representations are a powerful and robust baseline, (2) a sentence classification objective can be used to obtain useful context-dependent word representations, and (3) combining those representations increases performance on some datasets while decreasing performance on others.
arXiv Detail & Related papers (2020-04-30T13:18:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.