Generating Disentangled Arguments with Prompts: A Simple Event
Extraction Framework that Works
- URL: http://arxiv.org/abs/2110.04525v1
- Date: Sat, 9 Oct 2021 09:36:08 GMT
- Title: Generating Disentangled Arguments with Prompts: A Simple Event
Extraction Framework that Works
- Authors: Jinghui Si, Xutan Peng, Chen Li, Haotian Xu, Jianxin Li
- Abstract summary: Event Extraction bridges the gap between text and event signals.
We introduce the prompt-based learning strategy to the domain of Event Extraction.
In terms of F1 score on Argument Extraction, our simple architecture is stronger than any other generative counterpart.
- Score: 9.36117752274482
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event Extraction bridges the gap between text and event signals. Based on the
assumption of trigger-argument dependency, existing approaches have achieved
state-of-the-art performance with expert-designed templates or complicated
decoding constraints. In this paper, for the first time we introduce the
prompt-based learning strategy to the domain of Event Extraction, which
empowers the automatic exploitation of label semantics on both input and output
sides. To validate the effectiveness of the proposed generative method, we
conduct extensive experiments with 11 diverse baselines. Empirical results show
that, in terms of F1 score on Argument Extraction, our simple architecture is
stronger than any other generative counterpart and even competitive with
algorithms that require template engineering. Regarding the measure of recall,
it sets new overall records for both Argument and Trigger Extractions. We
hereby recommend this framework to the community, with the code publicly
available at https://git.io/GDAP.
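To make the prompt-based generative formulation described in the abstract concrete, below is a minimal sketch: verbalize the label semantics (event type and argument role) in the input prompt and let a sequence-to-sequence model generate the argument text on the output side. This is not the authors' released GDAP implementation (see https://git.io/GDAP for that); the checkpoint name "t5-small", the prompt wording, and the example event schema are illustrative assumptions.

```python
# Minimal sketch of prompt-based argument generation for event extraction.
# NOT the authors' GDAP code: the checkpoint, prompt template, and example
# event schema below are illustrative assumptions only.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "t5-small"  # assumption: any seq2seq LM fine-tuned for the task
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def generate_argument(context: str, trigger: str, event_type: str, role: str) -> str:
    """Verbalize the event type and argument role in the input prompt and let
    the decoder generate the argument text directly."""
    prompt = (
        f"Event type: {event_type}. Trigger: {trigger}. "
        f"Question: what is the {role} of this event? Context: {context}"
    )
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=16)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example usage (with an untrained checkpoint the output is only illustrative):
print(generate_argument(
    context="The committee fired the manager on Friday.",
    trigger="fired",
    event_type="Personnel.End-Position",
    role="employee",
))
```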
Related papers
- Complex Reasoning over Logical Queries on Commonsense Knowledge Graphs [61.796960984541464]
We present COM2 (COMplex COMmonsense), a new dataset created by sampling logical queries.
We verbalize them into multiple-choice and text-generation questions using handcrafted rules and large language models.
Experiments show that language models trained on COM2 exhibit significant improvements in complex reasoning ability.
arXiv Detail & Related papers (2024-03-12T08:13:52Z)
- From Simple to Complex: A Progressive Framework for Document-level Informative Argument Extraction [34.37013964529546]
Event Argument Extraction (EAE) requires the model to extract arguments of multiple events from a single document.
We propose a simple-to-complex progressive framework for document-level EAE.
Our model outperforms SOTA by 1.4% in F1, indicating the proposed simple-to-complex framework is useful in the EAE task.
arXiv Detail & Related papers (2023-10-25T04:38:02Z)
- Global Constraints with Prompting for Zero-Shot Event Argument Classification [49.84347224233628]
We propose to use global constraints with prompting to tackle event argument classification without any annotation and task-specific training.
A pre-trained language model scores the new passages, making the initial prediction.
Our novel prompt templates can easily adapt to all events and argument types without manual effort.
arXiv Detail & Related papers (2023-02-09T06:39:29Z)
- Retrieval-Augmented Generative Question Answering for Event Argument Extraction [66.24622127143044]
We propose a retrieval-augmented generative QA model (R-GQA) for event argument extraction.
It retrieves the most similar QA pair and augments it as prompt to the current example's context, then decodes the arguments as answers.
Our approach substantially outperforms prior methods across various settings (a minimal sketch of this retrieval-augmented prompting idea appears after this list).
arXiv Detail & Related papers (2022-11-14T02:00:32Z)
- Dynamic Global Memory for Document-level Argument Extraction [63.314514124716936]
We introduce a new global neural generation-based framework for document-level event argument extraction.
We use a document memory store to record the contextual event information and leverage it to implicitly and explicitly help with decoding of arguments for later events.
Empirical results show that our framework outperforms prior methods substantially.
arXiv Detail & Related papers (2022-09-18T23:45:25Z)
- EA$^2$E: Improving Consistency with Event Awareness for Document-Level Argument Extraction [52.43978926985928]
We introduce the Event-Aware Argument Extraction (EA$^2$E) model with augmented context for training and inference.
Experiment results on the WIKIEVENTS and ACE2005 datasets demonstrate the effectiveness of EA$^2$E.
arXiv Detail & Related papers (2022-05-30T04:33:51Z)
- Efficient Document-level Event Extraction via Pseudo-Trigger-aware Pruned Complete Graph [15.925704154438638]
We design a non-autoregressive decoding algorithm to perform event argument combination extraction on pruned complete graphs.
Compared with previous systems, ours consumes fewer resources, taking only 3.6% of the GPU time (pfs-days) for training and running up to 8.5 times faster at inference.
arXiv Detail & Related papers (2021-12-11T16:01:29Z)
- Query and Extract: Refining Event Extraction as Type-oriented Binary Decoding [51.57864297948228]
We propose a novel event extraction framework that takes event types and argument roles as natural language queries.
Our framework benefits from the attention mechanisms to better capture the semantic correlation between the event types or argument roles and the input text.
arXiv Detail & Related papers (2021-10-14T15:49:40Z)
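As referenced in the Retrieval-Augmented Generative Question Answering entry above, the following is a minimal sketch of the retrieve-and-prepend prompting idea that entry describes: pick the stored QA pair most similar to the current context and splice it in front of the new question as a demonstration. The token-overlap retriever, the QAExample container, and the toy pool are illustrative assumptions, not components of the R-GQA paper.

```python
# Sketch of retrieval-augmented QA prompting for argument extraction.
# The similarity measure (word-set Jaccard) and the toy data are stand-ins,
# not the paper's retriever or training set.
from dataclasses import dataclass

@dataclass
class QAExample:
    context: str
    question: str
    answer: str

def token_overlap(a: str, b: str) -> float:
    """Crude Jaccard similarity over lowercased word sets (stand-in for a
    learned dense retriever)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / max(len(sa | sb), 1)

def build_prompt(train_pool: list[QAExample], context: str, question: str) -> str:
    """Pick the most similar stored QA pair and prepend it as a demonstration,
    so the generator can copy the answering pattern for the new example."""
    demo = max(train_pool, key=lambda ex: token_overlap(ex.context, context))
    return (
        f"Context: {demo.context}\nQuestion: {demo.question}\nAnswer: {demo.answer}\n\n"
        f"Context: {context}\nQuestion: {question}\nAnswer:"
    )

# Hypothetical usage:
pool = [
    QAExample("Rebels attacked the convoy near the border.",
              "Who is the attacker?", "Rebels"),
    QAExample("The board hired a new director in May.",
              "Who is the employer?", "The board"),
]
print(build_prompt(pool, "Protesters attacked the police station.", "Who is the attacker?"))
```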
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.