EA$^2$E: Improving Consistency with Event Awareness for Document-Level
Argument Extraction
- URL: http://arxiv.org/abs/2205.14847v1
- Date: Mon, 30 May 2022 04:33:51 GMT
- Title: EA$^2$E: Improving Consistency with Event Awareness for Document-Level
Argument Extraction
- Authors: Qi Zeng, Qiusi Zhan, Heng Ji
- Abstract summary: We introduce the Event-Aware Argument Extraction (EA$^2$E) model with augmented context for training and inference.
Experimental results on the WIKIEVENTS and ACE2005 datasets demonstrate the effectiveness of EA$^2$E.
- Score: 52.43978926985928
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Events are inter-related in documents. Motivated by the
one-sense-per-discourse theory, we hypothesize that a participant tends to play
consistent roles across multiple events in the same document. However, recent
work on document-level event argument extraction models each individual event
in isolation, which causes inconsistency among extracted arguments across
events and, in turn, discrepancies in downstream applications such as event
knowledge base population, question answering, and hypothesis generation. In
this work, we formulate event argument consistency as constraints derived from
event-event relations under the document-level setting. To improve consistency,
we introduce the Event-Aware Argument Extraction (EA$^2$E) model with augmented
context for training and inference. Experimental results on the WIKIEVENTS and
ACE2005 datasets demonstrate the effectiveness of EA$^2$E compared to baseline
methods.
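The abstract describes the "augmented context" mechanism only at a high level, so the following is a minimal, hypothetical Python sketch of one way event-aware context augmentation could work: arguments extracted for related events in a first pass are written back into the text as markers before a second pass over each target event. The two-pass scheme, the marker format, and the function names (`extract_arguments`, `augment_context`, `event_aware_extraction`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of event-aware context augmentation, loosely in the
# spirit of EA^2E. Not the authors' code: the two-pass scheme, marker format,
# and all function names below are assumptions made for illustration.

from typing import Dict, List, Tuple


def extract_arguments(context: str, trigger: str) -> Dict[str, str]:
    """Stand-in for an argument-extraction model call (hypothetical).

    Returns a mapping from role name to argument span for the event
    anchored at `trigger`. A tiny lookup keeps the sketch runnable.
    """
    fake_predictions = {
        "attacked": {"Attacker": "the rebels", "Place": "the capital"},
        "fled": {"Agent": "civilians", "Origin": "the capital"},
    }
    return fake_predictions.get(trigger, {})


def augment_context(context: str, neighbor_args: List[Tuple[str, str, str]]) -> str:
    """Append markers describing arguments of related events in the same document."""
    notes = [f"[{trig}: {role} = {span}]" for trig, role, span in neighbor_args]
    return context + " " + " ".join(notes)


def event_aware_extraction(context: str, triggers: List[str]) -> Dict[str, Dict[str, str]]:
    # Pass 1: extract arguments for every event independently.
    first_pass = {t: extract_arguments(context, t) for t in triggers}

    # Pass 2: re-extract each event with the other events' arguments made
    # explicit in the context, encouraging consistent role assignments.
    final = {}
    for target in triggers:
        neighbor_args = [
            (t, role, span)
            for t, roles in first_pass.items()
            if t != target
            for role, span in roles.items()
        ]
        final[target] = extract_arguments(augment_context(context, neighbor_args), target)
    return final


if __name__ == "__main__":
    doc = "The rebels attacked the capital, and civilians fled."
    print(event_aware_extraction(doc, triggers=["attacked", "fled"]))
```

In this toy run, the second pass over the "attacked" event sees a marker such as "[fled: Agent = civilians]", which is the kind of cross-event signal the paper formulates as consistency constraints from event-event relations.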
Related papers
- Beyond Single-Event Extraction: Towards Efficient Document-Level Multi-Event Argument Extraction [19.51890490853855]
We propose DEEIA, a multi-event argument extraction model.
It is capable of extracting arguments from all events within a document simultaneously.
Our method achieves new state-of-the-art performance on four public datasets.
arXiv Detail & Related papers (2024-05-03T07:04:35Z) - Complex Reasoning over Logical Queries on Commonsense Knowledge Graphs [61.796960984541464]
We present COM2 (COMplex COMmonsense), a new dataset created by sampling logical queries.
Using handcrafted rules and large language models, we verbalize them into multiple-choice and text generation questions.
Experiments show that language models trained on COM2 exhibit significant improvements in complex reasoning ability.
arXiv Detail & Related papers (2024-03-12T08:13:52Z) - Type-aware Decoding via Explicitly Aggregating Event Information for
Document-level Event Extraction [11.432496741340334]
Document-level event extraction faces two main challenges: arguments-scattering and multi-event.
This paper proposes a schema-based Explicitly Aggregating (SEA) model to address these limitations.
SEA aggregates event information into event type and role representations, enabling the decoding of event records based on specific type-aware representations.
arXiv Detail & Related papers (2023-10-16T15:10:42Z) - Event Causality Extraction with Event Argument Correlations [13.403222002600558]
Event Causality Extraction aims to extract cause-effect event causality pairs from plain texts.
We propose a method with a dual grid tagging scheme to capture the intra- and inter-event argument correlations for ECE.
arXiv Detail & Related papers (2023-01-27T09:48:31Z) - Unifying Event Detection and Captioning as Sequence Generation via
Pre-Training [53.613265415703815]
We propose a unified pre-training and fine-tuning framework to enhance the inter-task association between event detection and captioning.
Our model outperforms the state-of-the-art methods, and can be further boosted when pre-trained on extra large-scale video-text data.
arXiv Detail & Related papers (2022-07-18T14:18:13Z) - RAAT: Relation-Augmented Attention Transformer for Relation Modeling in
Document-Level Event Extraction [16.87868728956481]
We propose a new document-level event extraction (DEE) framework that can model relation dependencies, called Relation-augmented Document-level Event Extraction (ReDEE).
To further leverage relation information, we introduce a separate event relation prediction task and adopt a multi-task learning method to explicitly enhance event extraction performance.
arXiv Detail & Related papers (2022-06-07T15:11:42Z) - ClarET: Pre-training a Correlation-Aware Context-To-Event Transformer
for Event-Centric Generation and Classification [74.6318379374801]
We propose to pre-train a general Correlation-aware context-to-Event Transformer (ClarET) for event-centric reasoning.
The proposed ClarET is applicable to a wide range of event-centric reasoning scenarios.
arXiv Detail & Related papers (2022-03-04T10:11:15Z) - Event Data Association via Robust Model Fitting for Event-based Object Tracking [66.05728523166755]
We propose a novel Event Data Association (EDA) approach to explicitly address the event association and fusion problem.
The proposed EDA seeks event trajectories that best fit the event data in order to perform unified data association and information fusion.
The experimental results show the effectiveness of EDA under challenging scenarios, such as high speed, motion blur, and high dynamic range conditions.
arXiv Detail & Related papers (2021-10-25T13:56:00Z) - Reinforcement Learning-based Dialogue Guided Event Extraction to Exploit
Argument Relations [70.35379323231241]
This paper presents an approach to event extraction that explicitly utilizes the relationships among event arguments.
We employ reinforcement learning and incremental learning to extract multiple arguments via a multi-turn, iterative process.
Experimental results show that our approach consistently outperforms seven state-of-the-art event extraction methods.
arXiv Detail & Related papers (2021-06-23T13:24:39Z) - Document-level Event Extraction with Efficient End-to-end Learning of
Cross-event Dependencies [37.96254956540803]
We propose an end-to-end model leveraging Deep Value Networks (DVN), a structured prediction algorithm, to efficiently capture cross-event dependencies for document-level event extraction.
Our approach achieves performance comparable to CRF-based models on ACE05, while enjoying significantly higher computational efficiency.
arXiv Detail & Related papers (2020-10-24T05:28:16Z)