Back to Prior Knowledge: Joint Event Causality Extraction via
Convolutional Semantic Infusion
- URL: http://arxiv.org/abs/2102.09923v1
- Date: Fri, 19 Feb 2021 13:31:46 GMT
- Title: Back to Prior Knowledge: Joint Event Causality Extraction via
Convolutional Semantic Infusion
- Authors: Zijian Wang, Hao Wang, Xiangfeng Luo, Jianqi Gao
- Abstract summary: Joint event and causality extraction is a challenging yet essential task in information retrieval and data mining.
We propose convolutional knowledge infusion for frequent n-grams with different window lengths within a joint extraction framework.
Our model significantly outperforms the strong BERT+CSNN baseline.
- Score: 5.566928318239452
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Joint event and causality extraction is a challenging yet essential task in
information retrieval and data mining. Recently, pre-trained language models
(e.g., BERT) yield state-of-the-art results and dominate in a variety of NLP
tasks. However, these models cannot readily incorporate external knowledge for
domain-specific extraction. Considering that the prior knowledge of frequent
n-grams representing cause/effect events may benefit both event and causality
extraction, in this paper we propose convolutional knowledge infusion for
frequent n-grams with different window lengths within a joint extraction
framework. Knowledge infusion during convolutional filter initialization not
only helps the model capture both intra-event (i.e., features in an event
cluster) and inter-event (i.e., associations across event clusters) features
but also boosts training convergence. Experimental results on the benchmark
datasets show that our model significantly outperforms the strong BERT+CSNN
baseline.
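As a rough illustration of the filter-initialization idea described in the abstract, the sketch below seeds a subset of 1-D convolution filters with the embeddings of frequent n-grams instead of random noise, so that filters of window length k start out matching frequent k-grams. The function name, shapes, and initialization constants are assumptions for illustration; the paper's exact infusion scheme is not reproduced here.

```python
import numpy as np

def infuse_filters(ngram_embeddings, num_filters, window, dim, seed=0):
    """Initialize convolution filters of shape (num_filters, window, dim),
    overwriting the first filters with embeddings of frequent n-grams
    whose length matches the filter window (prior-knowledge infusion)."""
    rng = np.random.default_rng(seed)
    # Random baseline initialization for all filters.
    filters = rng.normal(0.0, 0.02, size=(num_filters, window, dim))
    # Infuse prior knowledge: copy in matching n-gram embeddings.
    for i, emb in enumerate(ngram_embeddings[:num_filters]):
        if emb.shape == (window, dim):
            filters[i] = emb
    return filters

# Toy example: two frequent bigrams, each token a 4-dim embedding.
dim, window, num_filters = 4, 2, 8
ngrams = [np.full((window, dim), 0.5), np.full((window, dim), -0.5)]
W = infuse_filters(ngrams, num_filters, window, dim)
```

The remaining filters keep their random initialization, so the model can still learn patterns beyond the precomputed frequent n-grams.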
Related papers
- TacoERE: Cluster-aware Compression for Event Relation Extraction [47.89154684352463]
Event relation extraction is a critical and fundamental challenge for natural language processing.
We propose TacoERE, a cluster-aware compression method for improving event relation extraction.
arXiv Detail & Related papers (2024-05-11T03:06:08Z)
- Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most potential entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Boosting Event Extraction with Denoised Structure-to-Text Augmentation [52.21703002404442]
Event extraction aims to recognize pre-defined event triggers and arguments from texts.
Recent data augmentation methods often neglect the problem of grammatical incorrectness.
We propose DAEE, a denoised structure-to-text augmentation framework for event extraction.
arXiv Detail & Related papers (2023-05-16T16:52:07Z)
- Abnormal Event Detection via Hypergraph Contrastive Learning [54.80429341415227]
Abnormal event detection plays an important role in many real applications.
In this paper, we study the unsupervised abnormal event detection problem in Attributed Heterogeneous Information Networks.
A novel hypergraph contrastive learning method, named AEHCL, is proposed to fully capture abnormal event patterns.
arXiv Detail & Related papers (2023-04-02T08:23:20Z)
- Tokenization Consistency Matters for Generative Models on Extractive NLP Tasks [54.306234256074255]
We identify the issue of tokenization inconsistency that is commonly neglected in training generative models.
This issue undermines the extractive nature of these tasks when the input and output are tokenized inconsistently.
We show that, with consistent tokenization, the model performs better in both in-domain and out-of-domain datasets.
arXiv Detail & Related papers (2022-12-19T23:33:21Z)
- Logic and Commonsense-Guided Temporal Knowledge Graph Completion [9.868206060374991]
A temporal knowledge graph (TKG) stores the events derived from the data involving time.
We propose a Logic and Commonsense-Guided Embedding model (LCGE) to jointly learn the time-sensitive representation involving timeliness and causality of events.
arXiv Detail & Related papers (2022-11-30T10:06:55Z)
- Improve Event Extraction via Self-Training with Gradient Guidance [10.618929821822892]
We propose a Self-Training with Feedback (STF) framework to overcome the main factor that hinders the progress of event extraction.
STF consists of (1) a base event extraction model trained on existing event annotations and then applied to large-scale unlabeled corpora to predict new event mentions as pseudo training samples, and (2) a novel scoring model that takes in each new predicted event trigger, an argument, its argument role, as well as their paths in the AMR graph to estimate a compatibility score.
Experimental results are reported on three benchmark datasets: ACE05-E, ACE05-E+, and ERE.
arXiv Detail & Related papers (2022-05-25T04:40:17Z)
- Crude Oil-related Events Extraction and Processing: A Transfer Learning Approach [0.7476901945542385]
This paper presents a complete framework for extracting and processing crude oil-related events found in CrudeOilNews corpus.
We place special emphasis on event properties (Polarity, Modality, and Intensity) classification to determine the factual certainty of each event.
arXiv Detail & Related papers (2022-05-01T03:21:18Z)
- Improving Event Causality Identification via Self-Supervised Representation Learning on External Causal Statement [17.77752074834281]
We propose CauSeRL, which leverages external causal statements for event causality identification.
First of all, we design a self-supervised framework to learn context-specific causal patterns from external causal statements.
We adopt a contrastive transfer strategy to incorporate the learned context-specific causal patterns into the target ECI model.
arXiv Detail & Related papers (2021-06-03T07:50:50Z)
- Cross-Supervised Joint-Event-Extraction with Heterogeneous Information Networks [61.950353376870154]
Joint-event-extraction is a sequence-to-sequence labeling task with a tag set composed of trigger and entity tags.
We propose a Cross-Supervised Mechanism (CSM) to alternately supervise the extraction of triggers or entities.
Our approach outperforms the state-of-the-art methods in both entity and trigger extraction.
arXiv Detail & Related papers (2020-10-13T11:51:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.