Prompt-based Graph Model for Joint Liberal Event Extraction and Event Schema Induction
- URL: http://arxiv.org/abs/2403.12526v1
- Date: Tue, 19 Mar 2024 07:56:42 GMT
- Title: Prompt-based Graph Model for Joint Liberal Event Extraction and Event Schema Induction
- Authors: Haochen Li, Di Geng
- Abstract summary: Events are essential components of speech and texts, describing the changes in the state of entities.
The event extraction task aims to identify and classify events and find their participants according to event schemas.
The researchers propose Liberal Event Extraction (LEE), which aims to extract events and discover event schemas simultaneously.
- Score: 1.3154296174423619
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Events are essential components of speech and texts, describing the changes in the state of entities. The event extraction task aims to identify and classify events and find their participants according to event schemas. Manually predefined event schemas have limited coverage and are hard to migrate across domains. Therefore, the researchers propose Liberal Event Extraction (LEE), which aims to extract events and discover event schemas simultaneously. However, existing LEE models rely heavily on external language knowledge bases and require the manual development of numerous rules for noise removal and knowledge alignment, which is complex and laborious. To this end, we propose a Prompt-based Graph Model for Liberal Event Extraction (PGLEE). Specifically, we use a prompt-based model to obtain candidate triggers and arguments, and then build heterogeneous event graphs to encode the structures within and between events. Experimental results show that our approach achieves excellent performance with or without predefined event schemas, and that the automatically detected event schemas are of high quality.
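To make the pipeline described in the abstract more concrete, the following is a minimal conceptual sketch (Python with networkx) of how candidate triggers and arguments returned by a prompt-based extractor might be assembled into a heterogeneous event graph with intra-event and inter-event edges. This is an illustration under stated assumptions, not the authors' implementation; all identifiers and example data are hypothetical.

```python
# Conceptual sketch only (not the PGLEE code): assemble prompt-extracted candidate
# triggers and arguments into a heterogeneous event graph. All names and example
# data below are hypothetical placeholders.
import networkx as nx

# Hypothetical candidates a prompt-based extractor might return for one sentence.
candidates = {
    "triggers": [
        {"id": "t1", "text": "acquired", "span": (12, 20)},
        {"id": "t2", "text": "resigned", "span": (45, 53)},
    ],
    "arguments": [
        {"id": "a1", "text": "Acme Corp", "span": (0, 9), "trigger": "t1"},
        {"id": "a2", "text": "the CEO", "span": (34, 41), "trigger": "t2"},
    ],
}

def build_event_graph(cands):
    """Encode intra-event (trigger-argument) and inter-event (trigger-trigger)
    structure as a graph with typed nodes and typed edges."""
    g = nx.Graph()
    for t in cands["triggers"]:
        g.add_node(t["id"], ntype="trigger", text=t["text"], span=t["span"])
    for a in cands["arguments"]:
        g.add_node(a["id"], ntype="argument", text=a["text"], span=a["span"])
        # Intra-event edge: an argument is attached to its candidate trigger.
        g.add_edge(a["id"], a["trigger"], etype="intra-event")
    # Inter-event edges: connect co-occurring triggers so structure can be
    # shared between events mentioned in the same context.
    trigger_ids = [t["id"] for t in cands["triggers"]]
    for i in range(len(trigger_ids)):
        for j in range(i + 1, len(trigger_ids)):
            g.add_edge(trigger_ids[i], trigger_ids[j], etype="inter-event")
    return g

graph = build_event_graph(candidates)
print(graph.nodes(data=True))
print(graph.edges(data=True))
```

In the paper's setting, such a graph would then be encoded (e.g., by a graph neural network) so that trigger and argument representations can be classified into event types or clustered to induce new schemas.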
Related papers
- Cascading Large Language Models for Salient Event Graph Generation [19.731605612333716]
CALLMSAE is a CAscading Large Language Model framework for SAlient Event graph generation.
We first identify salient events by prompting LLMs to generate summaries.
We develop an iterative code refinement prompting strategy to generate event relation graphs.
Contextualised graph generation models fine-tuned on the LLM-generated graphs outperform models trained on CAEVO-generated data.
arXiv Detail & Related papers (2024-06-26T15:53:54Z)
- EVIT: Event-Oriented Instruction Tuning for Event Reasoning [18.012724531672813]
Event reasoning aims to infer events according to certain relations and predict future events.
Large language models (LLMs) have made significant advancements in event reasoning owing to their wealth of knowledge and reasoning capabilities.
However, the smaller instruction-tuned models currently in use do not consistently handle these tasks well.
arXiv Detail & Related papers (2024-04-18T08:14:53Z)
- Improving Event Definition Following For Zero-Shot Event Detection [66.27883872707523]
Existing approaches on zero-shot event detection usually train models on datasets annotated with known event types.
We aim to improve zero-shot event detection by training models to better follow event definitions.
arXiv Detail & Related papers (2024-03-05T01:46:50Z)
- Drafting Event Schemas using Language Models [48.81285141287434]
We look at the process of creating such schemas to describe complex events.
Our focus is on whether we can achieve sufficient diversity and recall of key events.
We show that large language models are able to achieve moderate recall against schemas taken from two different datasets.
arXiv Detail & Related papers (2023-05-24T07:57:04Z)
- Harvesting Event Schemas from Large Language Models [38.56772862516626]
An event schema provides a conceptual, structural, and formal language to represent events and model world event knowledge.
It is challenging to automatically induce high-quality and high-coverage event schemas due to the open nature of real-world events, the diversity of event expressions, and the sparsity of event knowledge.
We propose a new paradigm for event schema induction -- knowledge harvesting from large-scale pre-trained language models.
arXiv Detail & Related papers (2023-05-12T06:51:05Z)
- Zero-Shot On-the-Fly Event Schema Induction [61.91468909200566]
We present a new approach in which large language models are utilized to generate source documents that allow predicting, given a high-level event definition, the specific events, arguments, and relations between them.
Using our model, complete schemas on any topic can be generated on-the-fly without any manual data collection, i.e., in a zero-shot manner.
arXiv Detail & Related papers (2022-10-12T14:37:00Z)
- PILED: An Identify-and-Localize Framework for Few-Shot Event Detection [79.66042333016478]
In our study, we employ cloze prompts to elicit event-related knowledge from pretrained language models.
We minimize the number of type-specific parameters, enabling our model to quickly adapt to event detection tasks for new types.
arXiv Detail & Related papers (2022-02-15T18:01:39Z)
- CLIP-Event: Connecting Text and Images with Event Structures [123.31452120399827]
We propose a contrastive learning framework that enforces event structure comprehension in vision-language pretraining models.
We take advantage of text information extraction technologies to obtain event structural knowledge.
Experiments show that our zero-shot CLIP-Event outperforms the state-of-the-art supervised model in argument extraction.
arXiv Detail & Related papers (2022-01-13T17:03:57Z)
- Future is not One-dimensional: Graph Modeling based Complex Event Schema Induction for Event Prediction [90.75260063651763]
We introduce the concept of Temporal Complex Event: a graph-based schema representation that encompasses events, arguments, temporal connections and argument relations.
We release a new schema learning corpus containing 6,399 documents accompanied by event graphs, as well as manually constructed gold schemas.
arXiv Detail & Related papers (2021-04-13T16:41:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.