Hierarchical Event Grounding
- URL: http://arxiv.org/abs/2302.04197v1
- Date: Wed, 8 Feb 2023 17:09:41 GMT
- Title: Hierarchical Event Grounding
- Authors: Jiefu Ou, Adithya Pratapa, Rishubh Gupta, Teruko Mitamura
- Abstract summary: Event grounding aims at linking mention references in text to events from a knowledge base (KB).
We propose a retrieval methodology that leverages event hierarchy through an auxiliary hierarchical loss.
- Score: 7.561459516972332
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event grounding aims at linking mention references in text corpora to events
from a knowledge base (KB). Previous work on this task focused primarily on
linking to a single KB event, thereby overlooking the hierarchical aspects of
events. Events in documents are typically described at various levels of
spatio-temporal granularity (Glavas et al. 2014). These hierarchical relations
are utilized in downstream tasks of narrative understanding and schema
construction. In this work, we present an extension to the event grounding task
that requires tackling hierarchical event structures from the KB. Our proposed
task involves linking a mention reference to a set of event labels from a
subevent hierarchy in the KB. We propose a retrieval methodology that leverages
event hierarchy through an auxiliary hierarchical loss (Murty et al. 2018). On
an automatically created multilingual dataset from Wikipedia and Wikidata, our
experiments demonstrate the effectiveness of the hierarchical loss against
retrieve and re-rank baselines (Wu et al. 2020; Pratapa, Gupta, and Mitamura
2022). Furthermore, we demonstrate the systems' ability to aid hierarchical
discovery among unseen events.
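As a rough illustration of the proposed setup, the sketch below trains a dual-encoder retriever with an auxiliary hierarchical term; the multilingual encoder, in-batch negatives, per-mention ancestor sampling, and the weight alpha are illustrative assumptions rather than the paper's exact configuration.

```python
# Sketch of a bi-encoder event retriever trained with an auxiliary
# hierarchical loss; encoder choice, ancestor sampling, and the weight
# `alpha` are illustrative assumptions, not the paper's configuration.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

def embed(texts):
    """[CLS] embeddings for a batch of mention or event-description strings."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    return encoder(**batch).last_hidden_state[:, 0]

def grounding_loss(mentions, gold_events, gold_ancestors, alpha=0.5):
    """In-batch retrieval loss plus an auxiliary term that also pulls each
    mention toward an ancestor of its gold event in the subevent hierarchy."""
    m, e, a = embed(mentions), embed(gold_events), embed(gold_ancestors)
    target = torch.arange(len(mentions))
    retrieval = F.cross_entropy(m @ e.T, target)     # gold event vs. in-batch negatives
    hierarchical = F.cross_entropy(m @ a.T, target)  # ancestor event vs. in-batch negatives
    return retrieval + alpha * hierarchical
```

Sampling one ancestor per gold event keeps the auxiliary term cheap while still rewarding retrieval of coarser events from the same subevent hierarchy.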
Related papers
- Enhancing Cross-Document Event Coreference Resolution by Discourse Structure and Semantic Information [33.21818213257603]
Existing cross-document event coreference resolution models either compute mention similarity directly or enhance mention representations by extracting event arguments.
We propose the construction of document-level Rhetorical Structure Theory (RST) trees and cross-document Lexical Chains to model the structural and semantic information of documents.
We also develop a large-scale Chinese cross-document event coreference dataset to fill the gap in existing resources.
arXiv Detail & Related papers (2024-06-23T02:54:48Z)
- Event GDR: Event-Centric Generative Document Retrieval [37.53593254200252]
We propose Event GDR, an event-centric generative document retrieval model.
We model each document through its events and relations to ensure comprehensive coverage and inner-content correlation.
For identifier construction, we map the events to well-defined event taxonomy to construct the identifiers with explicit semantic structure.
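A toy sketch of what taxonomy-based identifier construction could look like; the taxonomy layout, separator, and document suffix below are hypothetical and not Event GDR's actual format.

```python
# Hypothetical sketch of taxonomy-based identifier construction for
# generative retrieval; the toy taxonomy and the path/suffix format are
# assumptions, not the paper's actual identifier scheme.
EVENT_TAXONOMY = {                      # child -> parent (toy example)
    "earthquake": "natural-disaster",
    "natural-disaster": "disaster",
    "disaster": "event",
}

def event_identifier(event_type: str, doc_id: str) -> str:
    """Identifier = root-to-leaf taxonomy path plus a document suffix,
    so the decoder generates tokens with explicit semantic structure."""
    path = [event_type]
    while path[-1] in EVENT_TAXONOMY:
        path.append(EVENT_TAXONOMY[path[-1]])
    return "/".join(reversed(path)) + "#" + doc_id

print(event_identifier("earthquake", "doc-0421"))
# event/disaster/natural-disaster/earthquake#doc-0421
```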
arXiv Detail & Related papers (2024-05-11T02:55:11Z)
- Argument-Aware Approach To Event Linking [47.424863133787575]
Event linking connects event mentions in text with relevant nodes in a knowledge base (KB).
We improve event linking models by augmenting input text with tagged event argument information.
We synthesize out-of-KB training examples from in-KB instances through controlled manipulation of event arguments.
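A minimal sketch of argument-aware input augmentation, assuming character-span arguments and XML-style role markers; the marker format is an illustration, not the paper's exact tagging scheme.

```python
# Illustrative sketch: detected event arguments are wrapped with role markers
# before the mention text is fed to the linker. Marker syntax and role names
# are assumptions, not the paper's actual scheme.
def tag_arguments(text: str, arguments: list[tuple[int, int, str]]) -> str:
    """arguments: non-overlapping (start, end, role) character spans."""
    out, cursor = [], 0
    for start, end, role in sorted(arguments):
        out.append(text[cursor:start])
        out.append(f"<{role}> {text[start:end]} </{role}>")
        cursor = end
    out.append(text[cursor:])
    return "".join(out)

sentence = "Protests erupted in Cairo on Friday."
print(tag_arguments(sentence, [(20, 25, "place"), (29, 35, "time")]))
# Protests erupted in <place> Cairo </place> on <time> Friday </time>.
```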
arXiv Detail & Related papers (2024-03-22T10:32:43Z)
- Open-Domain Hierarchical Event Schema Induction by Incremental Prompting and Verification [81.17473088621209]
We treat event schemas as a form of commonsense knowledge that can be derived from large language models (LLMs).
We design an incremental prompting and verification method to break down the construction of a complex event graph into three stages.
Compared to directly using LLMs to generate a linearized graph, our method produces large and complex schemas, with a 7.2% F1 improvement in temporal relations and a 31.0% F1 improvement in hierarchical relations.
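A high-level sketch of a three-stage prompt-then-verify loop, assuming a generic `llm` text-completion callable; the stage breakdown follows the summary above, but the prompt wording is illustrative rather than the paper's actual templates.

```python
# Sketch of incremental prompting with verification; `llm` is a placeholder
# for any text-completion callable, and prompts are illustrative only.
def induce_schema(scenario: str, llm) -> dict:
    # Stage 1: draft the skeleton events for the scenario.
    events = llm(f"List the major events in a typical '{scenario}' scenario.")
    events = llm(f"Remove events that do not belong to '{scenario}': {events}")  # verify

    # Stage 2: expand each event with subevents (hierarchical structure).
    subevents = llm(f"For each event, list its subevents: {events}")
    subevents = llm(f"Drop subevents inconsistent with their parent: {subevents}")  # verify

    # Stage 3: connect events with temporal relations.
    temporal = llm(f"Order these events temporally as 'A before B' pairs: {events}")
    temporal = llm(f"Remove contradictory orderings: {temporal}")  # verify

    return {"events": events, "subevents": subevents, "temporal": temporal}
```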
arXiv Detail & Related papers (2023-07-05T01:00:44Z)
- Association Graph Learning for Multi-Task Classification with Category Shifts [68.58829338426712]
We focus on multi-task classification, where related classification tasks share the same label space and are learned simultaneously.
We learn an association graph to transfer knowledge among tasks for missing classes.
Our method consistently performs better than representative baselines.
arXiv Detail & Related papers (2022-10-10T12:37:41Z)
- Beyond Grounding: Extracting Fine-Grained Event Hierarchies Across Modalities [43.048896440009784]
We propose the task of extracting event hierarchies from multimodal (video and text) data.
This reveals the structure of events and is critical to understanding them.
We show the limitations of state-of-the-art unimodal and multimodal baselines on this task.
arXiv Detail & Related papers (2022-06-14T23:24:15Z)
- Unsupervised Key Event Detection from Massive Text Corpora [42.31889135421941]
We propose a new task, key event detection at the intermediate level, which aims to detect key events from a news corpus.
This task can bridge event understanding and structuring and is inherently challenging because of the thematic and temporal closeness of key events.
We develop an unsupervised key event detection framework, EvMine, that extracts temporally frequent peak phrases using a novel ttf-itf score.
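The summary does not give the ttf-itf formula, so the sketch below only shows a tf-idf-style analogue over daily document buckets, illustrating the idea of scoring phrases that peak in one time window but are rare across the corpus; it is not EvMine's actual score.

```python
# A tf-idf-style stand-in for a temporal peak-phrase score: frequency of a
# phrase on one day, discounted by how many days the phrase appears at all.
# This analogue is an assumption, not the ttf-itf definition from the paper.
import math

def peak_score(phrase: str, docs_by_day: dict[str, list[str]], day: str) -> float:
    tf_day = sum(doc.count(phrase) for doc in docs_by_day[day])
    days_with_phrase = sum(
        1 for docs in docs_by_day.values() if any(phrase in doc for doc in docs)
    )
    itf = math.log(len(docs_by_day) / (1 + days_with_phrase)) + 1.0
    return tf_day * itf
```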
arXiv Detail & Related papers (2022-06-08T20:31:02Z)
- Use All The Labels: A Hierarchical Multi-Label Contrastive Learning Framework [75.79736930414715]
We present a hierarchical multi-label representation learning framework that can leverage all available labels and preserve the hierarchical relationship between classes.
We introduce novel hierarchy-preserving losses, which jointly apply a hierarchical penalty to the contrastive loss and enforce the hierarchy constraint.
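One way such a hierarchy-aware contrastive objective could be sketched is to weight positive pairs by the depth of their shared label prefix; the prefix-based weighting below is an assumption, not the paper's actual losses.

```python
# Rough sketch of a hierarchy-weighted supervised contrastive loss: pairs that
# share only a coarse ancestor label are pulled together more weakly than
# pairs sharing the full label path. Weights are illustrative assumptions.
import torch
import torch.nn.functional as F

def hierarchical_contrastive_loss(z, label_paths, temperature=0.1):
    """z: (B, d) embeddings; label_paths: root-to-leaf label tuples per sample."""
    z = F.normalize(z, dim=-1)
    sim = z @ z.T / temperature
    batch, loss = len(label_paths), z.new_zeros(())
    for i in range(batch):
        mask = torch.ones(batch, dtype=torch.bool)
        mask[i] = False                              # exclude self from the denominator
        denom = torch.logsumexp(sim[i, mask], dim=0)
        for j in range(batch):
            if i == j:
                continue
            shared = sum(a == b for a, b in zip(label_paths[i], label_paths[j]))
            if shared == 0:
                continue                             # no common ancestor: pure negative
            weight = shared / len(label_paths[i])    # deeper shared prefix, stronger pull
            loss = loss - weight * (sim[i, j] - denom)
    return loss / batch
```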
arXiv Detail & Related papers (2022-04-27T21:41:44Z)
- Event Linking: Grounding Event Mentions to Wikipedia [63.087102209379864]
This work defines Event Linking, a new natural language understanding task at the event level.
Event linking tries to link an event mention, appearing in a news article for example, to the most appropriate Wikipedia page.
arXiv Detail & Related papers (2021-12-15T05:06:18Z)
- Rank-based loss for learning hierarchical representations [7.421724671710886]
In machine learning, the family of methods that exploit this 'extra' hierarchical label information is called hierarchical classification.
Here we focus on how to integrate the hierarchical information of a problem to learn embeddings representative of the hierarchical relationships.
We show that a rank-based loss is suitable for learning hierarchical representations of the data.
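A minimal sketch of a rank-based objective, assuming pairwise hierarchy distances are available: items closer to an anchor in the label tree must also be closer in embedding space by a margin. The margin and distance choices are illustrative.

```python
# Minimal rank-based loss sketch: enforce that embedding distances follow the
# ranking induced by hierarchy (tree) distances. Margin is an assumption.
import torch
import torch.nn.functional as F

def rank_based_loss(z, tree_dist, margin=0.1):
    """z: (B, d) embeddings; tree_dist: (B, B) pairwise hierarchy distances."""
    d = torch.cdist(z, z)                     # pairwise embedding distances
    loss, count = z.new_zeros(()), 0
    batch = z.size(0)
    for a in range(batch):
        for i in range(batch):
            for j in range(batch):
                if tree_dist[a, i] < tree_dist[a, j]:    # i should rank closer than j
                    loss = loss + F.relu(d[a, i] - d[a, j] + margin)
                    count += 1
    return loss / max(count, 1)
```

The triple loop keeps the sketch readable; a practical version would vectorize over all valid (anchor, closer, farther) triplets.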
arXiv Detail & Related papers (2021-10-11T10:32:45Z)
- Exploring the Hierarchy in Relation Labels for Scene Graph Generation [75.88758055269948]
Experiments show that the proposed simple yet effective method improves several state-of-the-art baselines by a large margin (up to 33% relative gain) in terms of Recall@50.
arXiv Detail & Related papers (2020-09-12T17:36:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.