Temporal Common Sense Acquisition with Minimal Supervision
- URL: http://arxiv.org/abs/2005.04304v1
- Date: Fri, 8 May 2020 22:20:16 GMT
- Title: Temporal Common Sense Acquisition with Minimal Supervision
- Authors: Ben Zhou and Qiang Ning and Daniel Khashabi and Dan Roth
- Abstract summary: This work proposes a novel sequence modeling approach that exploits explicit and implicit mentions of temporal common sense.
Our method is shown to give high-quality predictions across various dimensions of temporal common sense.
It also produces representations of events for relevant tasks such as duration comparison, parent-child relations, event coreference and temporal QA.
- Score: 77.8308414884754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal common sense (e.g., duration and frequency of events) is crucial for
understanding natural language. However, its acquisition is challenging, partly
because such information is often not expressed explicitly in text, and human
annotation on such concepts is costly. This work proposes a novel sequence
modeling approach that exploits explicit and implicit mentions of temporal
common sense, extracted from a large corpus, to build TACOLM, a temporal common
sense language model. Our method is shown to give high-quality predictions of
various dimensions of temporal common sense (on UDST and a newly collected
dataset from RealNews). It also produces representations of events for relevant
tasks such as duration comparison, parent-child relations, event coreference
and temporal QA (on TimeBank, HiEVE and MCTACO) that are better than using the
standard BERT. Thus, it will be an important component of temporal NLP.
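The approach rests on mining explicit temporal cues (e.g., "for two hours", "every Friday") from raw text as distant supervision. Below is a minimal sketch of what pattern-based extraction for the duration dimension might look like; the regex, unit buckets, and masking scheme are illustrative assumptions, not TACOLM's actual pipeline.

```python
import re

# Illustrative duration patterns; a real extractor would cover far more
# surface forms. This sketch only handles explicit "for <number> <unit>" cues.
DURATION_PATTERN = re.compile(
    r"\bfor\s+(?:a|an|one|two|three|\d+)\s+"
    r"(second|minute|hour|day|week|month|year)s?\b",
    re.IGNORECASE,
)

def extract_duration_examples(sentences):
    """Turn explicit duration mentions into (masked sentence, unit label)
    pairs, the kind of distant supervision a temporal LM could train on."""
    examples = []
    for sent in sentences:
        match = DURATION_PATTERN.search(sent)
        if match:
            unit = match.group(1).lower()  # coarse duration bucket
            masked = sent[:match.start()] + "for [MASK]" + sent[match.end():]
            examples.append((masked, unit))
    return examples

corpus = [
    "She jogged in the park for two hours before breakfast.",
    "The committee debated the proposal for three weeks.",
    "He glanced at his phone and smiled.",  # no explicit cue: skipped
]
for masked, unit in extract_duration_examples(corpus):
    print(f"{unit:>6} <- {masked}")
```

Pairs like these can then train a masked language model to predict the typical duration bucket of an event from context alone, including for events whose duration is never stated explicitly.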
Related papers
- XForecast: Evaluating Natural Language Explanations for Time Series Forecasting [72.57427992446698]
Time series forecasting aids decision-making, especially for stakeholders who rely on accurate predictions.
Traditional explainable AI (XAI) methods, which highlight feature or temporal importance, often require expert knowledge.
However, evaluating forecast NLEs (natural language explanations) is difficult due to the complex causal relationships in time series data.
arXiv Detail & Related papers (2024-10-18T05:16:39Z)
- On the Role of Context in Reading Time Prediction [50.87306355705826]
We present a new perspective on how readers integrate context during real-time language comprehension.
Our proposals build on surprisal theory, which posits that the processing effort of a linguistic unit is an affine function of its in-context information content.
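Concretely, the affine claim is RT(w) ≈ α · s(w) + β, where s(w) = −log p(w | context) is the word's surprisal (its in-context information content). A minimal sketch of computing per-token surprisal with an off-the-shelf causal LM follows; the model choice and tokenization handling are assumptions of this sketch, not the paper's setup.

```python
import math
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Any causal LM works for this sketch; GPT-2 is just a convenient choice.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def token_surprisals(text):
    """Return (token, surprisal-in-bits) pairs; s(w) = -log2 p(w | context)."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # logits[0, i] predicts token i+1, so shift targets by one position;
    # the first token has no context and gets no surprisal estimate.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    results = []
    for i, tok_id in enumerate(ids[0, 1:]):
        bits = -log_probs[i, tok_id].item() / math.log(2)
        results.append((tokenizer.decode(tok_id), bits))
    return results

for tok, bits in token_surprisals("The cat sat on the mat."):
    print(f"{bits:6.2f} bits  {tok!r}")
# Under surprisal theory, reading time is modeled as affine in s:
# RT(w) = alpha * s(w) + beta, with alpha and beta fit by linear regression.
```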
arXiv Detail & Related papers (2024-09-12T15:52:22Z)
- Retrieval-Augmented Generation Meets Data-Driven Tabula Rasa Approach for Temporal Knowledge Graph Forecasting [0.0]
sLA-tKGF is a small-scale language assistant for temporal Knowledge Graph (tKG) forecasting.
Our framework constructs knowledge-infused prompts with historical data from tKGs and web search results.
It reduces hallucinations and mitigates distributional shift by capturing changing trends over time.
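A minimal sketch of what a knowledge-infused prompt built from historical tKG quadruples and retrieved web snippets might look like; the template, relation names, and fields are illustrative assumptions, not sLA-tKGF's actual prompt format.

```python
def build_forecast_prompt(query, history, web_snippets):
    """Assemble a forecasting prompt from historical tKG quadruples
    (subject, relation, object, time) plus retrieved web context.
    The template below is an illustrative assumption of this sketch."""
    lines = ["Historical facts:"]
    for s, r, o, t in sorted(history, key=lambda quad: quad[3]):
        lines.append(f"  [{t}] {s} --{r}--> {o}")
    lines.append("Web context:")
    lines.extend(f"  - {snippet}" for snippet in web_snippets)
    subj, rel, time = query
    lines.append(f"Question: at time {time}, {subj} --{rel}--> ?")
    return "\n".join(lines)

history = [
    ("Country_A", "signs_treaty_with", "Country_B", "2023-01"),
    ("Country_A", "imposes_sanctions_on", "Country_C", "2023-06"),
]
web = ["News report: Country_A and Country_C resume talks in late 2023."]
print(build_forecast_prompt(("Country_A", "negotiates_with", "2024-01"),
                            history, web))
```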
arXiv Detail & Related papers (2024-08-18T11:52:24Z)
- Analyzing Temporal Complex Events with Large Language Models? A Benchmark towards Temporal, Long Context Understanding [57.62275091656578]
We refer to a complex event composed of many news articles over an extended period as a Temporal Complex Event (TCE).
This paper proposes a novel approach using Large Language Models (LLMs) to systematically extract and analyze the event chain within a TCE.
arXiv Detail & Related papers (2024-06-04T16:42:17Z)
- An Overview Of Temporal Commonsense Reasoning and Acquisition [20.108317515225504]
Temporal commonsense reasoning refers to the ability to understand the typical temporal context of phrases, actions, and events.
Recent research on the performance of large language models suggests that they often take shortcuts in their reasoning and fall prey to simple linguistic traps.
arXiv Detail & Related papers (2023-07-28T01:30:15Z)
- Generic Temporal Reasoning with Differential Analysis and Explanation [61.96034987217583]
We introduce a novel task named TODAY that bridges the gap with temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
arXiv Detail & Related papers (2022-12-20T17:40:03Z)
- ECOLA: Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations [35.51427298619691]
We study enhancing temporal knowledge embedding with textual data.
We propose Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations (ECOLA).
Experiments show that ECOLA significantly enhances temporal embedding models, with relative improvements of up to 287% in Hits@1 on the link prediction task.
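For reference, Hits@1 is the fraction of link-prediction queries whose gold entity is ranked first among all candidates; a 287% relative improvement therefore means Hits@1 roughly quadruples (e.g., 0.10 → 0.387). A minimal sketch of the metric itself (not of ECOLA's model):

```python
def hits_at_k(ranks, k=1):
    """ranks[i] is the 1-based rank the model assigns to the gold entity
    for query i; Hits@k is the fraction of queries ranked within the top k."""
    return sum(r <= k for r in ranks) / len(ranks)

# e.g., gold entities ranked 1st, 3rd, 1st, and 7th across four queries:
print(hits_at_k([1, 3, 1, 7], k=1))  # 0.5
print(hits_at_k([1, 3, 1, 7], k=3))  # 0.75
```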
arXiv Detail & Related papers (2022-03-17T20:08:25Z)
- Temporal Reasoning on Implicit Events from Distant Supervision [91.20159064951487]
We propose a novel temporal reasoning dataset that evaluates the degree to which systems understand implicit events.
We find that state-of-the-art models struggle when predicting temporal relationships between implicit and explicit events.
We propose a neuro-symbolic temporal reasoning model, SYMTIME, which exploits distant supervision signals from large-scale text and uses temporal rules to infer end times.
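The temporal rule at work here is the constraint end = start + duration: given estimates of when an implicit event starts and how long it lasts, its end time follows symbolically. A minimal point-estimate sketch of that rule (SYMTIME itself reasons over model-predicted distributions; the datetime representation is an assumption of this sketch):

```python
from datetime import datetime, timedelta

def infer_end_time(start, duration_hours):
    """Symbolic temporal rule: end = start + duration.
    Point estimates are used here purely for illustration."""
    return start + timedelta(hours=duration_hours)

start = datetime(2020, 3, 1, 9, 0)  # estimated start of an implicit event
print(infer_end_time(start, duration_hours=48))  # 2020-03-03 09:00:00
```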
arXiv Detail & Related papers (2020-10-24T03:12:27Z)
- Modeling Preconditions in Text with a Crowd-sourced Dataset [17.828175478279654]
This paper introduces PeKo, a crowd-sourced annotation of preconditions between event pairs in newswire.
We also introduce two challenge tasks aimed at modeling preconditions.
Evaluation on both tasks shows that modeling preconditions is challenging even for today's large language models.
arXiv Detail & Related papers (2020-10-06T01:52:34Z)