A Survey on Temporal Reasoning for Temporal Information Extraction from
Text (Extended Abstract)
- URL: http://arxiv.org/abs/2005.06527v2
- Date: Fri, 15 May 2020 11:35:51 GMT
- Authors: Artuur Leeuwenberg, Marie-Francine Moens
- Abstract summary: Temporal reasoning plays a central role in temporal information extraction.
This article presents a comprehensive survey of the research on temporal reasoning for automatic temporal information extraction from text.
It provides a case study on the integration of symbolic reasoning with machine learning-based information extraction systems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time is deeply woven into how people perceive, and communicate
about, the world. Almost unconsciously, we provide our language utterances with temporal
cues, like verb tenses, and we can hardly produce sentences without such cues.
Extracting temporal cues from text, and constructing a global temporal view
about the order of described events is a major challenge of automatic natural
language understanding. Temporal reasoning, the process of combining different
temporal cues into a coherent temporal view, plays a central role in temporal
information extraction. This article presents a comprehensive survey of the
research from the past decades on temporal reasoning for automatic temporal
information extraction from text, providing a case study on the integration of
symbolic reasoning with machine learning-based information extraction systems.
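The abstract's central notion, combining local temporal cues into a coherent global ordering, can be illustrated with a minimal sketch of symbolic temporal reasoning: taking the transitive closure of pairwise "before" relations extracted from text. The events, relations, and closure routine below are illustrative assumptions, not taken from the survey itself.

```python
def temporal_closure(before_pairs):
    """Return all (a, b) pairs such that a happens before b,
    including orderings implied by transitivity (a<b and b<c => a<c)."""
    closure = set(before_pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Hypothetical cues extracted from a sentence like:
# "After breakfast, she left for work; the meeting started once she arrived."
extracted = {("breakfast", "leave"), ("leave", "arrive"), ("arrive", "meeting")}
inferred = temporal_closure(extracted)
print(("breakfast", "meeting") in inferred)  # True: implied but never stated
```

Full temporal reasoners handle richer relation sets (e.g. Allen-style interval relations) and detect inconsistencies, but the same closure idea underlies how a global temporal view is assembled from local cues.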
Related papers
- Enhancing Temporal Sensitivity and Reasoning for Time-Sensitive Question Answering [23.98067169669452]
Time-Sensitive Question Answering (TSQA) demands the effective utilization of specific temporal contexts.
We propose a novel framework that enhances temporal awareness and reasoning through Temporal Information-Aware Embedding and Granular Contrastive Reinforcement Learning.
arXiv Detail & Related papers (2024-09-25T13:13:21Z)
- Analysis of Plan-based Retrieval for Grounded Text Generation [78.89478272104739]
Hallucinations occur when a language model is given a generation task that lies outside its parametric knowledge.
A common strategy to address this limitation is to infuse the language models with retrieval mechanisms.
We analyze how planning can be used to guide retrieval to further reduce the frequency of hallucinations.
arXiv Detail & Related papers (2024-08-20T02:19:35Z)
- Subspace Chronicles: How Linguistic Information Emerges, Shifts and Interacts during Language Model Training [56.74440457571821]
We analyze tasks covering syntax, semantics and reasoning, across 2M pre-training steps and five seeds.
We identify critical learning phases across tasks and time, during which subspaces emerge, share information, and later disentangle to specialize.
Our findings have implications for model interpretability, multi-task learning, and learning from limited data.
arXiv Detail & Related papers (2023-10-25T09:09:55Z)
- An Overview Of Temporal Commonsense Reasoning and Acquisition [20.108317515225504]
Temporal commonsense reasoning refers to the ability to understand the typical temporal context of phrases, actions, and events.
Recent research on the performance of large language models suggests that they often take shortcuts in their reasoning and fall prey to simple linguistic traps.
arXiv Detail & Related papers (2023-07-28T01:30:15Z)
- Unlocking Temporal Question Answering for Large Language Models with Tailor-Made Reasoning Logic [84.59255070520673]
Large language models (LLMs) face a challenge when engaging in temporal reasoning.
We propose TempLogic, a novel framework designed specifically for temporal question-answering tasks.
arXiv Detail & Related papers (2023-05-24T10:57:53Z)
- Fuzzy Temporal Protoforms for the Quantitative Description of Processes in Natural Language [0.0]
The model includes temporal and causal information from processes and attributes, quantifies attributes in time during the process life-span and recalls causal relations and temporal distances between events.
A real use-case in the cardiology domain is presented, showing the potential of our model for providing natural language explanations addressed to domain experts.
arXiv Detail & Related papers (2023-05-16T14:59:38Z)
- Follow the Timeline! Generating Abstractive and Extractive Timeline Summary in Chronological Order [78.46986998674181]
We propose a Unified Timeline Summarizer (UTS) that can generate abstractive and extractive timeline summaries in time order.
We augment the previous Chinese large-scale timeline summarization dataset and collect a new English timeline dataset.
UTS achieves state-of-the-art performance in terms of both automatic and human evaluations.
arXiv Detail & Related papers (2023-01-02T20:29:40Z)
- Open-set Text Recognition via Character-Context Decoupling [16.2819099852748]
The open-set text recognition task is an emerging challenge that requires an extra capability to cognize novel characters during evaluation.
We argue that a major cause of the limited performance for current methods is the confounding effect of contextual information over the visual information of individual characters.
A Character-Context Decoupling framework is proposed to alleviate this problem by separating contextual information and character-visual information.
arXiv Detail & Related papers (2022-04-12T05:43:46Z)
- Temporal Reasoning on Implicit Events from Distant Supervision [91.20159064951487]
We propose a novel temporal reasoning dataset that evaluates the degree to which systems understand implicit events.
We find that state-of-the-art models struggle when predicting temporal relationships between implicit and explicit events.
We propose a neuro-symbolic temporal reasoning model, SYMTIME, which exploits distant supervision signals from large-scale text and uses temporal rules to infer end times.
arXiv Detail & Related papers (2020-10-24T03:12:27Z)
- Temporal Common Sense Acquisition with Minimal Supervision [77.8308414884754]
This work proposes a novel sequence modeling approach that exploits explicit and implicit mentions of temporal common sense.
Our method is shown to give quality predictions of various dimensions of temporal common sense.
It also produces representations of events for relevant tasks such as duration comparison, parent-child relations, event coreference and temporal QA.
arXiv Detail & Related papers (2020-05-08T22:20:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.