Generic Temporal Reasoning with Differential Analysis and Explanation
- URL: http://arxiv.org/abs/2212.10467v2
- Date: Wed, 31 May 2023 17:54:08 GMT
- Title: Generic Temporal Reasoning with Differential Analysis and Explanation
- Authors: Yu Feng, Ben Zhou, Haoyu Wang, Helen Jin, Dan Roth
- Abstract summary: We introduce a novel task named TODAY that probes the generalizability of temporal reasoning systems through temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
- Score: 61.96034987217583
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal reasoning is the task of predicting temporal relations of event
pairs. While temporal reasoning models can perform reasonably well on in-domain
benchmarks, we know little about these systems' generalizability because of the
limitations of existing datasets. In this work, we introduce a novel task named
TODAY that bridges this gap with temporal differential analysis, which, as the
name suggests, evaluates whether systems can correctly understand the effect of
incremental changes. Specifically, TODAY introduces slight contextual changes
for given event pairs, and systems are asked to tell how this subtle contextual
change would affect relevant temporal relation distributions. To facilitate
learning, TODAY also annotates human explanations. We show that existing
models, including GPT-3.5, drop to random guessing on TODAY, suggesting that
they heavily rely on spurious information rather than proper reasoning for
temporal predictions. On the other hand, we show that TODAY's supervision style
and explanation annotations can be used in joint learning, encouraging models
to use more appropriate signals during training and thus to outperform existing
models across several benchmarks. TODAY can also be used to train models to
solicit incidental supervision from noisy sources such as GPT-3.5, moving us
closer to the goal of generic temporal reasoning systems.
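To make the task concrete, the sketch below shows what a TODAY-style differential-analysis check could look like. The field names, the example instance, and the predict_relation_distribution call are hypothetical placeholders, not the paper's actual data format or API.

```python
from typing import Dict

# One hypothetical TODAY-style instance: an event pair in context, a slight
# contextual change, and the direction that change should push the temporal
# relation distribution.
instance = {
    "context": "Tom bought fresh vegetables. Tom cooked dinner.",
    "event_1": "Tom bought fresh vegetables",
    "event_2": "Tom cooked dinner",
    "additional_sentence": "Tom wanted to cook the vegetables while they were fresh.",
    "shift_direction": "before",  # the change should make "before" more likely
}

def predict_relation_distribution(context: str, e1: str, e2: str) -> Dict[str, float]:
    """Placeholder for a temporal relation model: returns a probability
    distribution over {"before", "after"} for e1 relative to e2."""
    raise NotImplementedError

def differential_correct(inst: dict) -> bool:
    """A system passes an instance if adding the extra sentence moves
    probability mass toward the annotated direction."""
    base = predict_relation_distribution(inst["context"], inst["event_1"], inst["event_2"])
    changed = inst["additional_sentence"] + " " + inst["context"]
    shifted = predict_relation_distribution(changed, inst["event_1"], inst["event_2"])
    return shifted[inst["shift_direction"]] > base[inst["shift_direction"]]
```

Under this framing, a model that relies on spurious lexical cues has no reason to shift its distribution, which is consistent with the random-guessing results reported above.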
Related papers
- XForecast: Evaluating Natural Language Explanations for Time Series Forecasting [72.57427992446698]
Time series forecasting aids decision-making, especially for stakeholders who rely on accurate predictions.
Traditional explainable AI (XAI) methods, which underline feature or temporal importance, often require expert knowledge.
Evaluating forecast natural language explanations (NLEs) is difficult due to the complex causal relationships in time series data.
arXiv Detail & Related papers (2024-10-18T05:16:39Z)
- Prompt Learning on Temporal Interaction Graphs [25.28535762085367]
Temporal Interaction Graphs (TIGs) are widely utilized to represent real-world systems.
TIG models face a significant gap between pre-training and downstream prediction in their "pre-train, predict" training paradigm.
We introduce Temporal Interaction Graph Prompting (TIGPrompt), a versatile framework that seamlessly integrates with TIG models.
arXiv Detail & Related papers (2024-02-09T11:06:20Z)
- TEILP: Time Prediction over Knowledge Graphs via Logical Reasoning [14.480267340831542]
We propose TEILP, a logical reasoning framework that naturally integrates temporal elements into knowledge graph predictions.
We first convert TKGs into a temporal event knowledge graph (TEKG), which represents time more explicitly as nodes of the graph.
Finally, we introduce conditional probability density functions, associated with the logical rules involving the query interval, from which we derive the time prediction.
arXiv Detail & Related papers (2023-12-25T21:54:56Z)
- Learning Multi-graph Structure for Temporal Knowledge Graph Reasoning [3.3571415078869955]
This paper proposes an innovative reasoning approach that focuses on Learning Multi-graph Structure (LMS).
LMS incorporates an adaptive gate for merging entity representations both along and across timestamps effectively.
It also integrates timestamp semantics into graph attention calculations and time-aware decoders.
arXiv Detail & Related papers (2023-12-04T08:23:09Z)
- Temporal Smoothness Regularisers for Neural Link Predictors [8.975480841443272]
We show that a simple method like TNTComplEx can produce significantly more accurate results than state-of-the-art methods.
We also evaluate the impact of a wide range of temporal smoothing regularisers on two state-of-the-art temporal link prediction models.
arXiv Detail & Related papers (2023-09-16T16:52:49Z)
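As a concrete illustration of the regulariser family evaluated in the entry above, here is a minimal sketch of a generic temporal smoothness penalty over a learned timestamp embedding matrix; the shapes, the weighting, and the training-loop usage are assumptions, not the paper's exact formulation.

```python
import torch

def temporal_smoothness_penalty(T: torch.Tensor, p: float = 2.0) -> torch.Tensor:
    """Generic temporal smoothness regulariser: penalise the p-norm of the
    differences between embeddings of consecutive timestamps, encouraging
    nearby timestamps to have similar representations.

    T: (num_timestamps, dim) learned timestamp embedding matrix.
    """
    diffs = T[1:] - T[:-1]  # (num_timestamps - 1, dim)
    return diffs.abs().pow(p).sum() / max(T.shape[0] - 1, 1)

# Added to the link prediction objective with a tunable weight, e.g.:
# loss = task_loss + lam * temporal_smoothness_penalty(timestamp_embeddings)
```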
- Self-Interpretable Time Series Prediction with Counterfactual Explanations [4.658166900129066]
Interpretable time series prediction is crucial for safety-critical areas such as healthcare and autonomous driving.
Most existing methods focus on interpreting predictions by assigning importance scores to segments of the time series.
We develop a self-interpretable model, dubbed Counterfactual Time Series (CounTS), which generates counterfactual and actionable explanations for time series predictions.
arXiv Detail & Related papers (2023-06-09T16:42:52Z)
- Instructed Diffuser with Temporal Condition Guidance for Offline Reinforcement Learning [71.24316734338501]
We propose an effective temporally-conditional diffusion model coined the Temporally-Composable Diffuser (TCD).
TCD extracts temporal information from interaction sequences and explicitly guides generation with temporal conditions.
Our method matches the best performance of prior SOTA baselines.
arXiv Detail & Related papers (2023-06-08T02:12:26Z)
- Avoiding Inference Heuristics in Few-shot Prompt-based Finetuning [57.4036085386653]
We show that prompt-based models for sentence pair classification tasks still fall into the common pitfall of adopting inference heuristics based on lexical overlap.
We then show that adding a regularization that preserves pretraining weights is effective in mitigating this destructive tendency of few-shot finetuning.
arXiv Detail & Related papers (2021-09-09T10:10:29Z)
- Interpretable Time-series Representation Learning With Multi-Level Disentanglement [56.38489708031278]
Disentangle Time Series (DTS) is a novel disentanglement enhancement framework for sequential data.
DTS generates hierarchical semantic concepts as the interpretable and disentangled representation of time-series.
DTS achieves superior performance in downstream applications, with high interpretability of semantic concepts.
arXiv Detail & Related papers (2021-05-17T22:02:24Z)
- Temporal Reasoning on Implicit Events from Distant Supervision [91.20159064951487]
We propose a novel temporal reasoning dataset that evaluates the degree to which systems understand implicit events.
We find that state-of-the-art models struggle when predicting temporal relationships between implicit and explicit events.
We propose a neuro-symbolic temporal reasoning model, SYMTIME, which exploits distant supervision signals from large-scale text and uses temporal rules to infer end times.
arXiv Detail & Related papers (2020-10-24T03:12:27Z)
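As a small illustration of the temporal rules the SYMTIME entry above refers to, the sketch below composes a predicted start time and duration into an inferred end time and compares two events; the numeric timeline encoding and function names are hypothetical.

```python
def infer_end_time(start: float, duration: float) -> float:
    """Temporal rule: end(e) = start(e) + duration(e)."""
    return start + duration

def ends_before_starts(e1_start: float, e1_duration: float, e2_start: float) -> bool:
    """Does event 1 end before event 2 starts? Inputs are assumed to lie on
    a shared numeric timeline (e.g. hours)."""
    return infer_end_time(e1_start, e1_duration) < e2_start

# An event starting at t=1 that lasts 2 hours ends before one starting at t=4:
assert ends_before_starts(1.0, 2.0, 4.0)
```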
This list is automatically generated from the titles and abstracts of the papers on this site.