Fusing Temporal Graphs into Transformers for Time-Sensitive Question
Answering
- URL: http://arxiv.org/abs/2310.19292v1
- Date: Mon, 30 Oct 2023 06:12:50 GMT
- Title: Fusing Temporal Graphs into Transformers for Time-Sensitive Question
Answering
- Authors: Xin Su, Phillip Howard, Nagib Hakim, Steven Bethard
- Abstract summary: Answering time-sensitive questions from long documents requires temporal reasoning over the times in questions and documents.
We apply existing temporal information extraction systems to construct temporal graphs of events, times, and temporal relations in questions and documents.
Experimental results show that our proposed approach for fusing temporal graphs into input text substantially enhances the temporal reasoning capabilities of Transformer models.
- Score: 11.810810214824183
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Answering time-sensitive questions from long documents requires temporal
reasoning over the times in questions and documents. An important open question
is whether large language models can perform such reasoning solely using a
provided text document, or whether they can benefit from additional temporal
information extracted using other systems. We address this research question by
applying existing temporal information extraction systems to construct temporal
graphs of events, times, and temporal relations in questions and documents. We
then investigate different approaches for fusing these graphs into Transformer
models. Experimental results show that our proposed approach for fusing
temporal graphs into input text substantially enhances the temporal reasoning
capabilities of Transformer models with or without fine-tuning. Additionally,
our proposed method outperforms various graph convolution-based approaches and
establishes a new state-of-the-art performance on SituatedQA and three splits
of TimeQA.
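As a concrete illustration of the text-fusion idea, below is a minimal sketch (not the authors' released code): temporal triples produced by an off-the-shelf temporal information extraction system are serialized into plain text and prepended to the question and document before tokenization, so a Transformer QA model can consume them without architectural changes. The toy triples, the serialization format, and the choice of the t5-base tokenizer are illustrative assumptions.

```python
# Illustrative sketch of fusing a temporal graph into a Transformer's input text.
# The triples would normally come from an existing temporal IE pipeline run over
# the question and document; everything below is a toy example, not the paper's code.
from transformers import AutoTokenizer

def serialize_temporal_graph(triples):
    """Serialize (head, relation, tail) temporal triples into a plain-text prefix."""
    return " ".join(f"{head} {rel} {tail}." for head, rel, tail in triples)

# Hypothetical temporal graph extracted from the document below.
graph = [
    ("join Google", "AFTER", "2001"),
    ("join Google", "BEFORE", "leave Google"),
    ("leave Google", "INCLUDED-IN", "2010"),
]

question = "Which company did he work for in 2005?"
document = "He joined Google after 2001 and left the company in 2010."

# Fuse the graph into the input text: serialized relations + question + context.
fused_input = (
    f"relations: {serialize_temporal_graph(graph)} "
    f"question: {question} context: {document}"
)

tokenizer = AutoTokenizer.from_pretrained("t5-base")  # any QA-capable Transformer works
encoded = tokenizer(fused_input, return_tensors="pt", truncation=True, max_length=512)
print(fused_input)
print(encoded["input_ids"].shape)
```

By contrast, graph convolution-based fusion typically injects the same graph through message-passing layers, which requires changes to the model architecture rather than to the input text.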
Related papers
- MTGER: Multi-view Temporal Graph Enhanced Temporal Reasoning over
Time-Involved Document [26.26604509399347]
MTGER is a novel framework for temporal reasoning over time-involved documents.
It explicitly models the temporal relationships among facts via multi-view temporal graphs.
We show that MTGER gives more consistent answers under question perturbations.
arXiv Detail & Related papers (2023-11-08T16:41:37Z)
- Time-aware Multiway Adaptive Fusion Network for Temporal Knowledge Graph
Question Answering [10.170042914522778]
We propose a novel Time-aware Multiway Adaptive (TMA) fusion network.
For each given question, TMA first extracts the relevant concepts from the KG, and then feeds them into a multiway adaptive module.
The resulting representation can be combined with the pre-trained KG embedding to generate the final prediction.
arXiv Detail & Related papers (2023-02-24T09:29:40Z)
- Generic Temporal Reasoning with Differential Analysis and Explanation [61.96034987217583]
We introduce a novel task named TODAY that bridges the gap with temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
arXiv Detail & Related papers (2022-12-20T17:40:03Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Temporal Knowledge Graph Reasoning with Low-rank and Model-agnostic
Representations [1.8262547855491458]
We introduce Time-LowFER, a family of parameter-efficient and time-aware extensions of the low-rank tensor factorization model LowFER.
Noting several limitations in current approaches to represent time, we propose a cycle-aware time-encoding scheme for time features.
We implement our methods in a unified temporal knowledge graph embedding framework, focusing on time-sensitive data processing.
arXiv Detail & Related papers (2022-04-10T22:24:11Z)
- HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text
Extractive Summarization [57.798070356553936]
HETFORMER is a Transformer-based pre-trained model with multi-granularity sparse attentions for extractive summarization.
Experiments on both single- and multi-document summarization tasks show that HETFORMER achieves state-of-the-art performance in ROUGE F1.
arXiv Detail & Related papers (2021-10-12T22:42:31Z)
- A Dataset for Answering Time-Sensitive Questions [88.95075983560331]
Time is an important dimension in our physical world, and many facts evolve over time.
It is therefore important to account for the time dimension and empower existing QA models to reason over time.
Existing QA datasets contain few time-sensitive questions and are thus not suitable for diagnosing or benchmarking a model's temporal reasoning capability.
arXiv Detail & Related papers (2021-08-13T16:42:25Z)
- Interpretable Feature Construction for Time Series Extrinsic Regression [0.028675177318965035]
In some application domains, the target variable is numerical, and the problem is known as time series extrinsic regression (TSER).
We suggest an extension of a Bayesian method for robust and interpretable feature construction and selection in the context of TSER.
Our approach takes a relational view of TSER: (i) we build various simple representations of the time series and store them in a relational data scheme, then (ii) a propositionalisation technique is applied to the secondary tables to build interpretable features and "flatten" the data.
arXiv Detail & Related papers (2021-03-15T08:12:19Z)
- Software Engineering Event Modeling using Relative Time in Temporal
Knowledge Graphs [15.22542676866305]
We present a multi-relational temporal knowledge graph based on the daily interactions between artifacts in GitHub.
We introduce two new datasets for (i) interpolated time-conditioned link prediction and (ii) extrapolated time-conditioned link/time prediction queries.
Our experiments on these datasets highlight the potential of adapting knowledge graphs to answer broad software engineering questions.
arXiv Detail & Related papers (2020-07-02T16:28:43Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
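For the Transformer Hawkes Process entry above, the following is a rough sketch (not the paper's implementation) of the general idea: a self-attention encoder summarizes the embedded event history, and the conditional intensity between events is a softplus of a time-dependent affine function of that summary. The layer sizes, the decay term, and the toy inputs are illustrative assumptions.

```python
# Toy Transformer-Hawkes-style conditional intensity; illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyTHPIntensity(nn.Module):
    def __init__(self, d_model=32, n_heads=4):
        super().__init__()
        # Self-attention encoder that summarizes the embedded event history.
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.alpha = nn.Parameter(torch.tensor(0.1))  # controls drift between events
        self.w = nn.Linear(d_model, 1)

    def forward(self, history_emb, t_last, t):
        """history_emb: (batch, seq, d_model); t_last: time of last event; t > t_last."""
        h_last = self.encoder(history_emb)[:, -1]  # summary of the history at the last event
        # Intensity: softplus(alpha * (t - t_last) / t_last + w^T h + b)
        return F.softplus(self.alpha * (t - t_last) / t_last + self.w(h_last).squeeze(-1))

model = ToyTHPIntensity()
history = torch.randn(1, 5, 32)           # 5 past events with toy embeddings
print(model(history, t_last=2.0, t=2.5))  # intensity at a later time point
```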