XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event
Prediction
- URL: http://arxiv.org/abs/2402.02258v1
- Date: Sat, 3 Feb 2024 20:33:39 GMT
- Title: XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event
Prediction
- Authors: Tingsong Xiao, Zelin Xu, Wenchong He, Jim Su, Yupu Zhang, Raymond
Opoku, Ronald Ison, Jason Petho, Jiang Bian, Patrick Tighe, Parisa Rashidi,
Zhe Jiang
- Abstract summary: Event prediction aims to forecast the time and type of a future event based on a historical event sequence.
Despite its significance, several challenges exist, including the irregularity of time intervals between consecutive events, the existence of cycles, periodicity, and multi-scale event interactions.
- Score: 9.240950990926796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event prediction aims to forecast the time and type of a future event based
on a historical event sequence. Despite its significance, several challenges
exist, including the irregularity of time intervals between consecutive events,
the existence of cycles, periodicity, and multi-scale event interactions, as
well as the high computational costs for long event sequences. Existing neural
temporal point process (TPP) methods do not capture the multi-scale nature
of event interactions, which is common in many real-world applications such as
clinical event data. To address these issues, we propose the
cross-temporal-scale transformer (XTSFormer), designed specifically for
irregularly timed event data. Our model comprises two vital components: a novel
Feature-based Cycle-aware Time Positional Encoding (FCPE) that adeptly captures
the cyclical nature of time, and a hierarchical multi-scale temporal attention
mechanism, whose scales are determined by a bottom-up clustering algorithm.
Extensive experiments on several real-world datasets show that our XTSFormer
outperforms several baseline methods in prediction performance.
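
The abstract does not give FCPE's exact formulation, so the following is only a minimal sketch of the general idea, assuming a sinusoidal time encoding whose frequencies and phases are learned (hence cycle-aware) and applied to raw irregular timestamps rather than integer positions. The class name CycleAwareTimeEncoding and this parameterization are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn

class CycleAwareTimeEncoding(nn.Module):
    """Hypothetical sketch of a cycle-aware time positional encoding.

    Unlike the fixed sinusoidal positional encoding of the vanilla
    Transformer, the frequencies and phases here are learnable, so the
    model can align them with cycles in the event data (e.g., daily or
    weekly patterns), and the input is a raw timestamp rather than an
    integer sequence position.
    """

    def __init__(self, d_model: int):
        super().__init__()
        assert d_model % 2 == 0
        # One learnable frequency and phase per sin/cos pair.
        self.freq = nn.Parameter(torch.rand(d_model // 2))
        self.phase = nn.Parameter(torch.zeros(d_model // 2))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) event timestamps with irregular intervals.
        angles = t.unsqueeze(-1) * self.freq + self.phase  # (B, L, d/2)
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

enc = CycleAwareTimeEncoding(d_model=64)
timestamps = torch.tensor([[0.0, 1.3, 2.9, 7.4]])  # one irregular sequence
print(enc(timestamps).shape)  # torch.Size([1, 4, 64])
```

The hierarchical attention scales could then, for instance, come from agglomerative (bottom-up) clustering of the timestamps; the abstract does not specify the clustering algorithm, so that part is omitted here.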
Related papers
- Scalable Event-by-event Processing of Neuromorphic Sensory Signals With Deep State-Space Models [2.551844666707809]
Event-based sensors are well suited for real-time processing.
Current methods either collapse events into frames or cannot scale when processing the event data directly, event by event.
arXiv Detail & Related papers (2024-04-29T08:50:27Z) - Deep Representation Learning for Prediction of Temporal Event Sets in
the Continuous Time Domain [9.71405768795797]
Temporal Point Processes play an important role in predicting or forecasting events.
We propose a scalable and efficient approach based on TPPs to solve this problem.
arXiv Detail & Related papers (2023-09-29T06:46:31Z) - Neural multi-event forecasting on spatio-temporal point processes using
probabilistically enriched transformers [18.66217537327045]
Predicting discrete events in time and space has many scientific applications, such as predicting hazardous earthquakes and outbreaks of infectious diseases.
We propose a new neural architecture for multi-event forecasting over spatio-temporal point processes, utilizing transformers augmented with normalizing flows and probabilistic layers.
Our network makes batched predictions of complex future discrete events, achieving state-of-the-art performance on a variety of benchmark datasets.
arXiv Detail & Related papers (2022-11-05T14:55:36Z) - HyperHawkes: Hypernetwork based Neural Temporal Point Process [5.607676459156789]
Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
However, existing models are not generalizable to predicting events from unseen sequences in dynamic environments.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
arXiv Detail & Related papers (2022-10-01T07:14:19Z) - Modeling Continuous Time Sequences with Intermittent Observations using
Marked Temporal Point Processes [25.074394338483575]
A large fraction of data generated via human activities can be represented as a sequence of events over continuous time.
Learning deep models over these continuous-time event sequences is a non-trivial task.
In this work, we provide a novel unsupervised model and inference method for learning marked temporal point processes (MTPPs) in the presence of event sequences with missing events.
arXiv Detail & Related papers (2022-06-23T18:23:20Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Learning Temporal Rules from Noisy Timeseries Data [72.93572292157593]
We focus on uncovering the underlying atomic events and their relations that lead to the composite events within a noisy temporal data setting.
We propose Neural Temporal Logic Programming (Neural TLP), which first learns implicit temporal relations between atomic events and then lifts logic rules for supervision.
arXiv Detail & Related papers (2022-02-11T01:29:02Z) - Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - Multi-Scale One-Class Recurrent Neural Networks for Discrete Event
Sequence Anomaly Detection [63.825781848587376]
We propose OC4Seq, a one-class recurrent neural network for detecting anomalies in discrete event sequences.
Specifically, OC4Seq embeds the discrete event sequences into latent spaces, where anomalies can be easily detected.
arXiv Detail & Related papers (2020-08-31T04:48:22Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
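
To make the intensity side of such models concrete, here is a minimal sketch of a THP-style conditional intensity for a single event type, using the softplus parameterization commonly associated with THP; the function name, variable names, and toy dimensions are assumptions for illustration, not the paper's exact code.

```python
import torch
import torch.nn.functional as F

def thp_style_intensity(h_j: torch.Tensor, w: torch.Tensor, b: float,
                        alpha: float, t: float, t_j: float) -> torch.Tensor:
    """Sketch of a THP-style conditional intensity for one event type.

    h_j is the self-attention hidden state at the last event time t_j.
    The softplus link keeps the intensity positive, and the term
    alpha * (t - t_j) / t_j lets it evolve continuously between events.
    """
    return F.softplus(alpha * (t - t_j) / t_j + h_j @ w + b)

# Toy usage: hidden state of the last event and a query time t > t_j.
h_j = torch.randn(32)   # hidden state from the self-attention encoder
w = torch.randn(32)     # per-type weight vector
lam = thp_style_intensity(h_j, w, b=0.1, alpha=0.5, t=3.7, t_j=2.9)
print(lam)  # a positive scalar intensity
```

Keeping the intensity positive and continuous between events is what allows likelihood-based training over irregular timestamps in this family of models.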
This list is automatically generated from the titles and abstracts of the papers on this site.