Neural multi-event forecasting on spatio-temporal point processes using probabilistically enriched transformers
- URL: http://arxiv.org/abs/2211.02922v1
- Date: Sat, 5 Nov 2022 14:55:36 GMT
- Title: Neural multi-event forecasting on spatio-temporal point processes using probabilistically enriched transformers
- Authors: Negar Erfanian, Santiago Segarra, Maarten de Hoop
- Abstract summary: Predicting discrete events in time and space has many scientific applications, such as predicting hazardous earthquakes and outbreaks of infectious diseases.
In this work, we propose a new neural architecture for multi-event forecasting of spatio-temporal point processes, utilizing transformers augmented with normalizing flows and probabilistic layers.
Our network makes batched predictions of complex future discrete events, achieving state-of-the-art performance on a variety of benchmark datasets.
- Score: 18.66217537327045
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting discrete events in time and space has many scientific
applications, such as predicting hazardous earthquakes and outbreaks of
infectious diseases. History-dependent spatio-temporal Hawkes processes are
often used to mathematically model these point events. However, previous
approaches have faced numerous challenges, particularly when attempting to
forecast one or multiple future events. In this work, we propose a new neural
architecture for multi-event forecasting of spatio-temporal point processes,
utilizing transformers, augmented with normalizing flows and probabilistic
layers. Our network makes batched predictions of complex history-dependent
spatio-temporal distributions of future discrete events, achieving
state-of-the-art performance on a variety of benchmark datasets including the
South California Earthquakes, Citibike, Covid-19, and Hawkes synthetic pinwheel
datasets. More generally, we illustrate how our network can be applied to any
dataset of discrete events with associated markers, even when no underlying
physics is known.
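To make the kind of architecture described above concrete, here is a minimal sketch, assuming hypothetical module names and sizes: a standard transformer encoder summarizes the event history, and simple probabilistic heads (a log-normal inter-event time, plus a single affine transform of a Gaussian standing in for the richer normalizing flows of the paper) model the next event's time and 2-D location. This illustrates the general idea only; it is not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): a transformer encoder over an
# event history feeding simple probabilistic heads for the next event's
# inter-event time and 2-D location. The single affine transform stands in
# for richer normalizing flows; all names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.distributions as D


class EventHistoryForecaster(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Each event is (t, x, y): arrival time and 2-D location.
        self.embed = nn.Linear(3, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Probabilistic heads: log-normal inter-event time, affine-flow location.
        self.time_head = nn.Linear(d_model, 2)   # (mu, log_sigma) of log dt
        self.space_head = nn.Linear(d_model, 4)  # (shift_x, shift_y, log_scale_x, log_scale_y)

    def forward(self, history: torch.Tensor):
        """history: (batch, seq_len, 3) tensor of past events."""
        h = self.encoder(self.embed(history))[:, -1]  # summary of the history
        mu, log_sigma = self.time_head(h).chunk(2, dim=-1)
        dt_dist = D.LogNormal(mu.squeeze(-1), log_sigma.exp().squeeze(-1))

        shift, log_scale = self.space_head(h).chunk(2, dim=-1)
        base = D.Independent(
            D.Normal(torch.zeros_like(shift), torch.ones_like(shift)), 1
        )
        loc_dist = D.TransformedDistribution(
            base, [D.AffineTransform(loc=shift, scale=log_scale.exp())]
        )
        return dt_dist, loc_dist


# Toy usage: score the next observed event under the predicted distributions.
model = EventHistoryForecaster()
history = torch.rand(8, 20, 3)                 # batch of 8 histories, 20 events each
next_dt, next_xy = torch.rand(8) + 0.1, torch.rand(8, 2)
dt_dist, loc_dist = model(history)
nll = -(dt_dist.log_prob(next_dt) + loc_dist.log_prob(next_xy)).mean()
nll.backward()
```

In the paper's setting, the affine transform would be replaced by a deeper flow and the output would cover several future events at once, giving the batched multi-event forecasts described in the abstract.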
Related papers
- XTSFormer: Cross-Temporal-Scale Transformer for Irregular Time Event Prediction [9.240950990926796]
Event prediction aims to forecast the time and type of a future event based on a historical event sequence.
Despite its significance, several challenges exist, including the irregularity of time intervals between consecutive events, the existence of cycles, periodicity, and multi-scale event interactions.
arXiv Detail & Related papers (2024-02-03T20:33:39Z)
- Variational Neural Temporal Point Process [22.396329275957996]
A temporal point process is a stochastic process that predicts which type of event is likely to happen and when it will occur.
We introduce inference and generative networks, and train a distribution over a latent variable so that the deep neural network can capture the stochastic nature of event sequences.
We empirically demonstrate that our model can generalize the representations of various event types.
arXiv Detail & Related papers (2022-02-17T13:34:30Z)
- Bayesian Neural Hawkes Process for Event Uncertainty Prediction [0.2148535041822524]
Models for predicting time of occurrence play a significant role in a diverse set of applications like social networks, financial transactions, healthcare, and human mobility.
Recent works have introduced neural-network-based point processes for modeling event times and have shown state-of-the-art performance in predicting them.
We propose a novel point process model, the Bayesian Neural Hawkes process, which combines the uncertainty-modelling capability of Bayesian models with the generalization capability of neural networks.
arXiv Detail & Related papers (2021-12-29T09:47:22Z)
- Dynamic Hawkes Processes for Discovering Time-evolving Communities' States behind Diffusion Processes [57.22860407362061]
We propose a novel Hawkes process model that is able to capture the underlying dynamics of community states behind the diffusion processes.
The proposed method, termed DHP, offers a flexible way to learn complex representations of the time-evolving communities' states.
arXiv Detail & Related papers (2021-05-24T08:35:48Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification [30.398163018363043]
An unrestricted neural model might overfit to spurious patterns.
We propose using a temporal deductive database to track structured facts over time.
In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction.
arXiv Detail & Related papers (2020-06-30T12:26:04Z)
- Graph Hawkes Neural Network for Forecasting on Temporal Knowledge Graphs [38.56057203198837]
The Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.
We propose the Graph Hawkes Neural Network that can capture the dynamics of evolving graph sequences and can predict the occurrence of a fact in a future time instance.
arXiv Detail & Related papers (2020-03-30T12:56:50Z)
- A Spatial-Temporal Attentive Network with Spatial Continuity for Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC).
First, a spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, a joint feature sequence is built from the sequence and instant state information so that the generated trajectories preserve spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
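As a rough illustration of the self-attention idea shared by the transformer-based Hawkes models above, the following sketch (under assumed names, sizes, and intensity parameterization; it is not the THP reference implementation) attends causally over a typed event history and maps the resulting summary, together with the elapsed time, through a softplus so the per-type conditional intensity stays positive.

```python
# Stripped-down sketch of the self-attention + softplus-intensity idea behind
# transformer-style Hawkes models (not the THP reference implementation; all
# names, sizes, and the exact intensity parameterization are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentionIntensity(nn.Module):
    def __init__(self, n_types: int = 3, d_model: int = 32, n_heads: int = 4):
        super().__init__()
        self.type_emb = nn.Embedding(n_types, d_model)
        self.time_proj = nn.Linear(1, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.intensity = nn.Linear(d_model + 1, n_types)  # history summary + elapsed time

    def forward(self, times: torch.Tensor, types: torch.Tensor, query_t: torch.Tensor):
        """times, types: (batch, seq); query_t: (batch,) future time points."""
        x = self.type_emb(types) + self.time_proj(times.unsqueeze(-1))
        # Causal mask so each event only attends to its own history.
        seq = times.size(1)
        mask = torch.triu(torch.ones(seq, seq, dtype=torch.bool), diagonal=1)
        h, _ = self.attn(x, x, x, attn_mask=mask)
        h_last = h[:, -1]                                 # summary after the last event
        elapsed = (query_t - times[:, -1]).unsqueeze(-1)  # time since last event
        # Softplus keeps the per-type conditional intensity positive.
        return F.softplus(self.intensity(torch.cat([h_last, elapsed], dim=-1)))


# Toy usage: query the per-type intensity shortly after each sequence's last event.
model = SelfAttentionIntensity()
times = torch.sort(torch.rand(4, 10), dim=1).values      # 4 sequences, 10 events each
types = torch.randint(0, 3, (4, 10))
lam = model(times, types, query_t=times[:, -1] + 0.5)    # (4, 3) intensities
```

Training such a model would then minimize the negative log-likelihood of the observed sequences, with the integral of the intensity typically approximated by Monte Carlo sampling between consecutive events.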