A Multi-Channel Neural Graphical Event Model with Negative Evidence
- URL: http://arxiv.org/abs/2002.09575v1
- Date: Fri, 21 Feb 2020 23:10:50 GMT
- Title: A Multi-Channel Neural Graphical Event Model with Negative Evidence
- Authors: Tian Gao, Dharmashankar Subramanian, Karthikeyan Shanmugam, Debarun
Bhattacharjya, Nicholas Mattei
- Abstract summary: Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
- Score: 76.51278722190607
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event datasets are sequences of events of various types occurring irregularly
over the time-line, and they are increasingly prevalent in numerous domains.
Existing work for modeling events using conditional intensities relies either on
some underlying parametric form to capture historical dependencies, or on
non-parametric models that focus primarily on tasks such as prediction. We
propose a non-parametric deep neural network approach in order to estimate the
underlying intensity functions. We use a novel multi-channel RNN that optimally
reinforces the negative evidence of no observable events with the introduction
of fake event epochs within each consecutive inter-event interval. We evaluate
our method against state-of-the-art baselines on model fitting tasks as gauged
by log-likelihood. Through experiments on both synthetic and real-world
datasets, we find that our proposed approach outperforms existing baselines on
most of the datasets studied.
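The core idea in the abstract is that the point-process log-likelihood has a "negative evidence" term, the integral of the intensity over intervals where nothing happened, which can be estimated by drawing fake event epochs inside each inter-event interval. The sketch below illustrates that likelihood structure numerically; the exponential-decay intensity is a hypothetical stand-in (the paper instead learns the intensity non-parametrically with a multi-channel RNN), and all names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.0):
    # Hypothetical stand-in intensity with an exponentially decaying
    # influence of past events; NOT the paper's learned RNN intensity.
    past = event_times[event_times < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

def log_likelihood(event_times, horizon, n_fake=20):
    # Positive evidence: log-intensity at each observed event epoch.
    ll = sum(np.log(intensity(t, event_times)) for t in event_times)
    # Negative evidence: Monte Carlo estimate of the compensator
    # integral, using "fake" epochs drawn uniformly inside each
    # consecutive inter-event interval (and the final censored one).
    bounds = np.concatenate(([0.0], event_times, [horizon]))
    for a, b in zip(bounds[:-1], bounds[1:]):
        fakes = rng.uniform(a, b, size=n_fake)
        ll -= (b - a) * np.mean([intensity(t, event_times) for t in fakes])
    return ll

events = np.array([0.5, 1.3, 2.0, 3.7])
print(log_likelihood(events, horizon=5.0))
```

In the paper this estimate is refined by feeding the fake epochs through dedicated RNN channels so that intervals with no events directly shape the learned intensity, rather than only contributing through a fixed quadrature rule.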
Related papers
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Enhancing Asynchronous Time Series Forecasting with Contrastive
Relational Inference [21.51753838306655]
Temporal point processes (TPPs) are the standard method for modeling such event sequences.
Existing TPP models have focused on the conditional distribution of future events instead of explicitly modeling event interactions, imposing challenges for event predictions.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph that infers interactions while simultaneously learning dynamics patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z) - Deep graph kernel point processes [19.382241594513374]
This paper presents a novel point process model for discrete event data over graphs, where the event interaction occurs within a latent graph structure.
The key idea is to represent the influence kernel by Graph Neural Networks (GNN) to capture the underlying graph structure.
Compared with prior works that directly model the conditional intensity function using neural networks, our kernel representation captures repeated event-influence patterns more effectively.
arXiv Detail & Related papers (2023-06-20T06:15:19Z) - Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z) - Towards Out-of-Distribution Sequential Event Prediction: A Causal
Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, the next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Event Data Association via Robust Model Fitting for Event-based Object Tracking [66.05728523166755]
We propose a novel Event Data Association (called EDA) approach to explicitly address the event association and fusion problem.
The proposed EDA seeks event trajectories that best fit the event data, in order to perform unified data association and information fusion.
The experimental results show the effectiveness of EDA under challenging scenarios, such as high speed, motion blur, and high dynamic range conditions.
arXiv Detail & Related papers (2021-10-25T13:56:00Z) - User-Dependent Neural Sequence Models for Continuous-Time Event Data [27.45413274751265]
Continuous-time event data are common in applications such as individual behavior data, financial transactions, and medical health records.
Recurrent neural networks that parameterize time-varying intensity functions are the current state-of-the-art for predictive modeling with such data.
In this paper, we extend the broad class of neural marked point process models to mixtures of latent embeddings.
arXiv Detail & Related papers (2020-11-06T08:32:57Z) - SMART: Simultaneous Multi-Agent Recurrent Trajectory Prediction [72.37440317774556]
We propose advances that address two key challenges in future trajectory prediction:
multimodality in both training data and predictions, and constant-time inference regardless of the number of agents.
arXiv Detail & Related papers (2020-07-26T08:17:10Z) - Context-dependent self-exciting point processes: models, methods, and
risk bounds in high dimensions [21.760636228118607]
High-dimensional autoregressive point processes model how current events trigger or inhibit future events; for example, activity by one member of a social network can affect the future activity of their neighbors.
We leverage ideas from compositional time series and regularization methods in machine learning to conduct network estimation for high-dimensional marked point processes.
arXiv Detail & Related papers (2020-03-16T20:22:43Z)
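The "trigger or inhibit" structure described in the last entry can be made concrete with a small multivariate self-exciting intensity, where a signed influence matrix lets one node's events raise or lower the intensities of others. Everything below (the 3-node matrix, the decay rate, the clipping nonlinearity) is an illustrative assumption, not the estimator used in the paper.

```python
import numpy as np

# Hypothetical 3-node influence matrix: positive entries excite,
# negative entries inhibit (values are illustrative only).
A = np.array([[0.5, -0.2,  0.0],
              [0.3,  0.4, -0.1],
              [0.0,  0.2,  0.3]])
mu = np.array([0.1, 0.1, 0.1])  # baseline rates per node
beta = 1.0                      # exponential decay of influence

def multivariate_intensity(t, history):
    """Intensity of each node at time t given a (time, node) history.
    A nonlinearity (here: clipping at zero) keeps intensities valid
    when inhibition would otherwise drive them negative."""
    lam = mu.copy()
    for s, j in history:
        if s < t:
            lam += A[:, j] * np.exp(-beta * (t - s))
    return np.maximum(lam, 0.0)

history = [(0.2, 0), (0.9, 1), (1.5, 2)]
print(multivariate_intensity(2.0, history))
```

Network estimation in the cited work amounts to recovering a (regularized, context-dependent) analogue of the matrix `A` from observed event sequences.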
This list is automatically generated from the titles and abstracts of the papers in this site.