Neural Datalog Through Time: Informed Temporal Modeling via Logical
Specification
- URL: http://arxiv.org/abs/2006.16723v2
- Date: Mon, 17 Aug 2020 02:58:01 GMT
- Title: Neural Datalog Through Time: Informed Temporal Modeling via Logical
Specification
- Authors: Hongyuan Mei and Guanghui Qin and Minjie Xu and Jason Eisner
- Abstract summary: Training an unrestricted neural model might overfit to spurious patterns.
We propose using a temporal deductive database to track structured facts over time.
In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction.
- Score: 30.398163018363043
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning how to predict future events from patterns of past events is
difficult when the set of possible event types is large. Training an
unrestricted neural model might overfit to spurious patterns. To exploit
domain-specific knowledge of how past events might affect an event's present
probability, we propose using a temporal deductive database to track structured
facts over time. Rules serve to prove facts from other facts and from past
events. Each fact has a time-varying state---a vector computed by a neural net
whose topology is determined by the fact's provenance, including its experience
of past events. The possible event types at any time are given by special
facts, whose probabilities are neurally modeled alongside their states. In both
synthetic and real-world domains, we show that neural probabilistic models
derived from concise Datalog programs improve prediction by encoding
appropriate domain knowledge in their architecture.
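To make the mechanism concrete, here is a minimal, hypothetical Python sketch of the idea in the abstract: facts carry time-varying state vectors, a possible event type is proved from parent facts, its intensity is computed from those facts' states, and observing the event updates the states. All names (person, email), dimensions, and parameter shapes below are illustrative assumptions; the paper itself derives the architecture automatically from a Datalog program and uses richer, rule-specific neural updates.

    # Toy sketch (assumed, not the authors' code): facts with time-varying
    # embeddings, an event type proved from facts, intensity from embeddings.
    import numpy as np

    rng = np.random.default_rng(0)
    DIM = 8

    class Fact:
        """A tracked fact with a time-varying state vector."""
        def __init__(self, name):
            self.name = name
            self.state = np.zeros(DIM)

    # In the paper these facts would be proved by Datalog rules; here we just list them.
    facts = {n: Fact(n) for n in ["person(alice)", "person(bob)"]}

    # Random stand-ins for learned parameters whose shapes/topology would be
    # determined by the rule that proves the event type.
    W_event = rng.normal(size=(DIM, 2 * DIM)) * 0.1   # combines parent-fact states
    w_intensity = rng.normal(size=DIM) * 0.1          # reads out an intensity
    W_update = rng.normal(size=(DIM, DIM)) * 0.1      # state transition on an event
    b_update = rng.normal(size=DIM) * 0.1

    def event_intensity(x, y):
        """Intensity of the possible event email(x, y), derived from its parent facts."""
        parents = np.concatenate([facts[f"person({x})"].state,
                                  facts[f"person({y})"].state])
        h = np.tanh(W_event @ parents)
        return float(np.exp(w_intensity @ h))          # exp keeps the intensity positive

    def observe_event(x, y):
        """When email(x, y) actually happens, update the states of the facts it touches."""
        for n in (f"person({x})", f"person({y})"):
            f = facts[n]
            f.state = np.tanh(W_update @ f.state + b_update)

    print("lambda(email(alice, bob)) =", event_intensity("alice", "bob"))
    observe_event("alice", "bob")
    print("after observing the event:", event_intensity("alice", "bob"))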
Related papers
- Exploring the Limits of Historical Information for Temporal Knowledge
Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both historical and non-historical dependencies to distinguish the entities that best match a given query.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z) - Mitigating Temporal Misalignment by Discarding Outdated Facts [58.620269228776294]
Large language models are often used under temporal misalignment, tasked with answering questions about the present.
We propose fact duration prediction: the task of predicting how long a given fact will remain true.
Our data and code are released publicly at https://github.com/mikejqzhang/mitigating_misalignment.
arXiv Detail & Related papers (2023-05-24T07:30:08Z) - Who Should I Engage with At What Time? A Missing Event Aware Temporal
Graph Neural Network [4.770906657995415]
We propose MTGN, a missing event-aware temporal graph neural network.
We show that MTGN significantly outperforms existing methods, improving time and link prediction accuracy by up to 89% and 112%, respectively.
arXiv Detail & Related papers (2023-01-20T02:22:55Z) - TempSAL -- Uncovering Temporal Information for Deep Saliency Prediction [64.63645677568384]
We introduce a novel saliency prediction model that learns to output saliency maps in sequential time intervals.
Our approach locally modulates the saliency predictions by combining the learned temporal maps.
Our code will be publicly available on GitHub.
arXiv Detail & Related papers (2023-01-05T22:10:16Z) - Neural multi-event forecasting on spatio-temporal point processes using
probabilistically enriched transformers [18.66217537327045]
Predicting discrete events in time and space has many scientific applications, such as predicting hazardous earthquakes and outbreaks of infectious diseases.
We propose a new neural architecture for multi-event forecasting over spatio-temporal point processes, using a transformer augmented with normalizing flows and probabilistic layers.
Our network makes batched predictions of complex future discrete events, achieving state-of-the-art performance on a variety of benchmark datasets.
arXiv Detail & Related papers (2022-11-05T14:55:36Z) - Towards Out-of-Distribution Sequential Event Prediction: A Causal
Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, the next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z) - Variational Neural Temporal Point Process [22.396329275957996]
A temporal point process is a process that predicts which type of event is likely to happen and when the event will occur.
We introduce inference and generative networks, and train a distribution over a latent variable to capture the stochastic property of the deep neural network.
We empirically demonstrate that our model can generalize the representations of various event types.
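As background for the definition above (standard point-process notation, assumed here rather than taken from that paper's specific parameterization), a marked temporal point process is usually specified by per-type conditional intensities, from which the sequence log-likelihood follows:

    % Standard background, assumed notation: K event types, history H_t,
    % per-type conditional intensities lambda_k, observation window [0, T].
    \lambda_k^*(t)\,\mathrm{d}t
      = \Pr\bigl(\text{event of type } k \text{ in } [t, t+\mathrm{d}t) \mid \mathcal{H}_t\bigr),
    \qquad
    \log p\bigl(\{(t_i, k_i)\}\bigr)
      = \sum_i \log \lambda_{k_i}^*(t_i)
      - \int_0^T \sum_{k=1}^{K} \lambda_k^*(s)\,\mathrm{d}s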
arXiv Detail & Related papers (2022-02-17T13:34:30Z) - Bayesian Neural Hawkes Process for Event Uncertainty Prediction [0.2148535041822524]
Models for predicting time of occurrence play a significant role in a diverse set of applications like social networks, financial transactions, healthcare, and human mobility.
Recent works have introduced neural network-based point processes for modeling event times, and these were shown to provide state-of-the-art performance in predicting event times.
We propose a novel point process model, the Bayesian Neural Hawkes process, which leverages the uncertainty-modelling capability of Bayesian models and the generalization capability of neural networks.
arXiv Detail & Related papers (2021-12-29T09:47:22Z) - The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
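As a rough, generic illustration of that prediction-and-correction scheme (a standard predictive-coding-style local update, assumed for illustration and not taken from that paper's implementation):

    # Toy predictive-coding-style update (assumed illustration): an upper layer
    # predicts the activity of the layer below and both the latent activity and
    # the weights are adjusted in proportion to the local prediction error.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(size=4)             # observed activity of the lower layer
    z = rng.normal(size=3)             # latent activity of the upper layer
    W = rng.normal(size=(4, 3)) * 0.1  # generative weights: upper layer -> predicted lower activity

    lr = 0.05
    for _ in range(100):
        pred = W @ z                   # what the upper layer expects to see below
        err = x - pred                 # local prediction error
        z = z + lr * (W.T @ err)       # adjust latent activity to reduce the error
        W = W + lr * np.outer(err, z)  # Hebbian-like weight update driven by the error

    print("remaining prediction error:", np.linalg.norm(x - W @ z))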
arXiv Detail & Related papers (2020-12-07T01:20:38Z) - Neural Conditional Event Time Models [11.920908437656413]
Event time models predict occurrence times of an event of interest based on known features.
We develop a conditional event time model that distinguishes between a) the probability of event occurrence, and b) the predicted time of occurrence.
Results demonstrate superior event occurrence and event time predictions on synthetic data, medical events (MIMIC-III), and social media posts.
arXiv Detail & Related papers (2020-04-03T05:08:13Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.