Counterfactual Temporal Point Processes
- URL: http://arxiv.org/abs/2111.07603v1
- Date: Mon, 15 Nov 2021 08:46:25 GMT
- Title: Counterfactual Temporal Point Processes
- Authors: Kimia Noorbakhsh and Manuel Gomez Rodriguez
- Abstract summary: We develop a causal model of thinning for temporal point processes that builds upon the Gumbel-Max structural causal model.
We then simulate counterfactual realizations of the temporal point process under a given alternative intensity function.
- Score: 18.37409880250174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning models based on temporal point processes are the state of
the art in a wide variety of applications involving discrete events in
continuous time. However, these models lack the ability to answer
counterfactual questions, which are increasingly relevant as these models are
being used to inform targeted interventions. In this work, our goal is to fill
this gap. To this end, we first develop a causal model of thinning for temporal
point processes that builds upon the Gumbel-Max structural causal model. This
model satisfies a desirable counterfactual monotonicity condition, which is
sufficient to identify counterfactual dynamics in the process of thinning.
Then, given an observed realization of a temporal point process with a given
intensity function, we develop a sampling algorithm that uses the above causal
model of thinning and the superposition theorem to simulate counterfactual
realizations of the temporal point process under a given alternative intensity
function. Simulation experiments using synthetic and real epidemiological data
show that the counterfactual realizations provided by our algorithm may give
valuable insights to enhance targeted interventions.
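The abstract outlines a concrete sampling pipeline, so a minimal, unofficial sketch may help; it assumes an inhomogeneous Poisson intensity with a known upper bound `lam_max` (history-dependent intensities require the recursive treatment developed in the paper), and every function and parameter name below is illustrative rather than the authors' implementation. Each accept/reject decision of Lewis-style thinning is modeled with a Gumbel-Max structural causal model: the rejected candidates are recovered via the superposition theorem, the Gumbel noise behind each factual decision is sampled from its posterior, and the decisions are replayed under the alternative intensity.

```python
# Unofficial sketch of counterfactual thinning with a Gumbel-Max SCM, assuming
# an inhomogeneous Poisson intensity bounded by lam_max on the window [0, T].
import numpy as np

rng = np.random.default_rng(0)

def posterior_gumbels(logp, observed):
    """Sample Gumbel noise g such that argmax(logp + g) equals the observed category."""
    z = np.log(np.sum(np.exp(logp)))                   # log-normalizer of logp
    m = z - np.log(-np.log(rng.uniform()))             # max of the shifted Gumbels ~ Gumbel(z)
    shifted = np.empty_like(logp)
    shifted[observed] = m
    for k in range(len(logp)):
        if k == observed:
            continue
        g = logp[k] - np.log(-np.log(rng.uniform()))   # Gumbel(logp[k])
        shifted[k] = -np.log(np.exp(-m) + np.exp(-g))  # truncated at the max m
    return shifted - logp

def counterfactual_decision(p_fact, p_cf, accepted):
    """Replay one accept/reject thinning decision under the alternative probability p_cf."""
    p_fact, p_cf = np.clip([p_fact, p_cf], 1e-12, 1 - 1e-12)
    noise = posterior_gumbels(np.log([p_fact, 1.0 - p_fact]), 0 if accepted else 1)
    return np.argmax(np.log([p_cf, 1.0 - p_cf]) + noise) == 0   # category 0 = accept

def counterfactual_realization(events, lam, lam_cf, lam_max, T):
    """Counterfactual events under lam_cf, given events observed under lam on [0, T]."""
    # Superposition theorem: the candidates rejected by thinning form an unobserved
    # Poisson process with intensity lam_max - lam(t); recover them by thinning a
    # homogeneous Poisson(lam_max) process.
    n = rng.poisson(lam_max * T)
    cand = np.sort(rng.uniform(0.0, T, size=n))
    rejected = [t for t in cand if rng.uniform() < 1.0 - lam(t) / lam_max]

    cf_events = []
    for t in events:    # factually accepted candidates
        if counterfactual_decision(lam(t) / lam_max, lam_cf(t) / lam_max, accepted=True):
            cf_events.append(t)
    for t in rejected:  # factually rejected (latent) candidates
        if counterfactual_decision(lam(t) / lam_max, lam_cf(t) / lam_max, accepted=False):
            cf_events.append(t)
    return np.sort(np.array(cf_events))

# Toy usage: what would the observed events have looked like under half the intensity?
lam = lambda t: 1.0 + 0.5 * np.sin(t)
observed = np.array([0.7, 2.1, 3.4, 5.9, 8.2])         # hypothetical observed realization
cf = counterfactual_realization(observed, lam, lambda t: 0.5 * lam(t), lam_max=2.0, T=10.0)
```

Because the thinning model satisfies counterfactual monotonicity, an alternative intensity that is pointwise lower than the original yields a counterfactual realization that is a subset of the observed events, while a pointwise higher one retains every observed event and may add new ones.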
Related papers
- Neural Persistence Dynamics [8.197801260302642]
We consider the problem of learning the dynamics in the topology of time-evolving point clouds.
Our proposed model -- Neural Persistence Dynamics -- substantially outperforms the state-of-the-art across a diverse set of parameter regression tasks.
arXiv Detail & Related papers (2024-05-24T17:20:18Z) - On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose an textbfIDentification framework for instantanetextbfOus textbfLatent dynamics.
arXiv Detail & Related papers (2024-05-24T08:08:05Z) - Neural Likelihood Approximation for Integer Valued Time Series Data [0.0]
We construct a neural likelihood approximation that can be trained using unconditional simulation of the underlying model.
We demonstrate our method by performing inference on a number of ecological and epidemiological models.
arXiv Detail & Related papers (2023-10-19T07:51:39Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, namely "Denoising Diffusion Probabilistic Models" (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive, learns to capture holistic concepts, and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive
Models [2.7391842773173334]
We develop a simulation-based deep learning method for Bayesian inference, which can recover both time-varying and time-invariant parameters.
Our results show that the deep learning approach is very efficient in capturing the temporal dynamics of the model.
arXiv Detail & Related papers (2022-11-23T17:42:53Z) - Mining Causality from Continuous-time Dynamics Models: An Application to
Tsunami Forecasting [22.434845478979604]
We propose a mechanism for mining causal structures from continuous-time models.
We train models to capture the causal structure by enforcing sparsity in the weights of the input layers of the dynamics models.
We apply our method to a real-world problem, namely tsunami forecasting, where the exact causal-structures are difficult to characterize.
arXiv Detail & Related papers (2022-10-10T18:53:13Z) - Temporal Relevance Analysis for Video Action Models [70.39411261685963]
We first propose a new approach to quantify the temporal relationships between frames captured by CNN-based action models.
We then conduct comprehensive experiments and in-depth analysis to provide a better understanding of how temporal modeling is affected.
arXiv Detail & Related papers (2022-04-25T19:06:48Z) - DriPP: Driven Point Processes to Model Stimuli Induced Patterns in M/EEG
Signals [62.997667081978825]
We develop a novel statistical point process model called driven temporal point processes (DriPP).
We derive a fast and principled expectation-maximization (EM) algorithm to estimate the parameters of this model.
Results on standard MEG datasets demonstrate that our methodology reveals event-related neural responses.
arXiv Detail & Related papers (2021-12-08T13:07:21Z) - Efficient hierarchical Bayesian inference for spatio-temporal regression
models in neuroimaging [6.512092052306553]
Examples include M/EEG inverse problems, neural encoding models for task-based fMRI analyses, and temperature monitoring schemes.
We devise a novel flexible hierarchical Bayesian framework within which the intrinsic spatio-temporal dynamics of model parameters and noise are modeled.
arXiv Detail & Related papers (2021-11-02T15:50:01Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.