Probabilistic Modeling for Sequences of Sets in Continuous-Time
- URL: http://arxiv.org/abs/2312.15045v3
- Date: Mon, 18 Mar 2024 21:13:26 GMT
- Title: Probabilistic Modeling for Sequences of Sets in Continuous-Time
- Authors: Yuxin Chang, Alex Boyd, Padhraic Smyth
- Abstract summary: We develop a general framework for modeling set-valued data in continuous-time.
We also develop inference methods that can use such models to answer probabilistic queries.
- Score: 14.423456635520084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural marked temporal point processes have been a valuable addition to the existing toolbox of statistical parametric models for continuous-time event data. These models are useful for sequences where each event is associated with a single item (a single type of event or a "mark") -- but such models are not suited for the practical situation where each event is associated with a set of items. In this work, we develop a general framework for modeling set-valued data in continuous-time, compatible with any intensity-based recurrent neural point process model. In addition, we develop inference methods that can use such models to answer probabilistic queries such as "the probability of item $A$ being observed before item $B$," conditioned on sequence history. Computing exact answers for such queries is generally intractable for neural models due to both the continuous-time nature of the problem setting and the combinatorially-large space of potential outcomes for each event. To address this, we develop a class of importance sampling methods for querying with set-based sequences and demonstrate orders-of-magnitude improvements in efficiency over direct sampling via systematic experiments with four real-world datasets. We also illustrate how to use this framework to perform model selection using likelihoods that do not involve one-step-ahead prediction.
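The query style described above ("probability of item $A$ being observed before item $B$") can be illustrated with a minimal, hypothetical sketch. This is not the paper's method: it replaces the learned, history-dependent neural intensities with constant-rate Poisson arrivals for two toy items A and B (so the answer has a closed form to check against), and contrasts direct Monte Carlo with a simple importance-sampling estimator that draws A's arrival time from a proposal favoring early arrivals and integrates B's arrival out analytically. All names and rates are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for a learned intensity model: items A and B arrive
# as independent Poisson processes with constant rates. (A real neural point
# process would have history-dependent intensities; these rates are made up.)
LAM_A, LAM_B = 0.05, 1.0   # A is rare, so "A before B" is a rare event

# Closed form for competing exponential clocks, used only as a sanity check.
EXACT = LAM_A / (LAM_A + LAM_B)

def direct_estimate(n):
    """Direct Monte Carlo: simulate both first-arrival times and compare."""
    hits = 0
    for _ in range(n):
        t_a = random.expovariate(LAM_A)
        t_b = random.expovariate(LAM_B)
        hits += t_a < t_b
    return hits / n

def importance_estimate(n, mu=1.0):
    """Importance sampling: draw t_A from a proposal Exp(mu) that favors
    early A arrivals, reweight by the likelihood ratio target/proposal,
    and integrate t_B out analytically via P(t_B > t) = exp(-LAM_B * t)."""
    total = 0.0
    for _ in range(n):
        t = random.expovariate(mu)
        w = (LAM_A * math.exp(-LAM_A * t)) / (mu * math.exp(-mu * t))
        total += w * math.exp(-LAM_B * t)
    return total / n

print(f"exact      {EXACT:.4f}")
print(f"direct     {direct_estimate(50_000):.4f}")   # needs many samples
print(f"importance {importance_estimate(500):.4f}")  # far fewer samples
```

Even in this toy setting the importance sampler reaches comparable accuracy with orders of magnitude fewer samples than direct simulation, which is the kind of efficiency gap the abstract reports on real datasets (where neither the closed form nor the analytic integration over $t_B$ is available).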
Related papers
- On the Efficient Marginalization of Probabilistic Sequence Models [3.5897534810405403]
This dissertation focuses on using autoregressive models to answer complex probabilistic queries.
We develop a class of novel and efficient approximation techniques for marginalization in sequential models that are model-agnostic.
arXiv Detail & Related papers (2024-03-06T19:29:08Z)
- Deep Ensembles Meets Quantile Regression: Uncertainty-aware Imputation for Time Series [49.992908221544624]
Time series data often exhibit numerous missing values; filling them in is the time series imputation task.
Previous deep learning methods have been shown to be effective for time series imputation.
We propose a non-generative time series imputation method that produces accurate imputations with inherent uncertainty.
arXiv Detail & Related papers (2023-12-03T05:52:30Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the method matches data statistics even when sampling from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- Continuous-time convolutions model of event sequences [53.36665135225617]
Huge samples of event sequences data occur in various domains, including e-commerce, healthcare, and finance.
The amount of available data and the length of event sequences per client are typically large, requiring long-term modelling.
We propose the COTIC method based on a continuous convolution neural network suitable for non-uniform occurrence of events in time.
arXiv Detail & Related papers (2023-02-13T10:34:51Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, next-event prediction models are trained on sequential data collected at a single point in time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- Flexible Networks for Learning Physical Dynamics of Deformable Objects [2.567499374977917]
We propose a model named time-wise PointNet (TP-Net) to infer the future state of a deformable object with particle-based representation.
TP-Net consists of a shared feature extractor that extracts global features from each input point set in parallel and a prediction network that aggregates and reasons on these features for future prediction.
Experiments demonstrate that our model achieves state-of-the-art performance on both synthetic and real-world datasets, with real-time prediction speed.
arXiv Detail & Related papers (2021-12-07T14:34:52Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- User-Dependent Neural Sequence Models for Continuous-Time Event Data [27.45413274751265]
Continuous-time event data are common in applications such as individual behavior data, financial transactions, and medical health records.
Recurrent neural networks that parameterize time-varying intensity functions are the current state-of-the-art for predictive modeling with such data.
In this paper, we extend the broad class of neural marked point process models to mixtures of latent embeddings.
arXiv Detail & Related papers (2020-11-06T08:32:57Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.