Inference and Sampling of Point Processes from Diffusion Excursions
- URL: http://arxiv.org/abs/2306.00762v1
- Date: Thu, 1 Jun 2023 14:56:23 GMT
- Title: Inference and Sampling of Point Processes from Diffusion Excursions
- Authors: Ali Hasan, Yu Chen, Yuting Ng, Mohamed Abdelghani, Anderson Schneider,
Vahid Tarokh
- Abstract summary: We propose a point process construction that describes arrival time observations in terms of the state of a latent diffusion process.
Based on the developments in Itô's excursion theory, we propose methods for inferring and sampling from the point process derived from the latent diffusion process.
- Score: 26.111388335046197
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Point processes often have a natural interpretation with respect to a
continuous process. We propose a point process construction that describes
arrival time observations in terms of the state of a latent diffusion process.
In this framework, we relate the return times of a diffusion in a continuous
path space to new arrivals of the point process. This leads to a continuous
sample path that is used to describe the underlying mechanism generating the
arrival distribution. These models arise in many disciplines, such as financial
settings where actions in a market are determined by a hidden continuous price
or in neuroscience where a latent stimulus generates spike trains. Based on the
developments in Itô's excursion theory, we propose methods for inferring and
sampling from the point process derived from the latent diffusion process. We
illustrate the approach with numerical examples using both simulated and real
data. The proposed methods and framework provide a basis for interpreting point
processes through the lens of diffusions.
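The construction above treats new arrivals as return times of a latent diffusion to a fixed level. As a rough illustration of the sampling direction only (not the authors' inference procedure), the sketch below simulates a one-dimensional diffusion with Euler-Maruyama and records level crossings as arrival times; the function name `simulate_return_times`, the `min_gap` threshold, and all parameter choices are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def simulate_return_times(T=10.0, dt=1e-3, mu=0.0, sigma=1.0, level=0.0,
                          min_gap=0.05, seed=0):
    """Euler-Maruyama simulation of a latent diffusion dX = mu*dt + sigma*dW,
    started at `level`. Each crossing of `level` (kept at least `min_gap`
    apart so that only excursions of non-negligible length count) is recorded
    as an arrival time of the induced point process."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
    path = level + np.concatenate(([0.0], np.cumsum(increments)))

    arrivals = []
    last = -np.inf
    for k in range(n):
        # a sign change of (path - level) between steps closes an excursion
        crossed = (path[k] - level) * (path[k + 1] - level) <= 0.0
        t = (k + 1) * dt
        if crossed and k > 0 and t - last >= min_gap:
            arrivals.append(t)
            last = t
    return np.array(arrivals)

if __name__ == "__main__":
    times = simulate_return_times()
    print(f"simulated {len(times)} arrivals; first few: {np.round(times[:5], 3)}")
```

In a discretized simulation, returns to the level are approximated by sign changes of the path, and the `min_gap` threshold is a crude stand-in for discarding the arbitrarily short excursions that Itô's excursion theory handles exactly.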
Related papers
- Unlocking Point Processes through Point Set Diffusion [42.96573032954792]
We introduce Point Set Diffusion, a diffusion-based latent variable model that can represent arbitrary point processes on general metric spaces without relying on an intensity function.
Our approach enables efficient, parallel sampling and flexible generation for complex conditional tasks defined on the metric space.
arXiv Detail & Related papers (2024-10-29T19:33:18Z) - Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z) - Bridging discrete and continuous state spaces: Exploring the Ehrenfest process in time-continuous diffusion models [4.186575888568896]
We study time-continuous Markov jump processes on discrete state spaces.
We show that the time-reversal of the Ehrenfest process converges to the time-reversed Ornstein-Uhlenbeck process.
arXiv Detail & Related papers (2024-05-06T15:12:51Z) - Non-Denoising Forward-Time Diffusions [4.831663144935879]
We show that the time-reversal argument, common to all denoising diffusion probabilistic modeling proposals, is not necessary.
We obtain diffusion processes targeting the desired data distribution by taking appropriate mixtures of diffusion bridges.
We develop a unifying view of the drift adjustments corresponding to our and to time-reversal approaches.
arXiv Detail & Related papers (2023-12-22T10:26:31Z) - Fast Sampling via Discrete Non-Markov Diffusion Models [49.598085130313514]
We propose a discrete non-Markov diffusion model, which admits accelerated reverse sampling for discrete data generation.
Our method significantly reduces the number of function evaluations (i.e., calls to the neural network), making the sampling process much faster.
arXiv Detail & Related papers (2023-12-14T18:14:11Z) - Blackout Diffusion: Generative Diffusion Models in Discrete-State Spaces [0.0]
We develop a theoretical formulation for arbitrary discrete-state Markov processes in the forward diffusion process.
As an example, we introduce "Blackout Diffusion", which learns to produce samples from an empty image instead of from noise.
arXiv Detail & Related papers (2023-05-18T16:24:12Z) - Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z) - Conditioning Normalizing Flows for Rare Event Sampling [61.005334495264194]
We propose a transition path sampling scheme based on neural-network generated configurations.
We show that this approach enables the resolution of both the thermodynamics and kinetics of the transition region.
arXiv Detail & Related papers (2022-07-29T07:56:10Z) - Stochastic Trajectory Prediction via Motion Indeterminacy Diffusion [88.45326906116165]
We present a new framework to formulate the trajectory prediction task as a reverse process of motion indeterminacy diffusion (MID).
We encode the historical behavior information and social interactions as a state embedding and devise a Transformer-based diffusion model to capture the temporal dependencies of trajectories.
Experiments on the human trajectory prediction benchmarks including the Stanford Drone and ETH/UCY datasets demonstrate the superiority of our method.
arXiv Detail & Related papers (2022-03-25T16:59:08Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Inference, Prediction, and Entropy-Rate Estimation of Continuous-time,
Discrete-event Processes [0.0]
Inferring models, predicting the future, and estimating the entropy rate of discrete-time, discrete-event processes is well-worn ground.
Here, we provide new methods for inferring models, predicting the future, and estimating the entropy rate of continuous-time, discrete-event processes.
arXiv Detail & Related papers (2020-05-07T20:54:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.