Neural Spatio-Temporal Point Processes
- URL: http://arxiv.org/abs/2011.04583v3
- Date: Thu, 18 Mar 2021 00:00:23 GMT
- Title: Neural Spatio-Temporal Point Processes
- Authors: Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel
- Abstract summary: We propose a new class of parameterizations for spatio-temporal point processes which leverage Neural ODEs as a computational method.
We validate our models on data sets from a wide variety of contexts such as seismology, epidemiology, urban mobility, and neuroscience.
- Score: 31.474420819149724
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new class of parameterizations for spatio-temporal point
processes which leverage Neural ODEs as a computational method and enable
flexible, high-fidelity models of discrete events that are localized in
continuous time and space. Central to our approach is a combination of
continuous-time neural networks with two novel neural architectures, i.e., Jump
and Attentive Continuous-time Normalizing Flows. This approach allows us to
learn complex distributions for both the spatial and temporal domain and to
condition non-trivially on the observed event history. We validate our models
on data sets from a wide variety of contexts such as seismology, epidemiology,
urban mobility, and neuroscience.
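To make the abstract's objective concrete, the sketch below computes the log-likelihood of a spatio-temporal point process under the standard factorization lambda*(t, x) = lambda*(t) * p*(x | t). It substitutes a classical exponential-decay (Hawkes-style) temporal intensity and a fixed Gaussian spatial density for the paper's learned continuous-time networks and normalizing flows; all parameter values and function names here are illustrative, not the paper's.

```python
import numpy as np

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.0):
    """Temporal intensity lambda*(t): baseline plus exponentially
    decaying excitation from past events (a classical stand-in for
    a learned continuous-time network)."""
    past = events[events < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

def log_likelihood(events, locs, T, mu=0.5, alpha=0.8, beta=1.0):
    """Joint log-likelihood under lambda*(t, x) = lambda*(t) * p*(x | t):
    log-intensities at the events, minus the integrated intensity
    (compensator) over [0, T], plus the spatial log-densities."""
    ll = 0.0
    for t_i, x_i in zip(events, locs):
        ll += np.log(hawkes_intensity(t_i, events, mu, alpha, beta))
        # toy spatial model: standard 2-D Gaussian, independent of history
        ll += -0.5 * np.sum(x_i**2) - np.log(2 * np.pi)
    # compensator has a closed form for the exponential kernel
    compensator = mu * T + (alpha / beta) * np.sum(1 - np.exp(-beta * (T - events)))
    return ll - compensator

events = np.array([0.5, 1.2, 2.0])                       # event times
locs = np.array([[0.1, -0.2], [0.3, 0.4], [-0.1, 0.0]])  # event locations
print(log_likelihood(events, locs, T=3.0))
```

The paper's contribution is, in effect, to replace the hand-picked intensity and Gaussian above with Neural-ODE-parameterized intensities and continuous-time normalizing flows that condition on the event history.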
Related papers
- Dense ReLU Neural Networks for Temporal-spatial Model [13.8173644075917]
We focus on fully connected deep neural networks utilizing the Rectified Linear Unit (ReLU) activation function for nonparametric estimation.
We derive non-asymptotic bounds that lead to convergence rates, addressing both temporal and spatial dependence in the observed measurements.
We also tackle the curse of dimensionality by modeling the data on a manifold, exploring the intrinsic dimensionality of high-dimensional data.
arXiv Detail & Related papers (2024-11-15T05:30:36Z) - Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z) - Learning Time-Varying Multi-Region Communications via Scalable Markovian Gaussian Processes [2.600709013150986]
We present a novel framework using Markovian Gaussian Processes to learn brain communications with time-varying temporal delays.
This work advances our understanding of distributed neural computation and provides a scalable tool for analyzing dynamic brain networks.
arXiv Detail & Related papers (2024-06-29T10:50:23Z) - Learning Spatiotemporal Dynamical Systems from Point Process Observations [7.381752536547389]
Current neural network-based modeling approaches fall short when faced with data that is collected randomly over time and space.
In response, we developed a new method that can effectively learn from such point process observations.
Our model integrates techniques from neural differential equations, neural point processes, implicit neural representations and amortized variational inference.
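Several entries in this list build on neural differential equations evaluated at irregular observation times. As a minimal, generic sketch (not any specific paper's method), the fixed-step Euler integrator below plays the role of the adaptive ODE solvers these models rely on:

```python
import numpy as np

def euler_odeint(f, z0, ts):
    """Integrate dz/dt = f(t, z) with fixed-step Euler, returning the
    state at each (possibly irregular) query time in ts. A minimal
    stand-in for the adaptive Neural ODE solvers used in practice."""
    zs = [np.asarray(z0, dtype=float)]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        n = max(1, int(np.ceil((t1 - t0) / 0.01)))  # substeps of ~0.01
        h = (t1 - t0) / n
        z, t = zs[-1].copy(), t0
        for _ in range(n):
            z = z + h * f(t, z)
            t += h
        zs.append(z)
    return np.stack(zs)

# toy linear dynamics dz/dt = -z, whose exact solution is z0 * exp(-t)
ts = np.array([0.0, 0.3, 1.1, 2.0])        # irregular observation times
traj = euler_odeint(lambda t, z: -z, [1.0], ts)
```

In a Neural ODE, `f` would be a trained network rather than a hand-written function, but the solve-at-query-times pattern is the same.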
arXiv Detail & Related papers (2024-06-01T09:03:32Z) - TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
arXiv Detail & Related papers (2023-08-25T08:54:41Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - Neural Ordinary Differential Equation Model for Evolutionary Subspace Clustering and Its Applications [36.700813256689656]
We propose a neural ODE model for evolutionary subspace clustering to overcome this limitation.
We demonstrate that this method can not only interpolate data at any time step for the evolutionary subspace clustering task, but also achieve higher accuracy than other state-of-the-art methods.
arXiv Detail & Related papers (2021-07-22T07:02:03Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep Bayesian active learning framework for accelerating stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate STNP outperforms the baselines in the learning setting and LIG achieves the state-of-the-art for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z) - Stochastic Recurrent Neural Network for Multistep Time Series Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
arXiv Detail & Related papers (2021-04-26T01:43:43Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
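The O(d) constraint in the ODEtoODE entry can be illustrated with a Cayley-transform update, a standard way to discretize a flow on the orthogonal group: for skew-symmetric A, the map (I - h/2 A)^{-1}(I + h/2 A) is orthogonal, so repeated application preserves W^T W = I exactly. This is a generic sketch of such a flow, not the paper's exact scheme:

```python
import numpy as np

def cayley_step(W, A, h=0.1):
    """One step of a matrix flow on O(d): multiply W by the Cayley
    transform of the skew-symmetric generator A. Because the Cayley
    transform of a skew matrix is orthogonal, orthogonality of W is
    preserved at every step (up to floating-point roundoff)."""
    d = W.shape[0]
    I = np.eye(d)
    Q = np.linalg.solve(I - 0.5 * h * A, I + 0.5 * h * A)
    return Q @ W

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T            # skew-symmetric generator
W = np.eye(4)
for _ in range(50):    # evolve the flow; W stays orthogonal throughout
    W = cayley_step(W, A)
```

Parameterizing weights on O(d) like this keeps their singular values at 1, which is one intuition for why such flows avoid vanishing or exploding gradients.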
This list is automatically generated from the titles and abstracts of the papers in this site.