Neural Spatio-Temporal Point Processes
- URL: http://arxiv.org/abs/2011.04583v3
- Date: Thu, 18 Mar 2021 00:00:23 GMT
- Title: Neural Spatio-Temporal Point Processes
- Authors: Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel
- Abstract summary: We propose a new class of parameterizations for spatio-temporal point processes which leverage Neural ODEs as a computational method.
We validate our models on data sets from a wide variety of contexts such as seismology, epidemiology, urban mobility, and neuroscience.
- Score: 31.474420819149724
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new class of parameterizations for spatio-temporal point
processes which leverage Neural ODEs as a computational method and enable
flexible, high-fidelity models of discrete events that are localized in
continuous time and space. Central to our approach is a combination of
continuous-time neural networks with two novel neural architectures, i.e., Jump
and Attentive Continuous-time Normalizing Flows. This approach allows us to
learn complex distributions for both the spatial and temporal domain and to
condition non-trivially on the observed event history. We validate our models
on data sets from a wide variety of contexts such as seismology, epidemiology,
urban mobility, and neuroscience.
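As background for the objective such models optimize, here is a minimal temporal-only sketch in NumPy: a toy conditional intensity lambda(t) (a softplus over a tiny fixed-weight network, standing in for the paper's continuous-time neural networks) and the point-process log-likelihood sum_i log lambda(t_i) - integral_0^T lambda(t) dt. The weights, event times, and the trapezoidal quadrature are illustrative assumptions, not the paper's Jump/Attentive CNF architecture or its ODE-solver-based integration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "neural" intensity: softplus of a tiny fixed-weight network over time.
W1 = rng.normal(size=8)
b1 = rng.normal(size=8)
w2 = rng.normal(size=8)
b2 = 0.1

def intensity(t):
    """Conditional intensity lambda(t) >= 0 (stand-in for a neural net)."""
    h = np.tanh(W1 * t + b1)        # hidden layer, shape (8,)
    z = w2 @ h + b2
    return np.log1p(np.exp(z))      # softplus keeps the intensity positive

def log_likelihood(event_times, T, n_quad=1000):
    """Point-process log-likelihood: sum of log-intensities at the events,
    minus the compensator integral int_0^T lambda(t) dt (quadrature here)."""
    ll = sum(np.log(intensity(t)) for t in event_times)
    grid = np.linspace(0.0, T, n_quad)
    lam = np.array([intensity(t) for t in grid])
    ll -= np.trapz(lam, grid)
    return ll

events = [0.5, 1.2, 2.8, 3.1]       # illustrative event times
print(log_likelihood(events, T=4.0))
```

Maximizing this quantity over the intensity's parameters is the standard training signal for neural point processes; the paper's contribution lies in making both the intensity and the spatial event distribution flexible via continuous-time normalizing flows.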
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z) - Modeling Randomly Observed Spatiotemporal Dynamical Systems [7.381752536547389]
Currently available neural network-based modeling approaches fall short when faced with data collected randomly over time and space.
In response, we developed a new method that effectively handles such randomly sampled data.
Our model integrates techniques from amortized variational inference, neural differential equations, neural point processes, and implicit neural representations to predict both the dynamics of the system and the timings and locations of future observations.
arXiv Detail & Related papers (2024-06-01T09:03:32Z) - TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential
Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
arXiv Detail & Related papers (2023-08-25T08:54:41Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - An Adaptive Federated Relevance Framework for Spatial Temporal Graph
Learning [14.353798949041698]
We propose an adaptive federated relevance framework, namely FedRel, for spatial-temporal graph learning.
The core Dynamic Inter-Intra Graph (DIIG) module in the framework is able to use these features to generate the spatial-temporal graphs.
To improve the model generalization ability and performance while preserving the local data privacy, we also design a relevance-driven federated learning module.
arXiv Detail & Related papers (2022-06-07T16:12:17Z) - Neural Point Process for Learning Spatiotemporal Event Dynamics [21.43984242938217]
We propose a deep dynamics model that integrates spatiotemporal point processes.
Our method is flexible, efficient and can accurately forecast irregularly sampled events over space and time.
On real-world benchmarks, our model demonstrates superior performance over state-of-the-art baselines.
arXiv Detail & Related papers (2021-12-12T23:17:33Z) - Neural Ordinary Differential Equation Model for Evolutionary Subspace
Clustering and Its Applications [36.700813256689656]
We propose a neural ODE model for evolutionary subspace clustering to overcome this limitation.
We demonstrate that this method can not only interpolate data at any time step for the evolutionary subspace clustering task, but also achieve higher accuracy than other state-of-the-art methods.
arXiv Detail & Related papers (2021-07-22T07:02:03Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep Bayesian active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate that STNP outperforms the baselines in the learning setting, and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z) - Stochastic Recurrent Neural Network for Multistep Time Series
Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
arXiv Detail & Related papers (2021-04-26T01:43:43Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
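To illustrate the idea of evolving parameters on the orthogonal group O(d), here is a hedged NumPy sketch using a Cayley discretization with a skew-symmetric generator. The Cayley step is one standard way to stay exactly on O(d); it is an assumption for illustration, not necessarily the integrator used in the ODEtoODE paper.

```python
import numpy as np

def cayley_step(W, A, h=0.1):
    """One discrete step of a matrix flow on O(d):
    W <- W @ (I - h/2 A)^(-1) (I + h/2 A).
    For skew-symmetric A, the update factor is orthogonal, so W stays on O(d)."""
    d = W.shape[0]
    I = np.eye(d)
    Q = np.linalg.solve(I - 0.5 * h * A, I + 0.5 * h * A)
    return W @ Q

rng = np.random.default_rng(1)
d = 4
M = rng.normal(size=(d, d))
A = M - M.T                      # skew-symmetric generator
W = np.eye(d)                    # start on O(d)
for _ in range(50):
    W = cayley_step(W, A)

# Orthogonality is preserved to numerical precision after many steps.
print(np.max(np.abs(W.T @ W - np.eye(d))))
```

Keeping the flow's parameters orthogonal is what gives the nested system its isometry-like stability, which is how such constructions avoid vanishing and exploding gradients.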
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.