Point process models for sequence detection in high-dimensional neural
spike trains
- URL: http://arxiv.org/abs/2010.04875v1
- Date: Sat, 10 Oct 2020 02:21:44 GMT
- Title: Point process models for sequence detection in high-dimensional neural
spike trains
- Authors: Alex H. Williams, Anthony Degleris, Yixin Wang, Scott W. Linderman
- Abstract summary: We develop a point process model that characterizes fine-scale sequences at the level of individual spikes.
This ultra-sparse representation of sequence events opens new possibilities for spike train modeling.
- Score: 29.073129195368235
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Sparse sequences of neural spikes are posited to underlie aspects of working
memory, motor production, and learning. Discovering these sequences in an
unsupervised manner is a longstanding problem in statistical neuroscience.
Promising recent work utilized a convolutive nonnegative matrix factorization
model to tackle this challenge. However, this model requires spike times to be
discretized, utilizes a sub-optimal least-squares criterion, and does not
provide uncertainty estimates for model predictions or estimated parameters. We
address each of these shortcomings by developing a point process model that
characterizes fine-scale sequences at the level of individual spikes and
represents sequence occurrences as a small number of marked events in
continuous time. This ultra-sparse representation of sequence events opens new
possibilities for spike train modeling. For example, we introduce learnable
time warping parameters to model sequences of varying duration, which have been
experimentally observed in neural circuits. We demonstrate these advantages on
experimental recordings from songbird higher vocal center and rodent
hippocampus.
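The abstract describes sequences as a small number of marked events in continuous time, with per-event time-warping factors that stretch or compress the sequence. A minimal generative sketch of that idea is below; it is illustrative only (all names, parameters, and rate choices are assumptions, not the authors' implementation).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_sequences(n_neurons=20, n_events=3, t_max=10.0,
                       base_width=0.05, rng=rng):
    """Simulate spikes as a superposition of marked sequence events.

    Each sequence event is a marked point (onset, amplitude, warp):
    neuron n tends to fire near onset + warp * offset[n], so larger
    warp values stretch the whole sequence in time (the model's
    learnable time-warping parameter, here just sampled).
    """
    offsets = np.linspace(0.0, 1.0, n_neurons)  # per-neuron lag in the template
    events = []
    spikes = []                                 # (time, neuron) pairs
    for _ in range(n_events):
        t0 = rng.uniform(0.0, t_max - 2.0)      # sequence onset
        warp = rng.uniform(0.5, 1.5)            # duration scaling of this event
        amp = rng.poisson(3) + 1                # event amplitude mark
        events.append((t0, amp, warp))
        for n in range(n_neurons):
            k = rng.poisson(amp * 0.5)          # spike count for this neuron
            times = t0 + warp * offsets[n] + rng.normal(0.0, base_width, k)
            spikes.extend((t, n) for t in times)
    spikes.sort()                               # order spikes by time
    return events, spikes

events, spikes = simulate_sequences()
```

Note how ultra-sparse the event representation is: three marked points summarize hundreds of spikes, which is what makes extensions such as time warping cheap to add.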
Related papers
- Finding the DeepDream for Time Series: Activation Maximization for Univariate Time Series [10.388704631887496]
We introduce Sequence Dreaming, a technique that adapts Activation Maximization to analyze sequential data.
We visualize the temporal dynamics and patterns most influential in model decision-making processes.
arXiv Detail & Related papers (2024-08-20T08:09:44Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- DriPP: Driven Point Processes to Model Stimuli Induced Patterns in M/EEG Signals [62.997667081978825]
We develop a novel statistical point process model called driven temporal point processes (DriPP).
We derive a fast and principled expectation-maximization (EM) algorithm to estimate the parameters of this model.
Results on standard MEG datasets demonstrate that our methodology reveals event-related neural responses.
arXiv Detail & Related papers (2021-12-08T13:07:21Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- A Bayesian Perspective on Training Speed and Model Selection [51.15664724311443]
We show that a measure of a model's training speed can be used to estimate its marginal likelihood.
We verify our results in model selection tasks for linear models and for the infinite-width limit of deep neural networks.
Our results suggest a promising new direction towards explaining why neural networks trained with gradient descent are biased towards functions that generalize well.
arXiv Detail & Related papers (2020-10-27T17:56:14Z)
- Rescuing neural spike train models from bad MLE [9.655669944739127]
We propose to directly minimize the divergence between neural recorded and model generated spike trains using spike train kernels.
We show that we can control the trade-off between different features which is critical for dealing with model-mismatch.
arXiv Detail & Related papers (2020-10-23T12:46:12Z)
- Variational inference formulation for a model-free simulation of a dynamical system with unknown parameters by a recurrent neural network [8.616180927172548]
We propose a "model-free" simulation of a dynamical system with unknown parameters, requiring no prior knowledge of the underlying dynamics.
The deep learning model aims to jointly learn the nonlinear time marching operator and the effects of the unknown parameters from a time series dataset.
It is found that the proposed deep learning model is capable of correctly identifying the dimensions of the random parameters and learning a representation of complex time series data.
arXiv Detail & Related papers (2020-03-02T20:57:02Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.