Representation learning for neural population activity with Neural Data Transformers
- URL: http://arxiv.org/abs/2108.01210v1
- Date: Mon, 2 Aug 2021 23:36:39 GMT
- Title: Representation learning for neural population activity with Neural Data Transformers
- Authors: Joel Ye, Chethan Pandarinath
- Abstract summary: We introduce the Neural Data Transformer (NDT), a non-recurrent alternative to explicit dynamics models.
NDT enables 3.9ms inference, well within the loop time of real-time applications and more than 6 times faster than recurrent baselines.
- Score: 3.4376560669160394
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Neural population activity is theorized to reflect an underlying dynamical
structure. This structure can be accurately captured using state space models
with explicit dynamics, such as those based on recurrent neural networks
(RNNs). However, using recurrence to explicitly model dynamics necessitates
sequential processing of data, slowing real-time applications such as
brain-computer interfaces. Here we introduce the Neural Data Transformer (NDT),
a non-recurrent alternative. We test the NDT's ability to capture autonomous
dynamical systems by applying it to synthetic datasets with known dynamics and
data from monkey motor cortex during a reaching task well-modeled by RNNs. The
NDT models these datasets as well as state-of-the-art recurrent models.
Further, its non-recurrence enables 3.9ms inference, well within the loop time
of real-time applications and more than 6 times faster than recurrent baselines
on the monkey reaching dataset. These results suggest that an explicit dynamics
model is not necessary to model autonomous neural population dynamics. Code:
https://github.com/snel-repo/neural-data-transformers
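
To make the non-recurrent claim concrete, below is a minimal sketch of an NDT-style model: each time bin of binned spike counts becomes one token, a Transformer encoder attends over all bins in parallel (no sequential state), and a linear readout produces per-bin firing rates trained against the observed spikes with a Poisson loss. All hyperparameters here are illustrative assumptions; this is not the reference implementation in the linked repository, which also uses BERT-style masked-bin training.

import torch
import torch.nn as nn

class NDTSketch(nn.Module):
    # One token per time bin; no recurrence, so all bins are processed at once.
    def __init__(self, n_neurons, d_model=64, n_heads=4, n_layers=4, max_len=256):
        super().__init__()
        self.embed = nn.Linear(n_neurons, d_model)
        self.pos = nn.Parameter(torch.zeros(max_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.readout = nn.Linear(d_model, n_neurons)

    def forward(self, spikes):                      # spikes: (batch, time, neurons)
        x = self.embed(spikes) + self.pos[: spikes.size(1)]
        return self.readout(self.encoder(x)).exp()  # positive per-bin rates

model = NDTSketch(n_neurons=98)
spikes = torch.poisson(0.3 * torch.ones(8, 100, 98))   # fake binned spike counts
rates = model(spikes)                                  # single parallel pass
loss = nn.PoissonNLLLoss(log_input=False)(rates, spikes)
loss.backward()

Because the forward pass involves no step-by-step unrolling, inference is one batched encoder call, which is the property behind the abstract's 3.9ms figure.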
Related papers
- A frugal Spiking Neural Network for unsupervised classification of continuous multivariate temporal data [0.0]
Spiking Neural Networks (SNNs) are neuromorphic models that use more biologically plausible neurons with evolving membrane potentials (the standard update is sketched after this entry).
We introduce here a frugal single-layer SNN designed for fully unsupervised identification and classification of multivariate temporal patterns in continuous data.
arXiv Detail & Related papers (2024-08-08T08:15:51Z)
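
Where the entry above mentions evolving membrane potentials, the standard mechanism is the leaky integrate-and-fire (LIF) neuron. A hedged sketch follows; the time constant, threshold, and reset values are illustrative assumptions, not parameters from the cited paper.

import numpy as np

def lif_step(v, current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    # Euler step: the membrane potential leaks toward the input current...
    v = v + (dt / tau) * (-v + current)
    spiked = v >= v_thresh                       # ...a spike fires at threshold,
    return np.where(spiked, v_reset, v), spiked  # after which v is reset

v = np.zeros(5)                                  # five neurons at resting potential
rng = np.random.default_rng(0)
for _ in range(100):
    v, spiked = lif_step(v, current=2.0 * rng.random(5))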
- Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system.
Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics.
Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods (the rank-constrained recurrence is sketched after this entry).
arXiv Detail & Related papers (2024-06-24T15:57:49Z)
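
The tractability mentioned above comes from constraining the recurrent weight matrix to low rank, W = m n^T / N, so the state is driven through an R-dimensional subspace. A minimal deterministic-plus-noise sketch (shapes and noise scale are assumptions, not the paper's exact model):

import torch

N, R, T = 100, 2, 200                          # neurons, rank, timesteps
m, n = torch.randn(N, R), torch.randn(N, R)    # low-rank factors of W
x, dt, tau = torch.zeros(N), 0.1, 1.0
for _ in range(T):
    rec = m @ (n.T @ torch.tanh(x)) / N        # rank-R recurrence, O(NR) cost
    x = x + (dt / tau) * (-x + rec) + 0.05 * dt**0.5 * torch.randn(N)
kappa = n.T @ torch.tanh(x)                    # R-dim latent summarizing the state

Fitting such a model to spiking data with variational sequential Monte Carlo, as the paper proposes, wraps this transition in a proposal/weighting scheme not shown here.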
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations (one common mechanism is sketched after this entry).
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
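
One common way continuous-time recurrent models handle irregular gaps (assumed here for illustration; the cited overview covers several CTRNN variants) is to decay the hidden state by the elapsed time before each discrete update:

import torch
import torch.nn as nn

class DecayGRUCell(nn.Module):
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.cell = nn.GRUCell(n_in, n_hidden)
        self.log_rate = nn.Parameter(torch.zeros(n_hidden))  # learned decay rates

    def forward(self, x, h, delta_t):
        h = h * torch.exp(-delta_t * self.log_rate.exp())    # continuous-time decay
        return self.cell(x, h)                               # update at observation

cell = DecayGRUCell(n_in=3, n_hidden=16)
h = torch.zeros(1, 16)
for x, dt in [(torch.randn(1, 3), 0.5), (torch.randn(1, 3), 2.3)]:  # uneven gaps
    h = cell(x, h, dt)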
- STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer [19.329190789275565]
We introduce the SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models the responses of individual neurons.
We show that our model achieves state-of-the-art performance at the ensemble level in estimating neural activity across four neural datasets (the time-then-neuron attention pattern is sketched after this entry).
arXiv Detail & Related papers (2022-06-09T18:54:23Z)
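
A rough reading of "spatiotemporal" in code: alternate attention along the time axis (as in the NDT) with attention along the neuron axis, by swapping which dimension plays the sequence role. The dimensions and single-block structure below are assumptions, not STNDT's actual configuration:

import torch
import torch.nn as nn

B, T, N, D = 4, 50, 30, 32                        # batch, time, neurons, features
temporal = nn.MultiheadAttention(D, 4, batch_first=True)
spatial = nn.MultiheadAttention(D, 4, batch_first=True)

x = torch.randn(B, T, N, D)                       # per-neuron, per-bin features
xt = x.permute(0, 2, 1, 3).reshape(B * N, T, D)   # each neuron -> sequence over time
xt, _ = temporal(xt, xt, xt)
x = xt.reshape(B, N, T, D).permute(0, 2, 1, 3)
xs = x.reshape(B * T, N, D)                       # each bin -> sequence over neurons
xs, _ = spatial(xs, xs, xs)
x = xs.reshape(B, T, N, D)                        # spatiotemporal representation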
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Analytically Integratable Zero-restlength Springs for Capturing Dynamic Modes unrepresented by Quasistatic Neural Networks [6.601755525003559]
We present a novel paradigm for modeling certain types of dynamic simulation in real-time with the aid of neural networks.
We augment our quasistatic neural network (QNN) inference with a (real-time) dynamic simulation layer.
We demonstrate that the spring parameters can be robustly learned from a surprisingly small amount of dynamic simulation data.
arXiv Detail & Related papers (2022-01-25T06:44:15Z)
- Deep inference of latent dynamics with spatio-temporal super-resolution using selective backpropagation through time [15.648009434801885]
Modern neural interfaces allow access to the activity of up to a million neurons within brain circuits.
However, bandwidth limits often create a trade-off between greater spatial sampling (more channels or pixels) and the frequency of temporal sampling.
Here we demonstrate that it is possible to obtain super-resolution in neuronal time series by exploiting relationships among neurons (a toy version of the masked-loss idea is sketched after this entry).
arXiv Detail & Related papers (2021-10-29T20:18:29Z)
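
A toy version of the idea, under heavy assumptions (a GRU latent model and a regular interleaved sampling pattern; the paper's actual method differs): run the latent dynamics at the fine timescale, but let the loss, and hence backpropagation through time, touch only the bins where each channel was actually sampled, so the channels' staggered observations jointly constrain one fast trajectory.

import torch
import torch.nn as nn

T_fine, n_ch, hid = 100, 8, 32
rnn = nn.GRU(1, hid, batch_first=True)
readout = nn.Linear(hid, n_ch)

latent, _ = rnn(torch.randn(1, T_fine, 1))    # latent trajectory at full rate
pred = readout(latent)                        # (1, T_fine, n_ch) reconstruction

data = torch.randn(1, T_fine, n_ch)           # stand-in for recorded series
mask = torch.zeros(1, T_fine, n_ch)
for c in range(n_ch):
    mask[0, c::n_ch, c] = 1.0                 # channel c sampled every n_ch bins
loss = ((pred - data) ** 2 * mask).sum() / mask.sum()
loss.backward()                               # gradients only from observed bins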
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data points (the underlying ODE integration is sketched after this entry).
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
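
The Neural ODE at the core of an NDP, stripped of the process machinery (fixed-step Euler instead of an adaptive solver, and no latent conditioning; both simplifications for illustration):

import torch
import torch.nn as nn

f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))  # dz/dt = f(z)

def odeint_euler(z0, t0=0.0, t1=1.0, steps=50):
    z, dt = z0, (t1 - t0) / steps
    for _ in range(steps):
        z = z + dt * f(z)              # step through the learned vector field
    return z

z1 = odeint_euler(torch.randn(5, 2))   # five 2-D states integrated to t1

NDPs additionally place a distribution over the dynamics, conditioned on a context set of observed points.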
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which formulates the neuron as an activation function applied to the real-valued weighted aggregation of signals received from other neurons (sketched after this entry).
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
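
For contrast with the FT model, the McCulloch-Pitts-style neuron described in that entry fits in a few lines: an activation applied to a real-valued weighted aggregation of inputs. (The FT model's two-input transmitter formulation is not sketched here.)

import numpy as np

def mp_neuron(x, w, b=0.0):
    return np.tanh(w @ x + b)      # activation(weighted aggregation + bias)

x = np.array([0.5, -1.2, 0.3])     # signals from other neurons
w = np.array([0.8, 0.1, -0.4])     # synaptic weights
y = mp_neuron(x, w, b=0.05)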
This list is automatically generated from the titles and abstracts of the papers on this site.