Capturing cross-session neural population variability through self-supervised identification of consistent neuron ensembles
- URL: http://arxiv.org/abs/2205.09829v1
- Date: Thu, 19 May 2022 20:00:33 GMT
- Title: Capturing cross-session neural population variability through self-supervised identification of consistent neuron ensembles
- Authors: Justin Jude, Matthew G. Perich, Lee E. Miller, Matthias H. Hennig
- Abstract summary: We show that self-supervised training of a deep neural network can be used to compensate for inter-session variability.
A sequential autoencoding model can maintain state-of-the-art behaviour decoding performance for completely unseen recording sessions several days into the future.
- Score: 1.2617078020344619
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Decoding stimuli or behaviour from recorded neural activity is a common
approach to interrogate brain function in research, and an essential part of
brain-computer and brain-machine interfaces. Reliable decoding even from small
neural populations is possible because high dimensional neural population
activity typically occupies low dimensional manifolds that are discoverable
with suitable latent variable models. Over time, however, drift in the
activity of individual neurons and instabilities in neural recording devices
can be substantial, making stable decoding over days and weeks impractical.
While this drift cannot be predicted at the level of individual neurons,
population-level variations across consecutive recording sessions, such as
differing sets of recorded neurons and varying permutations of consistent
neurons, may be learnable when the underlying manifold is stable over time.
Classifying consistent versus unfamiliar neurons across sessions, and
accounting for changes in the ordering of consistent neurons between
sessions, may then maintain decoding performance. In this
work we show that self-supervised training of a deep neural network can be used
to compensate for this inter-session variability. As a result, a sequential
autoencoding model can maintain state-of-the-art behaviour decoding performance
for completely unseen recording sessions several days into the future. Our
approach only requires a single recording session for training the model, and
is a step towards reliable, recalibration-free brain-computer interfaces.
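A minimal sketch of the core idea as the abstract describes it: simulate inter-session variability (reordered and dropped neurons) as a self-supervised augmentation on a single training session, and train a sequential autoencoder to recover the canonical population activity. The architecture, dimensions, and corruption scheme below are illustrative assumptions in PyTorch, not the authors' implementation.

```python
# Hypothetical sketch: a sequential autoencoder trained on one session while
# neuron channels are randomly permuted and dropped, so it learns to map a
# "scrambled" population back to a canonical ordering. All sizes are toy values.
import torch
import torch.nn as nn

class SequentialAutoencoder(nn.Module):
    def __init__(self, n_neurons=100, latent_dim=16):
        super().__init__()
        self.encoder = nn.GRU(n_neurons, latent_dim, batch_first=True)
        self.decoder = nn.GRU(latent_dim, latent_dim, batch_first=True)
        self.readout = nn.Linear(latent_dim, n_neurons)

    def forward(self, spikes):                 # spikes: (batch, time, neurons)
        latents, _ = self.encoder(spikes)      # low-dimensional trajectory
        hidden, _ = self.decoder(latents)
        return self.readout(hidden)            # reconstruct canonical rates

def corrupt(spikes, drop_prob=0.2):
    """Simulate inter-session variability on single-session data:
    shuffle the neuron order and silence a random subset of channels."""
    perm = torch.randperm(spikes.shape[-1])
    corrupted = spikes[..., perm]
    mask = (torch.rand(spikes.shape[-1]) > drop_prob).float()
    return corrupted * mask

model = SequentialAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
spikes = torch.rand(32, 50, 100)               # toy firing-rate tensor
for step in range(200):
    recon = model(corrupt(spikes))
    loss = nn.functional.mse_loss(recon, spikes)  # target is uncorrupted data
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A behaviour decoder trained on the recovered latents would then, in principle, transfer to later sessions without recalibration.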
Related papers
- Learning Time-Invariant Representations for Individual Neurons from Population Dynamics [29.936569965875375]
We propose a self-supervised learning-based method to assign time-invariant representations to individual neurons.
We fit dynamical models to neuronal activity to learn a representation by considering the activity of both the individual neuron and the neighboring population.
We demonstrate our method on a public multimodal dataset of mouse cortical neuronal activity and transcriptomic labels.
arXiv Detail & Related papers (2023-11-03T22:30:12Z)
- Neuroformer: Multimodal and Multitask Generative Pretraining for Brain Data [3.46029409929709]
State-of-the-art systems neuroscience experiments yield large-scale multimodal data, and these data sets require new tools for analysis.
Inspired by the success of large pretrained models in vision and language domains, we reframe the analysis of large-scale, cellular-resolution neuronal spiking data into an autoregressive generation problem.
We first trained Neuroformer on simulated datasets and found that it both accurately predicted simulated neuronal circuit activity and intrinsically inferred the underlying neural circuit connectivity, including direction.
arXiv Detail & Related papers (2023-10-31T20:17:32Z)
- A Unified, Scalable Framework for Neural Population Decoding [12.052847252465826]
We introduce a training framework and architecture designed to model the population dynamics of neural activity.
We construct a large-scale multi-session model trained on large datasets from seven nonhuman primates.
arXiv Detail & Related papers (2023-10-24T17:58:26Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer [19.329190789275565]
We introduce SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models responses of individual neurons.
We show that our model achieves state-of-the-art performance at the ensemble level in estimating neural activity across four neural datasets.
arXiv Detail & Related papers (2022-06-09T18:54:23Z)
- Robust alignment of cross-session recordings of neural population activity by behaviour via unsupervised domain adaptation [1.2617078020344619]
We introduce a model capable of inferring behaviourally relevant latent dynamics from previously unseen data recorded from the same animal.
We show that unsupervised domain adaptation combined with a sequential variational autoencoder, trained on several sessions, can achieve good generalisation to unseen data.
arXiv Detail & Related papers (2022-02-12T22:17:30Z)
- Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations [60.47807856873544]
A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior.
We generated a new multimodal dataset consisting of spontaneous behaviors produced by fruit flies.
This dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
arXiv Detail & Related papers (2021-11-29T15:27:51Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable, resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- Astrocytes mediate analogous memory in a multi-layer neuron-astrocytic network [52.77024349608834]
We show how a piece of information can be maintained as a robust activity pattern for several seconds and then completely disappear if no further stimuli arrive.
This kind of short-term memory can hold operative information for seconds and then completely forget it to avoid overlap with forthcoming patterns.
We show how arbitrary patterns can be loaded, then stored for a certain interval of time, and retrieved if the appropriate clue pattern is applied to the input.
arXiv Detail & Related papers (2021-08-31T16:13:15Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do and adjust their parameters based on how well those predictions match reality (a generic sketch of this update rule follows this entry).
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
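Since the summary above describes the learning rule only in words, here is a generic predictive-coding sketch of the idea: a top layer predicts the activity of neighboring (lower) neurons, and both weights and latent state are adjusted in proportion to the local prediction error. This is a textbook-style illustration under assumed dimensions, not the paper's own model.

```python
# Loose, hypothetical numpy sketch of predictive processing: the top layer
# predicts the activity of the layer below, and weights are nudged in
# proportion to the local prediction error. All sizes are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(20, 50))   # top layer (20) predicts bottom (50)
top = rng.normal(size=20)                  # latent activity of the top layer
data = rng.normal(size=50)                 # observed "neighboring" activity
lr = 0.01

for step in range(100):
    prediction = np.tanh(top) @ W          # what the top layer expects below
    error = data - prediction              # local prediction error signal
    W += lr * np.outer(np.tanh(top), error)  # adjust weights to reduce error
    top += lr * (W @ error)                # update latent state from the error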
This list is automatically generated from the titles and abstracts of the papers on this site.