Robust alignment of cross-session recordings of neural population
activity by behaviour via unsupervised domain adaptation
- URL: http://arxiv.org/abs/2202.06159v2
- Date: Wed, 16 Feb 2022 14:13:59 GMT
- Title: Robust alignment of cross-session recordings of neural population
activity by behaviour via unsupervised domain adaptation
- Authors: Justin Jude, Matthew G Perich, Lee E Miller, Matthias H Hennig
- Abstract summary: We introduce a model capable of inferring behaviourally relevant latent dynamics from previously unseen data recorded from the same animal.
We show that unsupervised domain adaptation combined with a sequential variational autoencoder, trained on several sessions, can achieve good generalisation to unseen data.
- Score: 1.2617078020344619
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural population activity relating to behaviour is assumed to be inherently
low-dimensional despite the observed high dimensionality of data recorded using
multi-electrode arrays. Therefore, predicting behaviour from neural population
recordings has been shown to be most effective when using latent variable
models. Over time, however, the activity of single neurons can drift, and
different neurons will be recorded due to movement of implanted neural probes.
This means that a decoder trained to predict behaviour on one day performs
worse when tested on a different day. On the other hand, evidence suggests that
the latent dynamics underlying behaviour may be stable even over months and
years. Based on this idea, we introduce a model capable of inferring
behaviourally relevant latent dynamics from previously unseen data recorded
from the same animal, without any need for decoder recalibration. We show that
unsupervised domain adaptation combined with a sequential variational
autoencoder, trained on several sessions, can achieve good generalisation to
unseen data and correctly predict behaviour where conventional methods fail.
Our results further support the hypothesis that behaviour-related neural
dynamics are low-dimensional and stable over time, and will enable more
effective and flexible use of brain-computer interface technologies.
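The core claim, that session-specific readouts change while the underlying low-dimensional dynamics stay fixed, can be illustrated without the paper's sequential VAE. The sketch below is a toy stand-in, assuming synthetic 2-D latents, random per-session readouts, PCA for dimensionality reduction, and orthogonal Procrustes (not the authors' unsupervised domain adaptation) to align one session's latents onto another's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's model): a fixed 2-D latent
# trajectory is observed through a different random readout on each
# "session", mimicking electrode drift and neuron turnover.
T, n_latent, n_neurons = 200, 2, 30
t = np.linspace(0, 4 * np.pi, T)
latents = np.stack([np.sin(t), np.cos(t)], axis=1)      # (T, 2)

def record_session(rng, noise=0.05):
    """Project the stable latents through a session-specific readout."""
    readout = rng.normal(size=(n_latent, n_neurons))
    return latents @ readout + noise * rng.normal(size=(T, n_neurons))

day1 = record_session(rng)
day2 = record_session(rng)   # different readout: a fixed decoder fails

def top_pcs(X, k=2):
    """Unit-norm low-dimensional trajectories via PCA (SVD of centred data)."""
    Xc = X - X.mean(axis=0)
    U, _, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k]

z1, z2 = top_pcs(day1), top_pcs(day2)

# Orthogonal Procrustes: rotate day-2 latents onto day-1 latents.
U, _, Vt = np.linalg.svd(z2.T @ z1)
z2_aligned = z2 @ (U @ Vt)

def misalignment(a, b):
    """Relative Frobenius-norm mismatch between two trajectories."""
    return np.linalg.norm(a - b) / np.linalg.norm(a)
```

Because the identity is always a candidate rotation, the aligned error can never exceed the unaligned one; in practice it drops toward the noise floor, mirroring the intuition that a decoder fit on one session can be reused once latents are aligned.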
Related papers
- Continuous time recurrent neural networks: overview and application to
forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer [19.329190789275565]
We introduce SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models responses of individual neurons.
We show that our model achieves state-of-the-art performance at the ensemble level when estimating neural activity across four neural datasets.
arXiv Detail & Related papers (2022-06-09T18:54:23Z)
- Capturing cross-session neural population variability through self-supervised identification of consistent neuron ensembles [1.2617078020344619]
We show that self-supervised training of a deep neural network can be used to compensate for inter-session variability.
A sequential autoencoding model can maintain state-of-the-art behaviour decoding performance for completely unseen recording sessions several days into the future.
arXiv Detail & Related papers (2022-05-19T20:00:33Z)
- Learnable latent embeddings for joint behavioral and neural analysis [3.6062449190184136]
We show that CEBRA can be used for the mapping of space, uncovering complex kinematic features, and rapid, high-accuracy decoding of natural movies from visual cortex.
We validate its accuracy and demonstrate its utility for both calcium and electrophysiology datasets, across sensory and motor tasks, and in simple or complex behaviors across species.
arXiv Detail & Related papers (2022-04-01T19:19:33Z)
- Overcoming the Domain Gap in Neural Action Representations [60.47807856873544]
3D pose data can now be reliably extracted from multi-view video sequences without manual intervention.
We propose to use it to guide the encoding of neural action representations together with a set of neural and behavioral augmentations.
To reduce the domain gap, during training, we swap neural and behavioral data across animals that seem to be performing similar actions.
arXiv Detail & Related papers (2021-12-02T12:45:46Z)
- Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations [60.47807856873544]
A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior.
We generated a new multimodal dataset consisting of the spontaneous behaviors generated by fruit flies.
This dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
arXiv Detail & Related papers (2021-11-29T15:27:51Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable, resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- Bubblewrap: Online tiling and real-time flow prediction on neural manifolds [2.624902795082451]
We propose a method that combines fast, stable dimensionality reduction with a soft tiling of the resulting neural manifold.
The resulting model can be trained at kiloHertz data rates, produces accurate approximations of neural dynamics within minutes, and generates predictions on submillisecond time scales.
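The tiling-plus-flow idea in this summary can be sketched in miniature. The snippet below is a toy stand-in, not the Bubblewrap algorithm: it soft-tiles a noisy 2-D trajectory with soft k-means and then fits a tile-to-tile Markov transition matrix as a crude one-step flow model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: a noisy circular trajectory standing in for a
# low-dimensional neural manifold.
T, k = 500, 8
t = np.linspace(0, 2 * np.pi, T)
traj = np.stack([np.cos(t), np.sin(t)], axis=1) + 0.05 * rng.normal(size=(T, 2))

# Initialise tile centres on data points, refine with soft assignments.
centers = traj[rng.choice(T, size=k, replace=False)]
for _ in range(20):
    d2 = ((traj[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # (T, k)
    resp = np.exp(-d2 / 0.1)
    resp /= resp.sum(axis=1, keepdims=True)          # soft tile membership
    centers = (resp.T @ traj) / resp.sum(axis=0)[:, None]

# Hard labels, then smoothed transition counts between consecutive tiles.
d2 = ((traj[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
labels = d2.argmin(axis=1)
P = np.full((k, k), 1e-6)                            # smoothing avoids empty rows
for a, b in zip(labels[:-1], labels[1:]):
    P[a, b] += 1.0
P /= P.sum(axis=1, keepdims=True)

next_tile = int(P[labels[-1]].argmax())              # one-step flow prediction
```

The real method is online and far faster; this batch sketch only shows why a soft tiling plus a transition model yields cheap short-horizon predictions of where activity flows next.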
arXiv Detail & Related papers (2021-08-31T16:01:45Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
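The predict-and-correct loop described here can be illustrated with a deliberately small sketch. This is a hypothetical single-layer toy, not the paper's architecture: a latent cause z predicts the observed activity x through weights W, and both are nudged in proportion to the prediction error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical dimensions and learning rate (illustrative only).
n_hidden, n_visible, lr = 4, 8, 0.02
W = 0.1 * rng.normal(size=(n_hidden, n_visible))
z = rng.normal(size=n_hidden)       # latent cause predicting neighbours
x = rng.normal(size=n_visible)      # observed activity to be explained

errs = []
for _ in range(300):
    pred = z @ W                    # top-down prediction of x
    err = x - pred                  # prediction error signal
    z = z + lr * (W @ err)          # inference: move z to reduce error
    W = W + lr * np.outer(z, err)   # learning: error-driven weight update
    errs.append(float(np.linalg.norm(err)))
```

Both updates are gradient steps on the squared prediction error, so the error shrinks as the "neurons" learn to anticipate their input, which is the gist of the predictive-processing account.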
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
- Investigating naturalistic hand movements by behavior mining in long-term video and neural recordings [1.7205106391379024]
We describe an automated approach for analyzing simultaneously recorded long-term, naturalistic electrocorticography (ECoG) and naturalistic behavior video data.
We show results from our approach applied to data collected for 12 human subjects over 7--9 days for each subject.
Our pipeline discovers and annotates over 40,000 instances of naturalistic human upper-limb movement events in the behavioral videos.
arXiv Detail & Related papers (2020-01-23T02:41:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and accepts no responsibility for any consequences of its use.