Generalization of generative model for neuronal ensemble inference method
- URL: http://arxiv.org/abs/2211.05634v3
- Date: Tue, 27 Jun 2023 20:22:13 GMT
- Title: Generalization of generative model for neuronal ensemble inference method
- Authors: Shun Kimura, Koujin Takeda
- Abstract summary: In this study, we extend the range of the variable expressing the neuronal state and generalize the likelihood of the model for the extended variable.
This generalization, which removes the restriction to binary input, enables us to perform soft clustering and to apply the method to non-stationary neuroactivity data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Various brain functions that are necessary to maintain life activities
materialize through the interaction of countless neurons. Therefore, it is
important to analyze functional neuronal networks. To elucidate the mechanisms
of brain function, many studies on functional neuronal ensembles and hubs are
being actively conducted across all areas of neuroscience. In addition, recent
studies suggest that the existence of functional neuronal ensembles and hubs
contributes to the efficiency of information processing. For these reasons,
there is a demand for methods to infer functional neuronal ensembles from
neuronal activity data, and methods based on Bayesian inference have been
proposed. However, modeling the activity in Bayesian inference poses a problem:
the features of each neuron's activity are non-stationary, depending on
physiological experimental conditions. As a result, the assumption of
stationarity in the Bayesian inference model impedes inference, destabilizing
the results and degrading accuracy. In this study, we extend the range of the
variable expressing the neuronal state and generalize the likelihood of the
model for the extended variable. Compared with the previous study, our model
can express the neuronal state in a larger space. This generalization, which
removes the restriction to binary input, enables us to perform soft clustering
and to apply the method to non-stationary neuroactivity data. In addition, to
demonstrate the effectiveness of the method, we apply it to multiple synthetic
fluorescence datasets generated from electrical potential data in the leaky
integrate-and-fire model.
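The data-generation step described above can be illustrated with a minimal sketch (hypothetical, not the authors' code): a leaky integrate-and-fire neuron is simulated under a noisy input current, and the resulting spikes are convolved with an exponential calcium-indicator kernel to produce a synthetic fluorescence trace. All parameter values here are assumptions chosen for illustration.

```python
import numpy as np

def simulate_lif(n_steps=2000, dt=1e-3, tau=0.02, v_rest=-70e-3,
                 v_thresh=-50e-3, v_reset=-70e-3, r_m=1e8,
                 i_mean=2.1e-10, i_std=5e-11, seed=0):
    """Leaky integrate-and-fire neuron driven by a noisy input current."""
    rng = np.random.default_rng(seed)
    v = np.full(n_steps, v_rest)
    spikes = np.zeros(n_steps, dtype=bool)
    for t in range(1, n_steps):
        i_t = i_mean + i_std * rng.standard_normal()
        # dV/dt = (-(V - V_rest) + R * I) / tau  (forward Euler step)
        v[t] = v[t - 1] + (-(v[t - 1] - v_rest) + r_m * i_t) * dt / tau
        if v[t] >= v_thresh:
            spikes[t] = True
            v[t] = v_reset
    return v, spikes

def spikes_to_fluorescence(spikes, dt=1e-3, tau_decay=0.4,
                           amplitude=1.0, noise_std=0.05, seed=1):
    """Convolve spikes with an exponential calcium kernel, add Gaussian noise."""
    rng = np.random.default_rng(seed)
    kernel = amplitude * np.exp(-np.arange(0, 5 * tau_decay, dt) / tau_decay)
    trace = np.convolve(spikes.astype(float), kernel)[: len(spikes)]
    return trace + noise_std * rng.standard_normal(len(spikes))

v, spikes = simulate_lif()
f = spikes_to_fluorescence(spikes)
print(spikes.sum(), f.shape)
```

Such traces mimic the slow rise-and-decay shape of real fluorescence imaging data while the underlying spike times remain known, which is what makes them useful for evaluating an ensemble-inference method.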
Related papers
- SynapsNet: Enhancing Neuronal Population Dynamics Modeling via Learning Functional Connectivity [0.0]
We introduce SynapsNet, a novel deep-learning framework that effectively models population dynamics and functional interactions between neurons.
A shared decoder uses the input current, previous neuronal activity, neuron embedding, and behavioral data to predict the population activity in the next time step.
Our experiments, conducted on mouse cortical activity from publicly available datasets, demonstrate that SynapsNet consistently outperforms existing models in forecasting population activity.
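The shared-decoder step can be sketched as follows (a hypothetical simplification, not the SynapsNet implementation): each neuron's input current, previous activity, and per-neuron embedding are concatenated with the shared behavior features, and one shared linear readout maps every neuron's feature vector to its next-step activity. All names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, d_embed, d_behav = 50, 8, 4

# Toy features at time t (randomly generated for illustration).
input_current = rng.standard_normal(n_neurons)
prev_activity = rng.standard_normal(n_neurons)
embeddings = rng.standard_normal((n_neurons, d_embed))  # learned per-neuron codes
behavior = rng.standard_normal(d_behav)                 # e.g. running speed, pupil size

# Shared decoder: the same weights are applied to every neuron's feature vector.
features = np.column_stack([
    input_current,
    prev_activity,
    embeddings,
    np.tile(behavior, (n_neurons, 1)),
])
w = rng.standard_normal(features.shape[1]) * 0.1
predicted_next_activity = features @ w  # population activity at t + 1
print(predicted_next_activity.shape)
```

Sharing one readout across neurons keeps the parameter count independent of population size; the per-neuron embedding is what lets a shared decoder still produce neuron-specific predictions.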
arXiv Detail & Related papers (2024-11-12T22:25:15Z)
- Modeling dynamic neural activity by combining naturalistic video stimuli and stimulus-independent latent factors [5.967290675400836]
We propose a probabilistic model that incorporates video inputs along with stimulus-independent latent factors to capture variability in neuronal responses.
After training and testing our model on mouse V1 neuronal responses, we found that it outperforms video-only models in terms of log-likelihood.
We find that the learned latent factors strongly correlate with mouse behavior, although the model was trained without behavior data.
arXiv Detail & Related papers (2024-10-21T16:01:39Z)
- Exploring Behavior-Relevant and Disentangled Neural Dynamics with Generative Diffusion Models [2.600709013150986]
Understanding the neural basis of behavior is a fundamental goal in neuroscience.
Our approach, named BeNeDiff, first identifies a fine-grained and disentangled neural subspace.
It then employs state-of-the-art generative diffusion models to synthesize behavior videos that interpret the neural dynamics of each latent factor.
arXiv Detail & Related papers (2024-10-12T18:28:56Z)
- BLEND: Behavior-guided Neural Population Dynamics Modeling via Privileged Knowledge Distillation [6.3559178227943764]
We propose BLEND, a behavior-guided neural population dynamics modeling framework via privileged knowledge distillation.
By considering behavior as privileged information, we train a teacher model that takes both behavior observations (privileged features) and neural activities (regular features) as inputs.
A student model is then distilled using only neural activity.
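The privileged-distillation recipe can be sketched with plain least squares (a hypothetical simplification, not the BLEND implementation): a teacher is fit on neural activity plus behavior, and the student is then fit to the teacher's predictions using neural activity alone, so behavior is only needed at training time.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_neural, d_behav = 500, 20, 5

neural = rng.standard_normal((n, d_neural))      # regular features
behavior = rng.standard_normal((n, d_behav))     # privileged features (training only)
target = (neural @ rng.standard_normal(d_neural)
          + behavior @ rng.standard_normal(d_behav))

# Teacher sees both regular and privileged features.
x_teacher = np.hstack([neural, behavior])
w_teacher, *_ = np.linalg.lstsq(x_teacher, target, rcond=None)
teacher_pred = x_teacher @ w_teacher

# Student is distilled to match the teacher from neural activity alone.
w_student, *_ = np.linalg.lstsq(neural, teacher_pred, rcond=None)
student_pred = neural @ w_student
print(student_pred.shape)
```

At inference time only `w_student` is used, which is the point of the privileged-information setup: deployment never requires the behavioral recordings.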
arXiv Detail & Related papers (2024-10-02T12:45:59Z)
- Neuroformer: Multimodal and Multitask Generative Pretraining for Brain Data [3.46029409929709]
State-of-the-art systems neuroscience experiments yield large-scale multimodal data, and these data sets require new tools for analysis.
Inspired by the success of large pretrained models in vision and language domains, we reframe the analysis of large-scale, cellular-resolution neuronal spiking data into an autoregressive generation problem.
We first trained Neuroformer on simulated datasets, and found that it both accurately predicted intrinsically simulated neuronal circuit activity, and also inferred the underlying neural circuit connectivity, including direction.
arXiv Detail & Related papers (2023-10-31T20:17:32Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Constraints on the design of neuromorphic circuits set by the properties of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z)
- Overcoming the Domain Gap in Contrastive Learning of Neural Action Representations [60.47807856873544]
A fundamental goal in neuroscience is to understand the relationship between neural activity and behavior.
We generated a new multimodal dataset consisting of the spontaneous behaviors generated by fruit flies.
This dataset and our new set of augmentations promise to accelerate the application of self-supervised learning methods in neuroscience.
arXiv Detail & Related papers (2021-11-29T15:27:51Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
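A generic adaptive-threshold sketch of this regulation idea (hypothetical, not the MPATH equations): each spike raises the firing threshold, which then relaxes back toward a baseline, so the neuron's output rate self-stabilizes under sustained input. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, dt = 5000, 1e-3
tau_v, tau_th = 0.02, 0.5          # membrane and threshold time constants (s)
th_base, th_jump = 1.0, 0.2        # resting threshold and per-spike increment
v, thresh = 0.0, th_base

spike_times = []
for t in range(n_steps):
    i_t = 2.0 + rng.standard_normal()           # noisy suprathreshold drive
    v += dt / tau_v * (-v + i_t)                # leaky membrane integration
    thresh += dt / tau_th * (th_base - thresh)  # threshold relaxes to baseline
    if v >= thresh:
        spike_times.append(t)
        thresh += th_jump                       # firing raises the threshold
        v = 0.0                                 # reset after spike

rate = len(spike_times) / (n_steps * dt)        # mean firing rate in Hz
print(rate)
```

The negative feedback between firing and threshold is what produces the "dynamic equilibrium": stronger drive raises the threshold until the rate settles, instead of growing without bound.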
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
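This predict-and-correct loop can be sketched with an Oja-style subspace update (hypothetical, not the paper's model): a weight matrix predicts observed activity from inferred hidden causes, and the weights are adjusted in proportion to the prediction error.

```python
import numpy as np

rng = np.random.default_rng(0)
d_obs, d_hidden, lr = 20, 10, 0.01

w = rng.standard_normal((d_obs, d_hidden)) * 0.1
for step in range(200):
    x = rng.standard_normal(d_obs)   # "reality": observed neighboring activity
    z = w.T @ x                      # inferred hidden causes (simplified)
    prediction = w @ z               # neurons predict neighbors' activity
    error = x - prediction           # mismatch between prediction and reality
    w += lr * np.outer(error, z)     # adjust parameters to reduce the error
print(np.linalg.norm(w))
```

Only a local error signal drives the update, which is the core appeal of predictive-processing-style learning rules over backpropagation.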
arXiv Detail & Related papers (2020-12-07T01:20:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.