Recurrent networks improve neural response prediction and provide
insights into underlying cortical circuits
- URL: http://arxiv.org/abs/2110.00825v2
- Date: Sun, 13 Nov 2022 21:08:18 GMT
- Title: Recurrent networks improve neural response prediction and provide
insights into underlying cortical circuits
- Authors: Yimeng Zhang, Harold Rockwell, Sicheng Dai, Ge Huang, Stephen Tsou,
Yuanyuan Wei, Tai Sing Lee
- Abstract summary: CNN models have proven themselves as state-of-the-art models for predicting the responses of single neurons in early visual cortex to natural images.
We extend these models with recurrent convolutional layers, reflecting the well-known massive recurrence in the cortex.
- We find that the hidden units in the recurrent circuits of the appropriate models, when trained on long-duration wide-field image presentations, exhibit temporal response dynamics and classical contextual modulations similar to those observed in V1 neurons.
- Score: 3.340380180141713
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Feedforward CNN models have proven themselves in recent years as state-of-the-art models for predicting the responses of single neurons in early visual cortex to natural images. In this paper, we extend these models
with recurrent convolutional layers, reflecting the well-known massive
recurrence in the cortex, and show robust increases in predictive performance
over feedforward models across thousands of hyperparameter combinations in
three datasets of macaque V1 and V2 single-neuron responses. We propose that the recurrent circuit can be conceptualized as a form of ensemble computing, with each iteration adding effective feedforward paths of various lengths, so that the final approximation combines the solutions of this ensemble. The statistics of the paths in the ensemble provide insights into the differential performance increases among our recurrent models. We also assess whether the
recurrent circuits learned for neural response prediction can be related to
cortical circuits. We find that the hidden units in the recurrent circuits of the appropriate models, when trained on long-duration wide-field image presentations, exhibit temporal response dynamics and classical contextual modulations similar to those observed in V1 neurons. This work provides insights into the computational rationale of recurrent circuits and suggests that neural
response prediction could be useful for characterizing the recurrent neural
circuits in the visual cortex.
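A minimal sketch of the architecture the abstract describes, assuming a PyTorch-style setup; the class names, layer sizes, and iteration count are illustrative, not the authors' released code. Unrolling the recurrent block for T iterations makes the ensemble-of-paths view concrete: each iteration reuses the same bottom-up drive and adds longer effective feedforward paths.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentConvBlock(nn.Module):
    """Convolutional block iterated n_iter times over a fixed feedforward input."""
    def __init__(self, channels: int, n_iter: int = 4):
        super().__init__()
        self.ff = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.rec = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.n_iter = n_iter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = F.relu(self.ff(x))                 # iteration 1: shortest path
        for _ in range(self.n_iter - 1):
            # each iteration adds a longer recurrent path on top of the same
            # bottom-up drive, so the unrolled graph mixes path lengths
            h = F.relu(self.ff(x) + self.rec(h))
        return h

class RecurrentResponseModel(nn.Module):
    """Feedforward front end + recurrent block + per-neuron readout."""
    def __init__(self, n_neurons: int, channels: int = 16, n_iter: int = 4):
        super().__init__()
        self.front = nn.Sequential(nn.Conv2d(1, channels, kernel_size=9), nn.ReLU())
        self.recurrent = RecurrentConvBlock(channels, n_iter)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.readout = nn.Linear(channels, n_neurons)

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        h = self.recurrent(self.front(images))
        return F.softplus(self.readout(self.pool(h).flatten(1)))  # nonnegative rates

model = RecurrentResponseModel(n_neurons=50)
rates = model(torch.randn(8, 1, 64, 64))       # (batch, n_neurons) predicted rates
```

Comparing such a model against its feedforward counterpart (n_iter=1) while sweeping channels, iteration counts, and readouts mirrors the kind of hyperparameter sweep the abstract reports.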
Related papers
- Relating Superconducting Optoelectronic Networks to Classical Neurodynamics [0.0]
We present a phenomenological model of superconducting loop neurons that eliminates the need to solve the Josephson circuit equations that describe synapses and dendrites.
For some circuit parameters it is possible to represent the downstream dendritic response to a single spike as well as coincidences or sequences of spikes.
The governing equations are shown to be nearly identical to those ubiquitous in the neuroscience literature for modeling leaky-integrator dendrites and neurons.
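For reference, the leaky-integrator form alluded to above reads, in its generic textbook version (this notation is ours, not the paper's):

$$\tau \frac{ds_i(t)}{dt} = -s_i(t) + \sum_j w_{ij}\, r_j(t),$$

where $s_i$ is the integrated dendritic or somatic signal, $\tau$ the leak time constant, $w_{ij}$ the synaptic weights, and $r_j(t)$ the upstream spiking inputs.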
arXiv Detail & Related papers (2024-09-26T16:23:53Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed biophysical cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
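As a hedged illustration of the reduction step just described (the snapshot matrix, sizes, and mode count below are made up, and this is generic POD rather than the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
snapshots = rng.random((2000, 400))   # hypothetical: 2000 spatial DOFs x 400 time steps

# POD via thin SVD: columns of U are spatial modes; the rows of S*Vt are
# their temporal coefficients.
U, S, Vt = np.linalg.svd(snapshots, full_matrices=False)

r = 10                                 # keep a few dominant modes
modes = U[:, :r]                       # reduced spatial basis
coeffs = S[:r, None] * Vt[:r]          # temporal coefficients, shape (10, 400)

# A surrogate model (e.g., a small neural network) is then trained to
# predict `coeffs` forward in time; fields are recovered as modes @ coeffs.
reconstruction = modes @ coeffs        # rank-r approximation of the snapshots
```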
arXiv Detail & Related papers (2023-01-24T08:39:20Z)
- NRTR: Neuron Reconstruction with Transformer from 3D Optical Microscopy Images [5.724034347184251]
We propose a Neuron Reconstruction Transformer (NRTR) that views neuron reconstruction as a direct set-prediction problem.
NRTR is the first image-to-set deep learning model for end-to-end neuron reconstruction.
arXiv Detail & Related papers (2022-12-08T09:35:22Z)
- Phenomenological Model of Superconducting Optoelectronic Loop Neurons [0.0]
Superconducting optoelectronic loop neurons are a class of circuits potentially conducive to networks for large-scale artificial cognition.
To date, all simulations of loop neurons have used first-principles circuit analysis to model the behavior of synapses, dendrites, and neurons.
Here we introduce a modeling framework that captures the behavior of the relevant synaptic, dendritic, and neuronal circuits.
arXiv Detail & Related papers (2022-10-18T16:38:35Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
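To make "spiking" concrete, here is a discrete-time leaky integrate-and-fire layer in its generic textbook form (constants and shapes are illustrative, not from the paper):

```python
import numpy as np

def lif_step(v, x, w, beta=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire layer."""
    v = beta * v + x @ w                     # leaky membrane integration
    spikes = (v >= threshold).astype(float)  # fire where threshold is crossed
    v = v * (1.0 - spikes)                   # reset membranes that fired
    return v, spikes

rng = np.random.default_rng(0)
w = rng.normal(scale=0.5, size=(8, 4))       # 8 inputs -> 4 spiking units
v = np.zeros(4)
for t in range(20):
    v, s = lif_step(v, rng.normal(size=8), w)
    # `s` is mostly zeros: the temporal/neuronal sparsity noted above
```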
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Hybrid Predictive Coding: Inferring, Fast and Slow [62.997667081978825]
We propose a hybrid predictive coding network that combines both iterative and amortized inference in a principled manner.
We demonstrate that our model is inherently sensitive to its uncertainty and adaptively balances iterative and amortized inference to obtain accurate beliefs with minimal computational expense.
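A toy linear sketch of the amortized-plus-iterative idea (the generative model, the pseudoinverse standing in for a learned encoder, and the step size are all our assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 4))          # generative weights: x ≈ W z
x = W @ rng.normal(size=4) + 0.1 * rng.normal(size=10)

# Amortized inference: one fast feedforward pass gives an initial belief.
R = np.linalg.pinv(W)                 # stand-in for a learned recognition model
z = R @ x

# Iterative inference: refine the belief by descending prediction error.
for _ in range(50):
    err = x - W @ z                   # top-down prediction error
    z += 0.05 * (W.T @ err - z)       # likelihood gradient + Gaussian prior pull
```

A hybrid scheme would spend few or many refinement steps depending on how uncertain the amortized estimate is.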
arXiv Detail & Related papers (2022-04-05T12:52:45Z)
- Towards performant and reliable undersampled MR reconstruction via diffusion model sampling [67.73698021297022]
DiffuseRecon is a novel diffusion model-based MR reconstruction method.
It guides the generation process based on the observed signals.
It does not require additional training on specific acceleration factors.
arXiv Detail & Related papers (2022-03-08T02:25:38Z)
- Factorized Neural Processes for Neural Processes: $K$-Shot Prediction of Neural Responses [9.792408261365043]
We develop a Factorized Neural Process to infer a neuron's tuning function from a small set of stimulus-response pairs.
We show on simulated responses that the predictions and reconstructed receptive fields from the Neural Process approach the ground truth as the number of trials increases.
We believe this novel deep learning systems identification framework will facilitate better real-time integration of artificial neural network modeling into neuroscience experiments.
arXiv Detail & Related papers (2020-10-22T15:43:59Z)
- A new inference approach for training shallow and deep generalized linear models of noisy interacting neurons [4.899818550820575]
We develop a two-step inference strategy that allows us to train robust generalized linear models of interacting neurons.
We show that, compared to classical methods, the models trained in this way exhibit improved performance.
The method can be extended to deep convolutional neural networks, leading to models with high predictive accuracy for both the neuron firing rates and their correlations.
arXiv Detail & Related papers (2020-06-11T15:09:53Z)
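For context on the model class in this last entry, a minimal Poisson GLM of interacting neurons fitted by plain gradient ascent (a generic formulation with made-up data, not the authors' two-step method):

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, D = 1000, 5, 3                    # time bins, neurons, stimulus features
stim = rng.normal(size=(T, D))
spikes = rng.poisson(1.0, size=(T, N)).astype(float)

b = np.zeros(N)                         # baseline log-rates
K = np.zeros((D, N))                    # stimulus filters
J = np.zeros((N, N))                    # pairwise coupling weights

past = np.vstack([np.zeros((1, N)), spikes[:-1]])       # one-bin spike history
lr = 1e-3
for _ in range(200):
    rate = np.exp(b + stim @ K + past @ J)              # conditional intensity
    err = spikes - rate                 # gradient of the Poisson log-likelihood
    b += lr * err.mean(0)
    K += lr * stim.T @ err / T
    J += lr * past.T @ err / T
```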
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.