The Butterfly Effect in Primary Visual Cortex
- URL: http://arxiv.org/abs/2104.07257v2
- Date: Sat, 23 Jul 2022 04:19:59 GMT
- Title: The Butterfly Effect in Primary Visual Cortex
- Authors: Jizhao Liu, Jing Lian, J C Sprott, Qidong Liu, Yide Ma
- Abstract summary: We propose a novel neural network called the continuous-coupled neural network (CCNN).
Numerical results show that the CCNN model exhibits periodic behavior under DC stimulus, and exhibits chaotic behavior under AC stimulus.
Experimental results on image segmentation indicate that the CCNN model outperforms state-of-the-art visual cortex neural network models.
- Score: 5.954654488330137
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Exploring and establishing artificial neural networks with
electrophysiological characteristics and high computational efficiency is a
popular topic in the field of computer vision. Inspired by the working
mechanism of primary visual cortex, pulse-coupled neural network (PCNN) can
exhibit the characteristics of synchronous oscillation, refractory period, and
exponential decay. However, electrophysiological evidence shows that the
neurons exhibit highly complex non-linear dynamics when stimulated by external
periodic signals. This chaotic behavior, also known as the "butterfly effect",
cannot be reproduced by any PCNN model. In this work, we analyze the main
obstacle preventing PCNN models from imitating real primary visual cortex. We
consider neuronal excitation as a stochastic process. We then propose a novel
neural network, called continuous-coupled neural network (CCNN). Theoretical
analysis indicates that the dynamic behavior of CCNN is distinct from PCNN.
Numerical results show that the CCNN model exhibits periodic behavior under DC
stimulus, and exhibits chaotic behavior under AC stimulus, which is consistent
with the results of real neurons. Furthermore, the image and video processing
mechanisms of the CCNN model are analyzed. Experimental results on image
segmentation indicate that the CCNN model outperforms state-of-the-art
visual cortex neural network models.
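The DC-periodic versus AC-chaotic contrast in the abstract can be illustrated with a minimal single-neuron sketch. This is an assumption-laden toy, not the paper's model: it uses the classic uncoupled PCNN update equations with hypothetical parameter values, and approximates the CCNN's continuous coupling by swapping the hard firing threshold for a sigmoid; the exact CCNN equations are given in the paper itself.

```python
import math

def simulate(stimulus, continuous, steps=200,
             af=0.1, al=0.3, ae=0.2, vf=0.5, vl=0.2, ve=20.0, beta=0.1):
    """Simulate one uncoupled PCNN-style neuron (illustrative sketch).

    All parameter values are hypothetical. With continuous=False the
    firing rule is the classic PCNN step function; with continuous=True
    it is replaced by a sigmoid, approximating the continuous coupling
    the CCNN abstract describes.
    """
    F = L = Y = 0.0   # feeding input, linking input, output
    E = 1.0           # dynamic threshold
    outputs = []
    for n in range(steps):
        S = stimulus(n)
        F = math.exp(-af) * F + vf * Y + S   # feeding compartment
        L = math.exp(-al) * L + vl * Y       # linking compartment
        U = F * (1.0 + beta * L)             # internal activity
        if continuous:
            Y = 1.0 / (1.0 + math.exp(-(U - E)))  # graded output
        else:
            Y = 1.0 if U > E else 0.0             # binary spike
        E = math.exp(-ae) * E + ve * Y       # threshold decay + refractory kick
        outputs.append(Y)
    return outputs

dc = lambda n: 0.8                             # constant (DC) stimulus
ac = lambda n: 0.8 + 0.5 * math.sin(0.3 * n)   # sinusoidal (AC) stimulus

pcnn_dc = simulate(dc, continuous=False)   # binary, periodic firing
ccnn_ac = simulate(ac, continuous=True)    # graded, irregular output
```

Plotting `ccnn_ac` against a delayed copy of itself is one quick way to inspect whether the AC-driven trajectory looks periodic or chaotic.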
Related papers
- Understanding Artificial Neural Network's Behavior from Neuron Activation Perspective [8.251799609350725]
This paper explores the intricate behavior of deep neural networks (DNNs) through the lens of neuron activation dynamics.
We propose a probabilistic framework that can analyze models' neuron activation patterns as a process.
arXiv Detail & Related papers (2024-12-24T01:01:06Z)
- TAVRNN: Temporal Attention-enhanced Variational Graph RNN Captures Neural Dynamics and Behavior [2.5282283486446757]
We introduce Temporal Attention-enhanced Variational Graph Recurrent Neural Network (TAVRNN)
TAVRNN captures temporal changes in network structure by modeling sequential snapshots of neuronal activity.
We show that TAVRNN outperforms previous baseline models in classification, clustering tasks and computational efficiency.
arXiv Detail & Related papers (2024-10-01T13:19:51Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Random-coupled Neural Network [17.53731608985241]
Pulse-coupled neural network (PCNN) is a widely applied model for imitating the characteristics of the human brain in computer vision and neural network fields.
In this study, random-coupled neural network (RCNN) is proposed.
It overcomes difficulties in PCNN's neuromorphic computing via a random inactivation process.
arXiv Detail & Related papers (2024-03-26T09:13:06Z)
- Deep Pulse-Coupled Neural Networks [31.65350290424234]
Spiking Neural Networks (SNNs) capture the information processing mechanism of the brain by taking advantage of spiking neurons.
In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., a pulse-coupled neural network (PCNN)
We construct deep pulse-coupled neural networks (DPCNNs) by replacing commonly used LIF neurons in SNNs with PCNN neurons.
arXiv Detail & Related papers (2023-12-24T08:26:00Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, which significantly mimics the information transmission of the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron that can make a trade-off between short-time performance and dynamic temporal performance.
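The LIFB neuron described above extends the standard Leaky Integrate and Fire model with a burst mode. As background, here is a minimal sketch of the plain LIF baseline it builds on; the parameter values are hypothetical and the burst extension itself is not reproduced here.

```python
import math

def lif(inputs, tau=10.0, v_th=1.0, v_reset=0.0):
    """Minimal leaky integrate-and-fire neuron (illustrative sketch).

    tau is the membrane time constant; parameter values are
    hypothetical and chosen only to show the spiking mechanism.
    """
    v = 0.0
    decay = math.exp(-1.0 / tau)     # membrane leak per time step
    spikes = []
    for x in inputs:
        v = decay * v + x            # leaky integration of input current
        if v >= v_th:                # threshold crossing emits a spike
            spikes.append(1)
            v = v_reset              # hard reset after firing
        else:
            spikes.append(0)
    return spikes

out = lif([0.3] * 20)   # constant drive produces regular spiking
```

A burst variant in the spirit of LIFB would emit more than one spike per threshold crossing depending on the neuron's recent activity, trading single-spike precision for richer temporal codes.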
arXiv Detail & Related papers (2023-01-29T04:22:07Z)
- The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks demonstrate a spectral bias towards low-frequency functions, which yields faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the Neural Tangent Kernel (NTK) of PNNs.
We find that the Π-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the
arXiv Detail & Related papers (2022-02-27T23:12:43Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.