Connected Hidden Neurons (CHNNet): An Artificial Neural Network for
Rapid Convergence
- URL: http://arxiv.org/abs/2305.10468v2
- Date: Sun, 24 Sep 2023 08:06:07 GMT
- Title: Connected Hidden Neurons (CHNNet): An Artificial Neural Network for
Rapid Convergence
- Authors: Rafiad Sadat Shahir, Zayed Humayun, Mashrufa Akter Tamim, Shouri Saha,
Md. Golam Rabiul Alam
- Abstract summary: We propose a more robust model of artificial neural networks in which the hidden neurons residing in the same hidden layer are interconnected, which leads to rapid convergence.
Through an experimental study of the proposed model in deep networks, we demonstrate that the model yields a noticeable increase in convergence rate compared to the conventional feed-forward neural network.
- Score: 0.6218519716921521
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Although artificial neural networks are inspired by the
functionalities of biological neural networks, conventional artificial neural
networks are, unlike their biological counterparts, structured hierarchically,
which can impede the flow of information between neurons because neurons in the
same layer have no connections between them. Hence, we propose a more robust
model of artificial neural networks in which the hidden neurons residing in the
same hidden layer are interconnected, which leads to rapid convergence. Through
an experimental study of the proposed model in deep networks, we demonstrate
that the model yields a noticeable increase in convergence rate compared to the
conventional feed-forward neural network.
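The abstract does not spell out the layer computation, so the following numpy sketch is only one plausible reading of the idea: a trainable intra-layer weight matrix (here called R, a hypothetical name) lets neurons of the same hidden layer exchange activations once before the layer's output is passed on.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class ConnectedHiddenLayer:
        """One plausible reading of a CHNNet-style layer (names hypothetical):
        besides the usual input-to-hidden weights W, a trainable intra-layer
        matrix R (zero diagonal, so no self-loops) mixes the hidden
        activations once before they are passed to the next layer."""

        def __init__(self, n_in, n_hidden, seed=0):
            rng = np.random.default_rng(seed)
            self.W = rng.normal(0.0, 0.1, (n_hidden, n_in))      # feed-forward weights
            self.R = rng.normal(0.0, 0.1, (n_hidden, n_hidden))  # intra-layer weights
            np.fill_diagonal(self.R, 0.0)                        # no self-connections
            self.b = np.zeros(n_hidden)

        def forward(self, x):
            h = sigmoid(self.W @ x + self.b)  # conventional hidden activation
            return sigmoid(self.R @ h + h)    # same-layer neurons exchange signals

    layer = ConnectedHiddenLayer(n_in=4, n_hidden=8)
    print(layer.forward(np.ones(4)).shape)  # (8,)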
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation. (A minimal sketch of the underlying Kuramoto update appears after this list.)
arXiv Detail & Related papers (2024-10-17T17:47:54Z) - Web Neural Network with Complete DiGraphs [8.2727500676707]
Current neural networks have structures that vaguely mimic brain structure, such as neurons, convolutions, and recurrence.
The model proposed in this paper adds additional structural properties by introducing cycles into the neuron connections and removing the sequential nature commonly seen in other network layers.
Furthermore, the model has continuous input and output, inspired by spiking neural networks, which allows the network to learn a process of classification, rather than simply returning the final result.
arXiv Detail & Related papers (2024-01-07T05:12:10Z) - Expressivity of Spiking Neural Networks [15.181458163440634]
We study the capabilities of spiking neural networks where information is encoded in the firing time of neurons.
In contrast to ReLU networks, we prove that spiking neural networks can realize both continuous and discontinuous functions. (A sketch of firing-time encoding appears after this list.)
arXiv Detail & Related papers (2023-08-16T08:45:53Z) - Mitigating Communication Costs in Neural Networks: The Role of Dendritic
Nonlinearity [28.243134476634125]
In this study, we scrutinized the importance of nonlinear dendrites within neural networks.
Our findings reveal that integrating dendritic structures can substantially enhance model capacity and performance.
arXiv Detail & Related papers (2023-06-21T00:28:20Z) - Dive into the Power of Neuronal Heterogeneity [8.6837371869842]
We show the challenges faced by backpropagation-based methods in optimizing Spiking Neural Networks (SNNs) and achieve more robust optimization of heterogeneous neurons in random networks using an Evolutionary Strategy (ES).
We find that membrane time constants play a crucial role in neural heterogeneity, and their distribution is similar to that observed in biological experiments.
arXiv Detail & Related papers (2023-05-19T07:32:29Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - Functional Connectome: Approximating Brain Networks with Artificial
Neural Networks [1.952097552284465]
We show that trained deep neural networks are able to capture the computations performed by synthetic biological networks with high accuracy.
We show that trained deep neural networks are able to perform zero-shot generalisation in novel environments.
Our study reveals a novel and promising direction in systems neuroscience, and can be expanded upon with a multitude of downstream applications.
arXiv Detail & Related papers (2022-11-23T13:12:13Z) - Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z) - Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a
Polynomial Net Study [55.12108376616355]
The study of the neural tangent kernel (NTK) has been devoted to typical neural network architectures, but is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK. (A sketch of the empirical NTK computation appears after this list.)
arXiv Detail & Related papers (2022-09-16T06:36:06Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference applications. (A sketch of the quadratic integrate-and-fire dynamics appears after this list.)
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
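For the AKOrN entry above: the summary does not give the neuron dynamics, but the classic Kuramoto model it builds on is standard. A minimal Euler-integration sketch (parameter values are illustrative, not the paper's):

    import numpy as np

    def kuramoto_step(theta, omega, K, dt=0.01):
        # One Euler step of the classic Kuramoto model:
        # dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
        n = theta.size
        diff = theta[None, :] - theta[:, None]          # theta_j - theta_i
        coupling = (K / n) * np.sin(diff).sum(axis=1)   # mean-field coupling
        return theta + dt * (omega + coupling)

    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, 16)  # initial phases
    omega = rng.normal(0.0, 1.0, 16)           # natural frequencies
    for _ in range(1000):
        theta = kuramoto_step(theta, omega, K=2.0)
    # order parameter r in [0, 1]; r near 1 means the phases have synchronized
    print(abs(np.exp(1j * theta).mean()))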
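For the "Expressivity of Spiking Neural Networks" entry: encoding information in firing times is commonly done with a time-to-first-spike scheme, where larger inputs fire earlier. A minimal sketch (the linear mapping is an assumption, not necessarily the paper's exact scheme):

    import numpy as np

    def ttfs_encode(x, t_max=1.0):
        # Time-to-first-spike encoding: larger inputs fire earlier.
        # Assumes x is scaled to [0, 1]; returns firing times in [0, t_max].
        return t_max * (1.0 - np.asarray(x, dtype=float))

    print(ttfs_encode([0.0, 0.25, 1.0]))  # [1.   0.75 0.  ]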
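For the NTK entry: the empirical NTK of a scalar-output network f is Theta(x1, x2) = grad_params f(x1) . grad_params f(x2). The sketch below estimates it with finite differences on a toy Hadamard-product network; the architecture and all names are illustrative, not the paper's.

    import numpy as np

    def empirical_ntk(f, params, x1, x2, eps=1e-5):
        # Theta(x1, x2) = grad_params f(x1) . grad_params f(x2),
        # estimated with central finite differences over a flat parameter vector.
        def grad(x):
            g = np.zeros_like(params)
            for i in range(params.size):
                p_plus, p_minus = params.copy(), params.copy()
                p_plus[i] += eps
                p_minus[i] -= eps
                g[i] = (f(p_plus, x) - f(p_minus, x)) / (2.0 * eps)
            return g
        return grad(x1) @ grad(x2)

    # toy scalar-output net with a Hadamard (elementwise) product of two
    # linear branches, loosely in the spirit of NNs-Hp (names hypothetical)
    def f(params, x):
        W1 = params[:4].reshape(2, 2)
        W2 = params[4:8].reshape(2, 2)
        v = params[8:]
        return v @ ((W1 @ x) * (W2 @ x))

    rng = np.random.default_rng(0)
    params = rng.normal(0.0, 1.0, 10)
    print(empirical_ntk(f, params, np.array([1.0, 0.5]), np.array([0.2, -1.0])))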
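For the POPPINS entry: the quadratic integrate-and-fire (QIF) model underlying the processor is standard; POPPINS implements an integer-arithmetic variant in hardware. A float sketch of the dynamics, assuming the common form dv/dt = (v - v_rest)(v - v_crit) + I (parameter values are illustrative):

    def qif_run(I, v_rest=-0.5, v_crit=0.5, v_peak=5.0, v_reset=-1.0,
                dt=1e-3, steps=5000):
        # Simulate one quadratic integrate-and-fire neuron:
        # dv/dt = (v - v_rest) * (v - v_crit) + I, with spike-and-reset
        # when v crosses v_peak. Float sketch only; POPPINS uses an
        # integer-arithmetic variant of this model in hardware.
        v, spikes = v_rest, []
        for t in range(steps):
            v += dt * ((v - v_rest) * (v - v_crit) + I)
            if v >= v_peak:
                spikes.append(t * dt)   # record spike time
                v = v_reset             # reset membrane potential
        return spikes

    print(len(qif_run(I=1.0)), "spikes in 5 time units")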