Exploring Structural Nonlinearity in Binary Polariton-Based Neuromorphic Architectures
- URL: http://arxiv.org/abs/2411.06124v1
- Date: Sat, 09 Nov 2024 09:29:46 GMT
- Title: Exploring Structural Nonlinearity in Binary Polariton-Based Neuromorphic Architectures
- Authors: Evgeny Sedov, Alexey Kavokin
- Abstract summary: We show that structural nonlinearity, derived from the network's layout, plays a crucial role in facilitating complex computational tasks.
This shift in focus from individual neuron properties to network architecture could lead to significant advancements in the efficiency and applicability of neuromorphic computing.
- Abstract: This study investigates the performance of a binarized neuromorphic network leveraging polariton dyads (optically excited pairs of interfering polariton condensates within a microcavity) to function as binary logic gate neurons. Employing numerical simulations, we explore various neuron configurations, both linear (NAND, NOR) and nonlinear (XNOR), to assess their effectiveness in image classification tasks. We demonstrate that structural nonlinearity, derived from the network's layout, plays a crucial role in facilitating complex computational tasks, effectively reducing the reliance on the inherent nonlinearity of individual neurons. Our findings suggest that the network's configuration and the interaction among its elements can emulate the benefits of nonlinearity, thus potentially simplifying the design and manufacturing of neuromorphic systems and enhancing their scalability. This shift in focus from individual neuron properties to network architecture could lead to significant advancements in the efficiency and applicability of neuromorphic computing.
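The idea of structural nonlinearity in the abstract can be sketched with a toy example. The code below is an illustrative assumption, not the paper's polariton simulation: each NAND gate is modeled as a single linear-threshold "neuron", and the XNOR function, which no single linear-threshold unit can compute, emerges purely from how five such gates are wired together.

```python
def nand(a, b):
    # A single linear-threshold neuron: weights (-1, -1), bias 1.5.
    # Fires (returns 1) unless both inputs are high.
    return int(-a - b + 1.5 > 0)

def xnor(a, b):
    # Structural nonlinearity: XNOR is not linearly separable, yet it
    # emerges from a fixed layout of five linear-threshold NAND neurons.
    c = nand(a, b)
    x = nand(nand(a, c), nand(b, c))  # XOR built from four NANDs
    return nand(x, x)                 # a NAND with tied inputs acts as NOT

truth_table = {(a, b): xnor(a, b) for a in (0, 1) for b in (0, 1)}
print(truth_table)  # {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
```

No single choice of weights and bias can separate {(0,0), (1,1)} from {(0,1), (1,0)}, so the classification power here comes from the network's layout rather than from any individual neuron, which is the point the abstract makes.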
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Non-linear classification capability of quantum neural networks due to emergent quantum metastability [0.0]
We show that effective non-linearities can be implemented in quantum neural networks.
By using a quantum neural network whose architecture is inspired by dissipative many-body quantum spin models, we show that this mechanism indeed makes non-linear data classification possible.
arXiv Detail & Related papers (2024-08-20T12:01:07Z)
- Mitigating Communication Costs in Neural Networks: The Role of Dendritic Nonlinearity [28.243134476634125]
In this study, we scrutinized the importance of nonlinear dendrites within neural networks.
Our findings reveal that integrating dendritic structures can substantially enhance model capacity and performance.
arXiv Detail & Related papers (2023-06-21T00:28:20Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Functional2Structural: Cross-Modality Brain Networks Representation Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Random Graph-Based Neuromorphic Learning with a Layer-Weaken Structure [4.477401614534202]
We transform random graph theory into a neural network model with practical meaning, based on clarifying the input-output relationship of each neuron.
Using this low-operation-cost approach, neurons are assigned to several groups whose connection relationships can be regarded as uniform representations of the random graphs they belong to.
We develop a joint classification mechanism involving information interaction between multiple RGNNs and realize significant performance improvements in supervised learning for three benchmark tasks.
arXiv Detail & Related papers (2021-11-17T03:37:06Z)
- Stability Analysis of Fractional Order Memristor Synapse-coupled Hopfield Neural Network with Ring Structure [0.0]
We first present a fractional-order memristor synapse-coupled Hopfield neural network with two neurons.
We extend the model to a neural network with a ring structure that consists of n sub-networks of neurons, increasing synchronization in the network.
In the n-neuron case, it is revealed that the stability depends on the structure and number of sub-networks.
arXiv Detail & Related papers (2021-09-29T12:33:23Z)
- On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs [1.0009912692042526]
This work goes beyond the focus of current neuromorphic computing architectures on computational models for neurons and synapses.
We explore the role of glial cells in the fault-tolerant capacity of Spiking Neural Networks trained in an unsupervised fashion using Spike-Timing Dependent Plasticity (STDP).
We characterize the degree of self-repair that can be enabled in such networks with varying degrees of faults ranging from 50% to 90% and evaluate our proposal on the MNIST and Fashion-MNIST datasets.
arXiv Detail & Related papers (2020-09-08T01:14:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.