Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics
Organized by Astrocyte-modulated Plasticity
- URL: http://arxiv.org/abs/2111.01760v1
- Date: Tue, 26 Oct 2021 23:04:40 GMT
- Title: Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics
Organized by Astrocyte-modulated Plasticity
- Authors: Vladimir A. Ivanov, Konstantinos P. Michmizos
- Abstract summary: The liquid state machine (LSM) tunes its internal weights without backpropagation of gradients.
Recent findings suggest that astrocytes, a long-neglected non-neuronal brain cell, modulate synaptic plasticity and brain dynamics.
We propose the neuron-astrocyte liquid state machine (NALSM) that addresses under-performance through self-organized near-critical dynamics.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The liquid state machine (LSM) combines low training complexity and
biological plausibility, which has made it an attractive machine learning
framework for edge and neuromorphic computing paradigms. Originally proposed as
a model of brain computation, the LSM tunes its internal weights without
backpropagation of gradients, which results in lower performance compared to
multi-layer neural networks. Recent findings in neuroscience suggest that
astrocytes, a long-neglected non-neuronal brain cell, modulate synaptic
plasticity and brain dynamics, tuning brain networks to the vicinity of the
computationally optimal critical phase transition between order and chaos.
Inspired by this disruptive understanding of how brain networks self-tune, we
propose the neuron-astrocyte liquid state machine (NALSM) that addresses
under-performance through self-organized near-critical dynamics. Similar to its
biological counterpart, the astrocyte model integrates neuronal activity and
provides global feedback to spike-timing-dependent plasticity (STDP), which
self-organizes NALSM dynamics around a critical branching factor that is
associated with the edge-of-chaos. We demonstrate that NALSM achieves
state-of-the-art accuracy versus comparable LSM methods, without the need for
data-specific hand-tuning. With a top accuracy of 97.61% on MNIST, 97.51% on
N-MNIST, and 85.84% on Fashion-MNIST, NALSM achieved comparable performance to
current fully-connected multi-layer spiking neural networks trained via
backpropagation. Our findings suggest that the further development of
brain-inspired machine learning methods has the potential to reach the
performance of deep learning, with the added benefits of supporting robust and
energy-efficient neuromorphic computing on the edge.
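
The core mechanism the abstract describes lends itself to a compact illustration. The sketch below is not the authors' implementation: it reduces the liquid to a binary probabilistic network, estimates the empirical branching factor (descendant spikes per ancestor spike, critical at sigma = 1), and lets a slowly integrating astrocyte variable bias the balance between STDP potentiation and depression. All names and constants (a_star, tau, eta, the weight cap) are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): a global astrocyte signal steering
# STDP toward a critical branching factor of ~1 in a toy binary liquid.
import numpy as np

rng = np.random.default_rng(0)
N = 200                                     # liquid size (assumed)
W = np.abs(rng.normal(0.0, 0.05, (N, N)))   # non-negative recurrent weights
np.fill_diagonal(W, 0.0)                    # no self-connections

a_star = 1.0          # target branching factor: the critical point
astro = 0.0           # astrocyte state: integrated deviation from criticality
tau, eta = 0.9, 0.01  # astrocyte decay and STDP learning rate (assumed)

sigma = 0.0
spikes = (rng.random(N) < 0.05).astype(float)
for t in range(500):
    # Probabilistic spiking with a small spontaneous drive so activity persists.
    p = np.clip(W @ spikes + 0.01, 0.0, 1.0)
    new_spikes = (rng.random(N) < p).astype(float)

    # Empirical branching factor: descendant spikes per ancestor spike.
    if spikes.sum() > 0:
        sigma = new_spikes.sum() / spikes.sum()
        astro = np.clip(tau * astro + (1.0 - tau) * (sigma - a_star), -1.0, 1.0)

    # Pairwise STDP with global astrocyte feedback: a supercritical liquid
    # (astro > 0) shifts the balance toward depression, a subcritical one
    # toward potentiation, nudging sigma back to 1.
    ltp = np.outer(new_spikes, spikes)   # post fires after pre -> potentiate
    ltd = np.outer(spikes, new_spikes)   # post fires before pre -> depress
    W = np.clip(W + eta * ((1.0 - astro) * ltp - (1.0 + astro) * ltd), 0.0, 0.5)

    spikes = new_spikes

print(f"branching factor after self-organization: {sigma:.2f}")
```

In this toy controller the astrocyte acts purely as a homeostatic set-point regulator; in the NALSM the self-organized liquid additionally serves as a reservoir whose states are read out by a trained output layer.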
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron accurately matches the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Astromorphic Self-Repair of Neuromorphic Hardware Systems [0.8958368012475248]
This paper explores the self-repair role of glial cells, in particular astrocytes.
Hardware-software co-design analysis reveals that bio-morphic astrocytic regulation has the potential to self-repair realistic hardware faults.
arXiv Detail & Related papers (2022-09-15T16:23:45Z)
- Modeling Associative Plasticity between Synapses to Enhance Learning of Spiking Neural Networks [4.736525128377909]
Spiking Neural Networks (SNNs) are the third generation of artificial neural networks that enable energy-efficient implementation on neuromorphic hardware.
We propose a robust and effective learning mechanism by modeling the associative plasticity between synapses.
Our approaches achieve superior performance on both static and state-of-the-art neuromorphic datasets.
arXiv Detail & Related papers (2022-07-24T06:12:23Z)
- On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs [1.0009912692042526]
This work goes beyond the focus of current neuromorphic computing architectures on computational models of neurons and synapses.
We explore the role of glial cells in the fault-tolerant capacity of Spiking Neural Networks trained in an unsupervised fashion using Spike-Timing-Dependent Plasticity (STDP).
We characterize the degree of self-repair that can be enabled in such networks with varying degrees of faults ranging from 50% to 90%, and evaluate our proposal on the MNIST and Fashion-MNIST datasets.
arXiv Detail & Related papers (2020-09-08T01:14:53Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)