Brain-inspired global-local learning incorporated with neuromorphic
computing
- URL: http://arxiv.org/abs/2006.03226v3
- Date: Tue, 22 Jun 2021 01:15:53 GMT
- Title: Brain-inspired global-local learning incorporated with neuromorphic
computing
- Authors: Yujie Wu, Rong Zhao, Jun Zhu, Feng Chen, Mingkun Xu, Guoqi Li, Sen
Song, Lei Deng, Guanrui Wang, Hao Zheng, Jing Pei, Youhui Zhang, Mingguo
Zhao, and Luping Shi
- Abstract summary: We report a neuromorphic hybrid learning model by introducing a brain-inspired meta-learning paradigm and a differentiable spiking model incorporating neuronal dynamics and synaptic plasticity.
We demonstrate the advantages of this model in multiple different tasks, including few-shot learning, continual learning, and fault-tolerance learning in neuromorphic vision sensors.
- Score: 35.70151531581922
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Two main routes of learning methods exist at present: error-driven
global learning and neuroscience-oriented local learning. Integrating them into
one network may provide complementary learning capabilities for versatile
learning scenarios. At the same time, neuromorphic computing holds great
promise but still needs plenty of useful algorithms and algorithm-hardware
co-designs to exploit its advantages. Here, we report a neuromorphic hybrid
learning model by introducing a brain-inspired meta-learning paradigm and a
differentiable spiking model incorporating neuronal dynamics and synaptic
plasticity. It can meta-learn local plasticity and receive top-down supervision
information for multiscale synergic learning. We demonstrate the advantages of
this model in multiple different tasks, including few-shot learning, continual
learning, and fault-tolerance learning in neuromorphic vision sensors. It
achieves significantly higher performance than single-learning methods and
shows promise in empowering a revolution in neuromorphic applications. We
further implemented the hybrid model on the Tianjic neuromorphic platform by
exploiting algorithm-hardware co-designs, and showed that the model can fully
utilize the neuromorphic many-core architecture to develop a hybrid computation
paradigm.
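The multiscale synergy the abstract describes, local plasticity meta-learned under top-down supervision, can be illustrated in miniature. The sketch below is an assumption for illustration only, not the paper's actual model: a toy linear task where an error-driven global term is combined with an error-gated local Hebbian term (three-factor style); the rates `eta_global` and `eta_local` are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task standing in for a supervised objective.
X = rng.normal(size=(200, 4))
W_target = rng.normal(size=(4, 2))
Y = X @ W_target

def loss(W):
    return float(np.mean((X @ W - Y) ** 2))

W = np.zeros((4, 2))
eta_global, eta_local = 0.05, 0.001  # illustrative rates, not from the paper

losses = [loss(W)]
for step in range(200):
    i = rng.integers(0, len(X))
    x, y = X[i], Y[i]
    out = x @ W
    err = y - out                    # top-down (global) error signal
    hebb = np.outer(x, out)          # local pre/post co-activity trace
    # Hybrid update: error-driven global term plus an error-gated
    # local Hebbian term; the local term vanishes as the error shrinks.
    W += eta_global * np.outer(x, err) + eta_local * err.mean() * hebb
    losses.append(loss(W))
```

Note that the target mapping is a fixed point of the combined rule, since both terms are gated by the error; this is what keeps the local term from pulling the weights away from the supervised solution.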
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Emulating Brain-like Rapid Learning in Neuromorphic Edge Computing [3.735012564657653]
Digital neuromorphic technology simulates the neural and synaptic processes of the brain using two stages of learning.
We demonstrate our approach using event-driven vision sensor data and the Intel Loihi neuromorphic processor with its plasticity dynamics.
Our methodology can be deployed with arbitrary plasticity models and can be applied to situations demanding quick learning and adaptation at the edge.
arXiv Detail & Related papers (2024-08-28T13:51:52Z) - Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - A Differentiable Approach to Multi-scale Brain Modeling [3.5874544981360987]
We present a multi-scale differentiable brain modeling workflow utilizing BrainPy, a unique differentiable brain simulator.
At the single-neuron level, we implement differentiable neuron models and employ gradient methods to optimize their fit to electrophysiological data.
On the network level, we incorporate connectomic data to construct biologically constrained network models.
arXiv Detail & Related papers (2024-06-28T07:41:31Z) - Unsupervised representation learning with Hebbian synaptic and structural plasticity in brain-like feedforward neural networks [0.0]
We introduce and evaluate a brain-like neural network model capable of unsupervised representation learning.
The model was tested on a diverse set of popular machine learning benchmarks.
arXiv Detail & Related papers (2024-06-07T08:32:30Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Hebbian Learning based Orthogonal Projection for Continual Learning of
Spiking Neural Networks [74.3099028063756]
We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities.
Our method consistently enables spiking neural networks to learn continually with nearly zero forgetting.
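The claim that Hebbian learning on lateral connections can extract the principal subspace of neural activities has a classic single-unit illustration in Oja's rule; the sketch below is my own minimal example, not the paper's method, and the data and rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 2-D data; the leading eigenvector of C is (1, 1)/sqrt(2).
C = np.array([[3.0, 2.0], [2.0, 3.0]])
X = rng.multivariate_normal(np.zeros(2), C, size=10000)

w = rng.normal(size=2)
eta = 0.005  # illustrative rate
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian growth plus a decay term

w /= np.linalg.norm(w)
pc1 = np.array([1.0, 1.0]) / np.sqrt(2.0)
alignment = abs(w @ pc1)  # approaches 1 as w aligns with the principal axis
```

The decay term `-eta * y * y * w` is what keeps the weight vector bounded without an explicit normalization step, which is the usual motivation for Oja's modification of plain Hebbian learning.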
arXiv Detail & Related papers (2024-02-19T09:29:37Z) - Multimodal foundation models are better simulators of the human brain [65.10501322822881]
We present a newly-designed multimodal foundation model pre-trained on 15 million image-text pairs.
We find that both visual and lingual encoders trained multimodally are more brain-like compared with unimodal ones.
arXiv Detail & Related papers (2022-08-17T12:36:26Z) - Learning to learn online with neuromodulated synaptic plasticity in
spiking neural networks [0.0]
We show that models of neuromodulated synaptic plasticity from neuroscience can be trained to learn through gradient descent.
This framework opens a new path toward developing neuroscience inspired online learning algorithms.
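Training a plasticity rule itself by gradient descent can be shown in miniature. The sketch below is an assumption for illustration, not the paper's framework: it meta-learns the rate of a local, error-modulated update using a sign-based finite-difference outer loop rather than backpropagation through the inner loop.

```python
import numpy as np

def adapted_loss(eta, steps=20):
    """Run a short inner loop of local, error-modulated plasticity
    with rate eta and return the post-adaptation loss."""
    r = np.random.default_rng(3)      # fixed task, so the outer loop is deterministic
    X = r.normal(size=(steps, 3))
    w_target = r.normal(size=3)
    Y = X @ w_target
    w = np.zeros(3)
    for x, y in zip(X, Y):
        w += eta * (y - x @ w) * x    # local update gated by an error signal
    return float(np.mean((X @ w - Y) ** 2))

# Outer (meta) loop: adjust eta along a finite-difference gradient estimate.
eta, meta_step, h = 0.01, 0.005, 1e-4
before = adapted_loss(eta)
for _ in range(50):
    g = (adapted_loss(eta + h) - adapted_loss(eta - h)) / (2 * h)
    eta -= meta_step * np.sign(g)     # sign step keeps the sketch stable
after = adapted_loss(eta)
```

A differentiable framework like the one summarized above would instead backpropagate through the inner updates; the finite-difference outer loop here only stands in for that gradient.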
arXiv Detail & Related papers (2022-06-25T00:28:40Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)