A Neuromorphic Paradigm for Online Unsupervised Clustering
- URL: http://arxiv.org/abs/2005.04170v1
- Date: Sat, 25 Apr 2020 14:02:34 GMT
- Title: A Neuromorphic Paradigm for Online Unsupervised Clustering
- Authors: James E. Smith
- Abstract summary: A computational paradigm based on neuroscientific concepts is proposed and shown to be capable of online unsupervised clustering.
All operations, both training and inference, are localized and efficient.
The prototype column is simulated with a semi-synthetic benchmark and is shown to have performance characteristics on par with classic k-means.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A computational paradigm based on neuroscientific concepts is proposed and
shown to be capable of online unsupervised clustering. Because it is an online
method, it is readily amenable to streaming real-time applications and is
capable of dynamically adjusting to macro-level input changes. All operations,
both training and inference, are localized and efficient. The paradigm is
implemented as a cognitive column that incorporates five key elements: 1)
temporal coding, 2) an excitatory neuron model for inference, 3)
winner-take-all inhibition, 4) a column architecture that combines excitation
and inhibition, 5) localized training via spike timing dependent plasticity
(STDP). These elements are described and discussed, and a prototype column is
given. The prototype column is simulated with a semi-synthetic benchmark and is
shown to have performance characteristics on par with classic k-means.
Simulations reveal the inner operation and capabilities of the column with
emphasis on excitatory neuron response functions and STDP implementations.
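The five elements above can be illustrated with a minimal sketch, not the paper's exact model: a column of units that assigns each streaming input to a winner (winner-take-all inhibition) and nudges only the winner's weights toward the input (a localized, STDP-like rule). All names here (`Column`, `n_units`, `lr`) and the simplified response function are our assumptions for illustration.

```python
# Illustrative sketch of online unsupervised clustering in the spirit of the
# column: inference by winner-take-all, training by a local update of only the
# winning unit. This is a simplification, not the paper's spiking model.
import numpy as np

class Column:
    def __init__(self, n_inputs, n_units, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        # Random synaptic weights in (0.2, 0.8), one row per excitatory unit.
        self.w = rng.uniform(0.2, 0.8, size=(n_units, n_inputs))
        self.lr = lr

    def infer(self, x):
        # Stand-in for temporal coding: stronger inputs would spike earlier;
        # here the dot product plays the role of "earliest/strongest response".
        responses = self.w @ x
        return int(np.argmax(responses))   # winner-take-all inhibition

    def train_step(self, x, winner):
        # Localized, STDP-like rule: only the winner's weights move, toward
        # the current input pattern; all other units are untouched.
        self.w[winner] += self.lr * (x - self.w[winner])

    def fit_online(self, stream):
        # Online operation: one sample at a time, inference then training,
        # so the column can track macro-level changes in the input stream.
        labels = []
        for x in stream:
            k = self.infer(x)
            self.train_step(x, k)
            labels.append(k)
        return labels
```

Run this way, the column behaves much like online k-means with a winner-take-all assignment step, which is consistent with the reported parity with classic k-means.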
Related papers
- EulerFormer: Sequential User Behavior Modeling with Complex Vector Attention [88.45459681677369]
We propose a novel transformer variant with complex vector attention, named EulerFormer.
It provides a unified theoretical framework to formulate both semantic difference and positional difference.
It is more robust to semantic variations and possesses superior theoretical properties in principle.
arXiv Detail & Related papers (2024-03-26T14:18:43Z)
- Enhancing Neural Training via a Correlated Dynamics Model [2.9302545029880394]
Correlation Mode Decomposition (CMD) is an algorithm that clusters the parameter space into groups, that display synchronized behavior across epochs.
We introduce an efficient CMD variant, designed to run concurrently with training.
Our experiments indicate that CMD surpasses the state-of-the-art method for compactly modeled dynamics on image classification.
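The core idea summarized above, grouping parameters whose trajectories across epochs move in lockstep, can be sketched as follows. The two-mode grouping by sign of correlation with a reference trajectory is our simplification for illustration, not the CMD algorithm itself.

```python
# Hedged sketch: cluster parameters by the correlation of their trajectories
# across training epochs. A real CMD implementation is more elaborate.
import numpy as np

def correlation_modes(trajectories, ref_idx=0):
    """trajectories: (n_params, n_epochs) array of parameter values per epoch.
    Returns one integer label per parameter: 0 if its trajectory is positively
    correlated with the reference parameter's trajectory, 1 otherwise."""
    t = trajectories - trajectories.mean(axis=1, keepdims=True)  # center rows
    norms = np.linalg.norm(t, axis=1)
    norms[norms == 0] = 1.0                      # guard constant trajectories
    t = t / norms[:, None]                       # unit-norm trajectories
    corr = t @ t[ref_idx]                        # cosine similarity to reference
    return (corr < 0).astype(int)                # split into two correlation modes
```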
arXiv Detail & Related papers (2023-12-20T18:22:49Z)
- Sparse Modular Activation for Efficient Sequence Modeling [94.11125833685583]
Recent models combining Linear State Space Models with self-attention mechanisms have demonstrated impressive results across a range of sequence modeling tasks.
Current approaches apply attention modules statically and uniformly to all elements in the input sequences, leading to sub-optimal quality-efficiency trade-offs.
We introduce Sparse Modular Activation (SMA), a general mechanism enabling neural networks to sparsely activate sub-modules for sequence elements in a differentiable manner.
arXiv Detail & Related papers (2023-06-19T23:10:02Z)
- Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of Spiking Neural Systems [73.18020682258606]
We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel.
We propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP)
Our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- ETLP: Event-based Three-factor Local Plasticity for online learning with neuromorphic hardware [105.54048699217668]
We show a competitive performance in accuracy with a clear advantage in the computational complexity for Event-Based Three-factor Local Plasticity (ETLP)
We also show that when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn temporal patterns with a rich temporal structure.
arXiv Detail & Related papers (2023-01-19T19:45:42Z)
- Hippocampus-Inspired Cognitive Architecture (HICA) for Operant Conditioning [1.2955718209635252]
We propose a Hippocampus-Inspired Cognitive Architecture (HICA) as a neural mechanism for operant conditioning.
HICA is composed of two different types of modules.
arXiv Detail & Related papers (2022-12-16T18:00:21Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate STNP outperforms the baselines in the learning setting and LIG achieves the state-of-the-art for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z)
- Theory of gating in recurrent neural networks [5.672132510411465]
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience.
Here, we show that gating offers flexible control of two salient features of the collective dynamics.
The gate controlling timescales leads to a novel, marginally stable state, where the network functions as a flexible integrator.
arXiv Detail & Related papers (2020-07-29T13:20:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.