NeurOptimisation: The Spiking Way to Evolve
- URL: http://arxiv.org/abs/2507.08320v1
- Date: Fri, 11 Jul 2025 05:18:13 GMT
- Title: NeurOptimisation: The Spiking Way to Evolve
- Authors: Jorge Mario Cruz-Duarte, El-Ghazali Talbi
- Abstract summary: We present a fully spike-based framework that materialises the neuromorphic-based metaheuristic paradigm through a decentralised NC system. We implement this framework on Intel's Lava platform, targeting the Loihi 2 chip, and evaluate it on the noiseless BBOB suite up to 40 dimensions. Results show that the proposed approach exhibits structured population dynamics, consistent convergence, and milliwatt-level power feasibility.
- Score: 0.3069335774032178
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The increasing energy footprint of artificial intelligence systems urges alternative computational models that are both efficient and scalable. Neuromorphic Computing (NC) addresses this challenge by empowering event-driven algorithms that operate with minimal power requirements through biologically inspired spiking dynamics. We present the NeurOptimiser, a fully spike-based optimisation framework that materialises the neuromorphic-based metaheuristic paradigm through a decentralised NC system. The proposed approach comprises a population of Neuromorphic Heuristic Units (NHUs), each combining spiking neuron dynamics with spike-triggered perturbation heuristics to evolve candidate solutions asynchronously. The NeurOptimiser's coordination arises through native spiking mechanisms that support activity propagation, local information sharing, and global state updates without external orchestration. We implement this framework on Intel's Lava platform, targeting the Loihi 2 chip, and evaluate it on the noiseless BBOB suite up to 40 dimensions. We deploy several NeurOptimisers using different configurations, mainly considering dynamic systems such as linear and Izhikevich models for spiking neural dynamics, and fixed and Differential Evolution mutation rules for spike-triggered heuristics. Although these configurations are implemented as a proof of concept, we document and outline further extensions and improvements to the framework implementation. Results show that the proposed approach exhibits structured population dynamics, consistent convergence, and milliwatt-level power feasibility. They also position spike-native MHs as a viable path toward real-time, low-energy, and decentralised optimisation.
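The abstract describes each Neuromorphic Heuristic Unit (NHU) as a spiking neuron whose firing events trigger a perturbation heuristic on a candidate solution, with Izhikevich dynamics and Differential Evolution mutation among the tested configurations. The following plain-NumPy sketch is a rough, hypothetical illustration of that coupling on a sphere function; it does not use the Lava API or the authors' Loihi 2 implementation, and all names, constants, and the drive signal are assumptions:

```python
import numpy as np

def izhikevich_step(v, u, I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """One Euler step of the Izhikevich neuron; returns (v, u, spiked)."""
    v = v + dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u = u + dt * a * (b * v - u)
    if v >= 30.0:                                 # spike threshold
        return c, u + d, True
    return v, u, False

def sphere(x):                                    # stand-in for a BBOB objective
    return float(np.sum(x**2))

rng = np.random.default_rng(0)
dim, n_units = 10, 20
X = rng.uniform(-5, 5, (n_units, dim))            # candidate solutions, one per NHU
fit = np.array([sphere(x) for x in X])
v = np.full(n_units, -65.0)                       # membrane potentials
u = 0.2 * v                                       # recovery variables

for t in range(2000):
    for i in range(n_units):
        # Assumed drive: units with relatively poor fitness receive stronger input.
        I = 10.0 * fit[i] / (fit.max() + 1e-12)
        v[i], u[i], spiked = izhikevich_step(v[i], u[i], I)
        if spiked:
            # Spike-triggered DE/rand/1-style mutation using three other units.
            r1, r2, r3 = rng.choice(n_units, 3, replace=False)
            trial = np.clip(X[r1] + 0.5 * (X[r2] - X[r3]), -5, 5)
            f_trial = sphere(trial)
            if f_trial < fit[i]:                  # greedy, local acceptance
                X[i], fit[i] = trial, f_trial

print("best value:", fit.min())
```

In the framework itself these updates run asynchronously on neuromorphic cores and the coordination travels over spikes; the sequential loop above only mimics that behaviour for readability.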
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
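For reference, the underdamped Langevin equation mentioned above, dx = v dt and dv = (-grad U(x) - gamma v + f(t)) dt + sqrt(2 gamma T) dW, can be integrated with a simple Euler-Maruyama scheme. The sketch below is a generic illustration of those dynamics with a placeholder potential; it is not the LangevinFlow model, which embeds these terms inside a sequential VAE:

```python
import numpy as np

def underdamped_langevin(x0, v0, grad_U, steps=1000, dt=1e-2,
                         gamma=1.0, temperature=1.0, force=None, seed=0):
    """Euler-Maruyama integration of  dx = v dt,
    dv = (-grad_U(x) - gamma*v + f(t)) dt + sqrt(2*gamma*T) dW."""
    rng = np.random.default_rng(seed)
    x, v = np.array(x0, float), np.array(v0, float)
    traj = [x.copy()]
    for t in range(steps):
        f = force(t * dt) if force is not None else 0.0
        noise = np.sqrt(2.0 * gamma * temperature * dt) * rng.standard_normal(x.shape)
        v = v + dt * (-grad_U(x) - gamma * v + f) + noise
        x = x + dt * v
        traj.append(x.copy())
    return np.array(traj)

# Example: harmonic potential U(x) = 0.5 * |x|^2, so grad_U(x) = x.
path = underdamped_langevin(x0=[2.0, -1.0], v0=[0.0, 0.0], grad_U=lambda x: x)
print(path[-1])
```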
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z)
- Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations. In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Advancing Spatio-Temporal Processing in Spiking Neural Networks through Adaptation [6.233189707488025]
Spiking neural networks on neuromorphic hardware promise orders of magnitude less power consumption than their non-spiking counterparts. The standard neuron model for spike-based computation on such systems has long been the leaky integrate-and-fire (LIF) neuron; extending it with adaptation mechanisms improves performance on temporal tasks, but the root of the benefits of these so-called adaptive LIF neurons is not well understood.
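For context, a minimal discrete-time leaky integrate-and-fire neuron with a spike-triggered adaptation current is sketched below; the update rule and constants are generic textbook choices, not the specific adaptive LIF formulation analysed in that paper:

```python
import numpy as np

def adaptive_lif(I, dt=1e-3, tau_m=20e-3, tau_w=200e-3,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0, b=0.1):
    """Leaky integrate-and-fire with an adaptation variable w that grows
    at every spike and decays between spikes (spike-frequency adaptation)."""
    v, w = v_rest, 0.0
    spike_times = []
    for t, i_t in enumerate(I):
        v += dt / tau_m * (v_rest - v + i_t - w)   # leaky membrane + adaptation current
        w += dt / tau_w * (-w)                     # adaptation decays toward zero
        if v >= v_thresh:
            spike_times.append(t * dt)
            v = v_reset
            w += b                                 # spike-triggered adaptation increment
    return spike_times

# Under constant drive, the adaptation current lengthens the inter-spike intervals.
times = adaptive_lif(I=np.full(2000, 2.0))
print(len(times), "spikes; first vs last inter-spike interval:",
      times[1] - times[0], times[-1] - times[-2])
```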
arXiv Detail & Related papers (2024-08-14T12:49:58Z)
- An Attempt to Devise a Pairwise Ising-Type Maximum Entropy Model Integrated Cost Function for Optimizing SNN Deployment [0.0]
Spiking Neural Networks (SNNs) emulate the spiking behavior of biological neurons and are typically deployed on distributed-memory neuromorphic hardware. We model SNN dynamics using an Ising-type pairwise interaction framework, bridging microscopic neuron interactions with macroscopic network behavior. We evaluate our approach on two SNNs deployed on the sPyNNaker neuromorphic platform.
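A pairwise Ising-type energy of the kind referenced above is E(s) = -sum_i h_i s_i - sum_{i<j} J_ij s_i s_j over binary states s_i in {-1, +1}. The toy sketch below evaluates this energy and relaxes it with single-spin Metropolis flips; it is a generic illustration of the framework, not the paper's SNN-deployment cost function:

```python
import numpy as np

def ising_energy(s, h, J):
    """E(s) = -h.s - 0.5 * s.J.s  with J symmetric and zero diagonal."""
    return -float(h @ s) - 0.5 * float(s @ J @ s)

def metropolis(s, h, J, beta=1.0, sweeps=200, seed=0):
    """Single-spin-flip Metropolis sampling of the pairwise Ising model."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(sweeps * len(s)):
        i = rng.integers(len(s))
        dE = 2.0 * s[i] * (h[i] + J[i] @ s)        # energy change of flipping spin i
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]
    return s

rng = np.random.default_rng(1)
n = 16
J = rng.normal(0, 0.2, (n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0.0)
h = rng.normal(0, 0.5, n)
s0 = rng.choice([-1, 1], n)
s1 = metropolis(s0, h, J, beta=2.0)
print(ising_energy(s0, h, J), "->", ising_energy(s1, h, J))
```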
arXiv Detail & Related papers (2024-07-09T16:33:43Z)
- ON-OFF Neuromorphic ISING Machines using Fowler-Nordheim Annealers [4.429465736433621]
We introduce NeuroSA, a neuromorphic architecture specifically designed to ensure convergence to the ground state of an Ising problem. Across multiple runs, NeuroSA consistently generates solutions that are concentrated around the state-of-the-art results (within 99%) or surpass the current state-of-the-art solutions for Max Independent Set benchmarks. For practical illustration, we present results from an implementation of NeuroSA on the SpiNNaker2 platform.
arXiv Detail & Related papers (2024-06-07T19:18:09Z)
- Hallmarks of Optimization Trajectories in Neural Networks: Directional Exploration and Redundancy [75.15685966213832]
We analyze the rich directional structure of optimization trajectories, represented by their pointwise parameters.
We show that, from some point onwards in training, updating only the scalar batch-normalization parameters matches the performance of training the entire network.
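The batch-normalization claim above amounts to freezing every weight at some point during training and continuing to update only the BatchNorm scale and shift scalars. A hypothetical PyTorch sketch of that switch (the model, data, and switch epoch are placeholders, not the paper's setup):

```python
import torch
import torch.nn as nn

def freeze_all_but_batchnorm(model: nn.Module) -> None:
    """Disable gradients everywhere except the affine (gamma/beta) parameters
    of BatchNorm layers, so only those scalars keep training."""
    for p in model.parameters():
        p.requires_grad = False
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            if m.weight is not None:
                m.weight.requires_grad = True
            if m.bias is not None:
                m.bias.requires_grad = True

model = nn.Sequential(nn.Linear(32, 64), nn.BatchNorm1d(64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

switch_epoch = 5                                   # hypothetical freezing point
for epoch in range(10):
    if epoch == switch_epoch:
        freeze_all_but_batchnorm(model)
    x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))   # stand-in batch
    loss = criterion(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```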
arXiv Detail & Related papers (2024-03-12T07:32:47Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- Astrocyte-Integrated Dynamic Function Exchange in Spiking Neural Networks [0.0]
This paper presents an innovative methodology for improving the robustness and computational efficiency of Spiking Neural Networks (SNNs).
The proposed approach integrates astrocytes, a type of glial cell prevalent in the human brain, into SNNs, creating astrocyte-augmented networks.
Notably, our astrocyte-augmented SNN displays near-zero latency and theoretically infinite throughput, implying exceptional computational efficiency.
arXiv Detail & Related papers (2023-09-15T08:02:29Z)
- Evolving Connectivity for Recurrent Spiking Neural Networks [8.80300633999542]
Recurrent spiking neural networks (RSNNs) hold great potential for advancing artificial general intelligence.
We propose the evolving connectivity (EC) framework, an inference-only method for training RSNNs.
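One way to read the "inference-only" idea is as an evolution-strategy update on connection probabilities: sample binary connectivity masks, score each network purely by running it forward, and move the probabilities toward well-scoring masks. The sketch below uses a trivial stand-in fitness and a NES-style Bernoulli update; it is a schematic reading under those assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # neurons in a toy recurrent network
target = (rng.random((n, n)) < 0.3)     # hidden "good" connectivity (toy task)

def fitness(mask):
    """Stand-in evaluation: reward masks that resemble the hidden target.
    In an RSNN this would be task reward from a forward, inference-only run."""
    return float((mask == target).mean())

theta = np.full((n, n), 0.5)            # connection probabilities
lr, pop = 0.1, 64
for step in range(200):
    masks = rng.random((pop, n, n)) < theta          # sample binary connectivity
    scores = np.array([fitness(m) for m in masks])
    adv = (scores - scores.mean()) / (scores.std() + 1e-8)
    # NES-style (natural-gradient) update direction for Bernoulli probabilities.
    grad = np.mean(adv[:, None, None] * (masks - theta), axis=0)
    theta = np.clip(theta + lr * grad, 0.01, 0.99)

print("final fitness of most likely mask:", fitness(theta > 0.5))
```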
arXiv Detail & Related papers (2023-05-28T07:08:25Z)
- Dynamics with autoregressive neural quantum states: application to critical quench dynamics [41.94295877935867]
We present an alternative general scheme that enables one to capture long-time dynamics of quantum systems in a stable fashion.
We apply the scheme to time-dependent quench dynamics by investigating the Kibble-Zurek mechanism in the two-dimensional quantum Ising model.
arXiv Detail & Related papers (2022-09-07T15:50:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.