Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training
- URL: http://arxiv.org/abs/2507.16937v1
- Date: Tue, 22 Jul 2025 18:20:56 GMT
- Title: Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training
- Authors: Chengjie Ge, Yufeng Peng, Xueyang Fu, Qiyu Kang, Xuhao Li, Qixin Zhang, Junhao Ren, Zheng-Jun Zha
- Abstract summary: Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
- Score: 63.3991315762955
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation, demonstrating effectiveness in processing temporal information with energy efficiency and biological realism. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. Consequently, the voltage state at any time depends solely on its immediate past value, potentially limiting network expressiveness. Real neurons, however, exhibit complex dynamics influenced by long-term correlations and fractal dendritic structures, suggesting non-Markovian behavior. Motivated by this, we propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics. These fractional dynamics enable more expressive temporal patterns beyond the capability of integer-order models. For efficient training of fspikeDE, we introduce a gradient descent algorithm that optimizes parameters by solving an augmented fractional-order ODE (FDE) backward in time using adjoint sensitivity methods. Extensive experiments on diverse image and graph datasets demonstrate that fspikeDE consistently outperforms traditional SNNs, achieving superior accuracy, comparable energy efficiency, reduced training memory usage, and enhanced robustness against noise. Our approach provides a novel open-sourced computational toolbox for fractional-order SNNs, widely applicable to various real-world tasks.
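The core modeling change described above is replacing the first-order membrane ODE with a fractional-order one, D^alpha v(t) = f(v, I) with 0 < alpha <= 1, so that the voltage at each step depends on the whole history of past voltages rather than only the previous value. The sketch below illustrates this with a single leaky integrate-and-fire neuron discretized by the explicit Grünwald-Letnikov scheme; it is a minimal illustration, not the paper's released toolbox, and the function names, constants, and the specific update rule are assumptions chosen for clarity.

```python
import numpy as np

def gl_weights(alpha: float, n: int) -> np.ndarray:
    """Grunwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j),
    via the recurrence c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def fractional_lif(I, alpha=0.8, h=1e-3, tau=2e-2,
                   v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Single fractional-order LIF neuron driven by input current I.

    Explicit Grunwald-Letnikov step for D^alpha v = -(v - v_rest)/tau + I(t):
        v_n = h^alpha * f(v_{n-1}, I_{n-1}) - sum_{j=1}^{n} c_j * v_{n-j}.
    The history sum is the non-Markovian memory term.
    """
    n_steps = len(I)
    c = gl_weights(alpha, n_steps)
    v = np.full(n_steps, v_rest, dtype=float)
    spikes = np.zeros(n_steps, dtype=bool)
    for n in range(1, n_steps):
        drift = -(v[n - 1] - v_rest) / tau + I[n - 1]
        # Weighted sum over all past voltages (the fractional memory term).
        memory = np.dot(c[1:n + 1], v[n - 1::-1])
        v[n] = h**alpha * drift - memory
        if v[n] >= v_th:            # threshold crossing: emit spike, reset
            spikes[n] = True
            v[n] = v_reset
    return v, spikes

# Constant drive; the fractional memory shapes the inter-spike intervals.
v, s = fractional_lif(np.full(2000, 80.0), alpha=0.8)
print(f"{int(s.sum())} spikes in 2000 steps")
```

For alpha = 1 the Grünwald-Letnikov weights collapse to c_0 = 1, c_1 = -1 (all later weights vanish) and the update reduces to the standard forward-Euler LIF step, i.e. the Markovian first-order model the abstract contrasts against. The full memory sum costs O(n) per step; practical fractional solvers typically truncate the history (the short-memory principle).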
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Neural Variable-Order Fractional Differential Equation Networks [26.06048802504022]
We introduce the Neural Variable-Order Fractional Differential Equation network (NvoFDE). NvoFDE is a novel neural network framework that integrates variable-order fractional derivatives with learnable neural networks. Our results demonstrate that NvoFDE outperforms traditional constant-order fractional and integer models across a range of tasks.
arXiv Detail & Related papers (2025-03-20T14:54:19Z) - Channel-wise Parallelizable Spiking Neuron with Multiplication-free Dynamics and Large Temporal Receptive Fields [32.349167886062105]
Spiking Neural Networks (SNNs) are distinguished from Artificial Neural Networks (ANNs) by their sophisticated neuronal dynamics and sparse binary activations (spikes) inspired by the biological neural system. Traditional neuron models use iterative step-by-step dynamics, resulting in serial computation and slow training of SNNs. Recent parallelizable spiking neuron models have been proposed to fully exploit the massive parallel computing ability of graphics processing units to accelerate the training of SNNs.
arXiv Detail & Related papers (2025-01-24T13:44:08Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Advancing Spatio-Temporal Processing in Spiking Neural Networks through Adaptation [6.233189707488025]
Spiking neural networks on neuromorphic hardware promise orders of magnitude less power consumption than their non-spiking counterparts. The standard neuron model for spike-based computation on such systems has long been the leaky integrate-and-fire (LIF) neuron, often extended with an adaptation mechanism. The role of these so-called adaptive LIF neurons is not well understood.
arXiv Detail & Related papers (2024-08-14T12:49:58Z) - Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z) - Neural Delay Differential Equations: System Reconstruction and Image Classification [14.59919398960571]
We propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs).
Compared to NODEs, NDDEs have a stronger capacity of nonlinear representations.
We achieve lower loss and higher accuracy not only on synthetically produced data but also on CIFAR10, a well-known image dataset.
arXiv Detail & Related papers (2023-04-11T16:09:28Z) - Neural Piecewise-Constant Delay Differential Equations [17.55759866368141]
In this article, we introduce a new class of continuous-depth neural networks, called Neural Piecewise-Constant Delay Differential Equations (PCDDEs).
We show that Neural PCDDEs outperform several existing continuous-depth neural frameworks on one-dimensional piecewise-constant delay population dynamics and on real-world datasets.
arXiv Detail & Related papers (2022-01-04T03:44:15Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.