Memory-DD: A Low-Complexity Dendrite-Inspired Neuron for Temporal Prediction Tasks
- URL: http://arxiv.org/abs/2512.04094v1
- Date: Thu, 20 Nov 2025 12:03:40 GMT
- Title: Memory-DD: A Low-Complexity Dendrite-Inspired Neuron for Temporal Prediction Tasks
- Authors: Dongjian Yang, Xiaoyuan Li, Chuanmei Xi, Ye Sun, Gang Liu
- Abstract summary: Existing dendrite-inspired neurons are mainly designed for static data. Memory-DD consists of two dendrite-inspired neuron groups that contain no nonlinear activation functions. Results show that Memory-DD achieves an average accuracy of 89.41% on 18 temporal classification benchmark datasets.
- Score: 3.9564350814920286
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Dendrite-inspired neurons have been widely used in tasks such as image classification due to their low computational complexity and fast inference speed. Temporal data prediction, a core machine learning task, plays a key role in real-time scenarios such as sensor data analysis, financial forecasting, and urban traffic management. However, existing dendrite-inspired neurons are mainly designed for static data: studies on capturing dynamic features and modeling long-term dependencies in temporal sequences remain limited, and efficient architectures specifically designed for temporal sequence prediction are still lacking. In this paper, we propose Memory-DD, a low-complexity dendrite-inspired neuron model. Memory-DD consists of two dendrite-inspired neuron groups that contain no nonlinear activation functions but can still realize nonlinear mappings. Compared with traditional neurons without dendritic functions, Memory-DD requires only two neuron groups to extract logical relationships between features in input sequences. This design effectively captures temporal dependencies and suits both classification and regression tasks on sequence data. Experimental results show that Memory-DD achieves an average accuracy of 89.41% on 18 temporal classification benchmark datasets, outperforming LSTM by 4.25%. On 9 temporal regression datasets, it reaches performance comparable to LSTM while using only 50% of the parameters and reducing computational cost (FLOPs) by 27.7%. These results demonstrate that Memory-DD successfully extends the low-complexity advantages of dendrite-inspired neurons to temporal prediction, providing an efficient solution for time-series processing.
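The abstract does not give Memory-DD's equations, but its core claim, nonlinear mappings from activation-free neuron groups, is commonly realized in dendrite-inspired models through multiplicative synaptic interactions. The sketch below is a hypothetical PyTorch illustration of that idea; the names `DendriticGroup` and `MemoryDDSketch`, and the recurrent carry between time steps, are assumptions made for this sketch, not the paper's stated design.

```python
import torch
import torch.nn as nn

class DendriticGroup(nn.Module):
    """One activation-free neuron group: the elementwise product of two
    learned linear branches acts like a soft logical AND between features,
    giving a nonlinear mapping with no ReLU/tanh/sigmoid anywhere."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.branch_a = nn.Linear(in_features, out_features)
        self.branch_b = nn.Linear(in_features, out_features)

    def forward(self, x):
        return self.branch_a(x) * self.branch_b(x)

class MemoryDDSketch(nn.Module):
    """Two dendritic groups over a sequence; the recurrent carry `h` is an
    assumption made for this sketch, not the paper's stated mechanism."""
    def __init__(self, in_features, hidden, out_features):
        super().__init__()
        self.hidden = hidden
        self.group1 = DendriticGroup(in_features + hidden, hidden)
        self.group2 = DendriticGroup(hidden, out_features)

    def forward(self, seq):                 # seq: (T, batch, in_features)
        h = seq.new_zeros(seq.size(1), self.hidden)
        for x_t in seq:
            h = self.group1(torch.cat([x_t, h], dim=-1))
        return self.group2(h)

model = MemoryDDSketch(in_features=4, hidden=16, out_features=1)
print(model(torch.randn(50, 8, 4)).shape)   # torch.Size([8, 1])
```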
Related papers
- Spatial Spiking Neural Networks Enable Efficient and Robust Temporal Computation [0.4104921880358479]
We introduce Spatial Spiking Neural Networks (SpSNNs), a framework in which neurons learn coordinates in a finite-dimensional Euclidean space. We show that SpSNNs outperform SNNs with unconstrained delays despite using far fewer parameters. Because learned spatial layouts map naturally onto hardware, SpSNNs lend themselves to efficient neuromorphic implementation.
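As a rough illustration of the stated idea, delays derived from learned neuron coordinates, here is a hedged PyTorch sketch; the `SpatialDelays` module and the `velocity` constant are assumptions, not the paper's parameterization.

```python
import torch
import torch.nn as nn

class SpatialDelays(nn.Module):
    """Each neuron owns a learnable coordinate; the conduction delay
    between two neurons is their Euclidean distance over a speed, so
    n * dim parameters encode n^2 delays."""
    def __init__(self, n_neurons, dim=2, velocity=1.0):
        super().__init__()
        self.coords = nn.Parameter(torch.randn(n_neurons, dim))
        self.velocity = velocity            # assumed conduction speed

    def forward(self):
        dist = torch.cdist(self.coords, self.coords)  # pairwise distances
        return dist / self.velocity                   # delay matrix d_ij

delays = SpatialDelays(n_neurons=8)()
print(delays.shape)  # torch.Size([8, 8])
```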
arXiv Detail & Related papers (2025-12-10T19:01:39Z)
- Delays in Spiking Neural Networks: A State Space Model Approach [2.309307613420651]
Spiking neural networks (SNNs) are biologically inspired, event-driven models suitable for processing temporal data. We propose a general framework for incorporating delays into SNNs through additional state variables. We show that the proposed mechanism matches the performance of existing delay-based SNNs while remaining computationally efficient.
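A minimal sketch of how a delay can be realized purely through additional state variables, here a plain shift register; the paper's state-space formulation is presumably richer and learnable.

```python
import torch

def delayed(spikes, delay=5):
    """Realize s[t - delay] using `delay` extra state variables (a shift
    register): the simplest state-space realization of a pure delay."""
    buf = torch.zeros(delay)            # the additional state variables
    out = []
    for s in spikes:
        out.append(buf[-1].clone())     # oldest entry leaves the chain
        buf = torch.roll(buf, 1)        # shift every state one step along
        buf[0] = s                      # newest input enters the chain
    return torch.stack(out)

spk = torch.zeros(12); spk[0] = 1.0     # single spike at t = 0
print(delayed(spk))                     # the spike re-emerges at t = 5
```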
arXiv Detail & Related papers (2025-12-01T17:26:21Z)
- Dendritic Resonate-and-Fire Neuron for Effective and Efficient Long Sequence Modeling [66.0841376808143]
Resonate-and-Fire (RF) neurons can efficiently extract frequency information from input signals and encode it into spike trains. However, RF neurons exhibit limited effective memory capacity and a trade-off between energy efficiency and training speed on complex tasks. We propose a Dendritic Resonate-and-Fire (D-RF) model, which explicitly incorporates a multi-dendritic and soma architecture.
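For context, a minimal resonate-and-fire neuron (following Izhikevich's classic formulation, not the paper's D-RF equations) can be sketched as follows; `omega`, `damping`, and the hard reset are illustrative choices.

```python
import math

def rf_neuron(inputs, omega=0.5, damping=0.05, threshold=1.0):
    """Complex membrane state z rotates at `omega` rad/step and decays with
    `damping`; a spike fires when Im(z) crosses `threshold`, so the neuron
    resonates with inputs arriving near its preferred frequency."""
    a = complex(math.cos(omega), math.sin(omega)) * math.exp(-damping)
    z, spikes = 0j, []
    for i in inputs:
        z = a * z + i                    # rotate-and-decay, then integrate
        fired = z.imag > threshold
        spikes.append(int(fired))
        if fired:
            z = 0j                       # hard reset (illustrative choice)
    return spikes

# A sinusoidal drive near `omega` resonates and produces spikes; an
# off-frequency drive of the same amplitude mostly does not.
drive = [0.12 * math.sin(0.5 * t) for t in range(300)]
print(sum(rf_neuron(drive)))
```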
arXiv Detail & Related papers (2025-09-21T18:15:45Z)
- Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
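A hedged sketch of sub-threshold fractional-order membrane dynamics using the standard Gruenwald-Letnikov discretization; it illustrates the non-Markovian memory the summary describes (every past state influences the present with slowly decaying weight), not fspikeDE's actual model or its adjoint training. Spiking is omitted.

```python
import torch

def fractional_lif(current, alpha=0.8, tau=5.0, h=1.0):
    """Simulate D^alpha u = -u/tau + I with the GL scheme:
    u[t] = h^alpha * (-u[t-1]/tau + I[t]) - sum_{k>=1} w_k * u[t-k]."""
    T = len(current)
    w = torch.ones(T)                   # w_k = (-1)^k * binom(alpha, k)
    for k in range(1, T):
        w[k] = w[k - 1] * (1 - (alpha + 1) / k)
    u = torch.zeros(T)
    for t in range(1, T):
        memory = torch.dot(w[1:t + 1], u[:t].flip(0))   # full history term
        u[t] = h**alpha * (-u[t - 1] / tau + current[t]) - memory
    return u

u = fractional_lif(torch.ones(50))
print(u[-1])  # approaches steady state more slowly than alpha = 1 would
```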
arXiv Detail & Related papers (2025-07-22T18:20:56Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- PRES: Toward Scalable Memory-Based Dynamic Graph Neural Networks [22.47336262812308]
Memory-based Dynamic Graph Neural Networks (MDGNNs) are a family of dynamic graph neural networks that leverage a memory module to extract, distill, and memorize long-term temporal dependencies.
This paper studies the efficient training of MDGNNs at scale, focusing on the temporal discontinuity in training MDGNNs with large temporal batch sizes.
arXiv Detail & Related papers (2024-02-06T01:34:56Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of such a cortical neuron with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
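A hedged sketch in the spirit of that description: a bank of leaky memory traces with diverse timescales feeding a small MLP. All names and the log-spaced timescale initialization are assumptions, not the published ELM equations.

```python
import torch
import torch.nn as nn

class LeakyMemoryNeuron(nn.Module):
    """Several leaky traces with distinct learnable timescales retain
    input history over different horizons; a tiny MLP combines them."""
    def __init__(self, in_features, n_memories=16, out_features=1):
        super().__init__()
        # log-spaced initial timescales, roughly 1 to 1000 steps (assumed)
        self.log_tau = nn.Parameter(torch.linspace(0.0, 7.0, n_memories))
        self.inp = nn.Linear(in_features, n_memories)
        self.mlp = nn.Sequential(
            nn.Linear(n_memories, 32), nn.Tanh(), nn.Linear(32, out_features)
        )

    def forward(self, seq):                 # seq: (T, batch, in_features)
        decay = torch.exp(-1.0 / self.log_tau.exp())   # per-trace leak
        m = seq.new_zeros(seq.size(1), decay.numel())
        for x_t in seq:
            m = decay * m + (1 - decay) * self.inp(x_t)
        return self.mlp(m)

model = LeakyMemoryNeuron(in_features=4)
print(model(torch.randn(100, 2, 4)).shape)  # torch.Size([2, 1])
```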
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
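A minimal sketch of the continuous-time idea: the hidden state decays over the irregular gap between observations before each update. The `DecayRNNCell` name, the exponential-decay dynamics, and the GRU update are simplifying assumptions; the paper's CTRNN variants use richer ODE dynamics.

```python
import torch
import torch.nn as nn

class DecayRNNCell(nn.Module):
    def __init__(self, in_features, hidden):
        super().__init__()
        self.tau = nn.Parameter(torch.ones(hidden))  # learnable timescales
        self.cell = nn.GRUCell(in_features, hidden)

    def forward(self, x, h, dt):
        # Decay h toward zero for `dt` time units, then apply the update.
        h = h * torch.exp(-dt / self.tau.exp().clamp(min=1e-3))
        return self.cell(x, h)

cell = DecayRNNCell(in_features=1, hidden=8)
h = torch.zeros(1, 8)
times = [0.0, 0.5, 2.3, 2.4, 7.0]           # irregular measurement times
glucose = torch.randn(len(times), 1, 1)     # toy measurements
for i in range(1, len(times)):
    h = cell(glucose[i], h, torch.tensor(times[i] - times[i - 1]))
print(h.shape)  # torch.Size([1, 8])
```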
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Axonal Delay As a Short-Term Memory for Feed Forward Deep Spiking Neural Networks [3.985532502580783]
Recent studies have found that the time delay of neurons plays an important role in the learning process.
Configuring the precise timing of spikes is a promising direction for understanding and improving the transmission of temporal information in SNNs.
In this paper, we verify the effectiveness of integrating time delay into supervised learning and propose a module that modulates the axonal delay through short-term memory.
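A minimal sketch of an axonal delay acting as short-term memory, with fixed integer per-neuron delays; the paper's module learns and modulates these delays rather than fixing them.

```python
import torch

def apply_axonal_delay(spikes, delays):
    """spikes: (T, n_neurons); delays: (n_neurons,) integer steps.
    Output at time t for neuron j is spikes[t - delays[j], j], so each
    neuron's recent activity is held 'in flight' along its axon."""
    T, n = spikes.shape
    out = torch.zeros_like(spikes)
    for j in range(n):
        d = int(delays[j])
        if d < T:
            out[d:, j] = spikes[: T - d, j]
    return out

spk = (torch.rand(10, 3) < 0.3).float()
print(apply_axonal_delay(spk, torch.tensor([0, 2, 4])))
```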
arXiv Detail & Related papers (2022-04-20T16:56:42Z)
- Convolutional generative adversarial imputation networks for spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.