Parallel Delayed Memory Units for Enhanced Temporal Modeling in Biomedical and Bioacoustic Signal Analysis
- URL: http://arxiv.org/abs/2512.01626v1
- Date: Mon, 01 Dec 2025 12:46:56 GMT
- Title: Parallel Delayed Memory Units for Enhanced Temporal Modeling in Biomedical and Bioacoustic Signal Analysis
- Authors: Pengfei Sun, Wenyu Jiang, Paul Devos, Dick Botteldooren
- Abstract summary: Parallel Delayed Memory Unit (PDMU) is a delay-gated state-space module for short-term temporal credit assignment. PDMU enhances short-term temporal state interactions and memory efficiency via a gated delay-line mechanism. PDMU supports parallel training and sequential inference, and can be easily integrated into existing linear RNN frameworks.
- Score: 17.73662095849247
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Advanced deep learning architectures, particularly recurrent neural networks (RNNs), have been widely applied in audio, bioacoustic, and biomedical signal analysis, especially in data-scarce environments. While gated RNNs remain effective, they can be relatively over-parameterised and less training-efficient in some regimes, while linear RNNs tend to fall short in capturing the complexity inherent in bio-signals. To address these challenges, we propose the Parallel Delayed Memory Unit (PDMU), a delay-gated state-space module for short-term temporal credit assignment targeting audio and bioacoustic signals, which enhances short-term temporal state interactions and memory efficiency via a gated delay-line mechanism. Unlike previous Delayed Memory Units (DMU) that embed temporal dynamics into the delay-line architecture, the PDMU further compresses temporal information into vector representations using Legendre Memory Units (LMU). This design serves as a form of causal attention, allowing the model to dynamically adjust its reliance on past states and improve real-time learning performance. Notably, in low-information scenarios, the gating mechanism behaves similarly to skip connections by bypassing state decay and preserving early representations, thereby facilitating long-term memory retention. The PDMU is modular, supporting parallel training and sequential inference, and can be easily integrated into existing linear RNN frameworks. Furthermore, we introduce bidirectional, efficient, and spiking variants of the architecture, each offering additional gains in performance or energy efficiency. Experimental results on diverse audio and biomedical benchmarks demonstrate that the PDMU significantly enhances both memory capacity and overall model performance.
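To make the mechanism concrete, here is a minimal PyTorch sketch of a delay-gated linear recurrence in the spirit of the abstract. It is an illustrative reconstruction, not the authors' code: the names (PDMUSketch, num_taps, gate_proj, and so on) are assumptions, the Legendre (LMU) state compression is replaced by a plain diagonal linear recurrence, and the parallel-training path is omitted for clarity.

```python
import torch
import torch.nn as nn


class PDMUSketch(nn.Module):
    """Delay-gated linear recurrence over a short delay line of past states.

    State update:  h_t = a * h_{t-1} + W_in x_t     (diagonal linear RNN)
    Readout:       y_t = sum_d g_d(x_t) * h_{t-d}   (gated delay-line taps)

    The softmax gate g acts like causal attention over a window of past
    states; when the current input is uninformative, placing most gate mass
    on older taps behaves like a skip connection that preserves earlier
    representations instead of letting them decay.
    """

    def __init__(self, input_dim: int, state_dim: int, num_taps: int = 4):
        super().__init__()
        self.state_dim = state_dim
        self.num_taps = num_taps
        self.in_proj = nn.Linear(input_dim, state_dim)
        self.gate_proj = nn.Linear(input_dim, num_taps)
        # Per-channel decay constrained to (0, 1) for stability.
        self.decay_logit = nn.Parameter(torch.zeros(state_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_dim) -> (batch, time, state_dim)
        batch, time, _ = x.shape
        a = torch.sigmoid(self.decay_logit)
        h = x.new_zeros(batch, self.state_dim)
        # Delay line: taps[d] holds h_{t-d} after the shift below.
        taps = [x.new_zeros(batch, self.state_dim) for _ in range(self.num_taps)]
        outputs = []
        for t in range(time):
            h = a * h + self.in_proj(x[:, t])
            taps = [h] + taps[:-1]  # shift the delay line by one step
            gates = torch.softmax(self.gate_proj(x[:, t]), dim=-1)  # (batch, num_taps)
            delayed = torch.stack(taps, dim=1)  # (batch, num_taps, state_dim)
            outputs.append((gates.unsqueeze(-1) * delayed).sum(dim=1))
        return torch.stack(outputs, dim=1)


if __name__ == "__main__":
    layer = PDMUSketch(input_dim=16, state_dim=32, num_taps=4)
    y = layer(torch.randn(8, 100, 16))
    print(y.shape)  # torch.Size([8, 100, 32])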
Related papers
- MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models [51.506429027626005]
Memory for Time Series (MEMTS) is a lightweight and plug-and-play method for retrieval-free domain adaptation in time series forecasting. A key component of MEMTS is a Knowledge Persistence Module (KPM), which internalizes domain-specific temporal dynamics. This paradigm shift enables MEMTS to achieve accurate domain adaptation with constant-time inference and near-zero latency.
arXiv Detail & Related papers (2026-02-14T14:00:06Z) - Moving Beyond Functional Connectivity: Time-Series Modeling for fMRI-Based Brain Disorder Classification [8.837732238971187]
Functional magnetic resonance imaging (fMRI) enables non-invasive brain disorder classification by capturing blood-oxygen-level-dependent (BOLD) signals. Most existing methods rely on functional connectivity (FC) via Pearson correlation. We benchmark state-of-the-art temporal models on raw BOLD signals across five public datasets.
arXiv Detail & Related papers (2026-02-09T04:42:42Z) - NeuroSSM: Multiscale Differential State-Space Modeling for Context-Aware fMRI Analysis [4.753690672619091]
We propose NeuroSSM, a selective state-space architecture designed for end-to-end analysis of raw BOLD signals in fMRI time series. NeuroSSM addresses limitations of existing approaches through two complementary design components. Experiments on clinical and non-clinical datasets demonstrate that NeuroSSM achieves competitive performance and efficiency against state-of-the-art fMRI analysis methods.
arXiv Detail & Related papers (2026-01-03T16:35:45Z) - RMAAT: Astrocyte-Inspired Memory Compression and Replay for Efficient Long-Context Transformers [11.099872871193028]
This work explores computational principles derived from astrocytes, glial cells critical for biological memory and synaptic modulation. We introduce the Recurrent Memory Augmented Transformer (RMAAT), an architecture integrating astrocyte functionalities.
arXiv Detail & Related papers (2026-01-01T18:34:06Z) - DNN-Based Precoding in RIS-Aided mmWave MIMO Systems With Practical Phase Shift [43.56429251312585]
This paper investigates maximizing the throughput of millimeter wave (mmWave) multiple-input multiple-output (MIMO) systems with obstructed direct communication paths. A reconfigurable intelligent surface (RIS) is employed to enhance transmissions, considering mmWave characteristics related to line-of-sight (LoS) and multipath effects. A deep neural network (DNN) is developed to facilitate faster codeword selection.
arXiv Detail & Related papers (2025-07-03T17:35:06Z) - Neuromorphic Wireless Split Computing with Resonate-and-Fire Neurons [69.73249913506042]
This paper investigates a wireless split computing architecture that employs resonate-and-fire (RF) neurons to process time-domain signals directly. By resonating at tunable frequencies, RF neurons extract time-localized spectral features while maintaining low spiking activity (see the resonate-and-fire sketch after this list). Experimental results show that the proposed RF-SNN architecture achieves comparable accuracy to conventional LIF-SNNs and ANNs.
arXiv Detail & Related papers (2025-06-24T21:14:59Z) - Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [42.60778405812048]
We introduce FNF as the backbone and DBD as the architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling. We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z) - An Efficient Self-Supervised Framework for Long-Sequence EEG Modeling [2.1232375739287006]
We propose EEGM2, a self-supervised framework for EEG representation learning. EEGM2 achieves state-of-the-art performance in both short- and long-sequence modeling and classification.
arXiv Detail & Related papers (2025-02-25T05:57:56Z) - REST: Efficient and Accelerated EEG Seizure Analysis through Residual State Updates [54.96885726053036]
This paper introduces a novel graph-based residual state update mechanism (REST) for real-time EEG signal analysis.
By leveraging a combination of graph neural networks and recurrent structures, REST efficiently captures both non-Euclidean geometry and temporal dependencies within EEG data.
Our model demonstrates high accuracy in both seizure detection and classification tasks.
arXiv Detail & Related papers (2024-06-03T16:30:19Z) - PRES: Toward Scalable Memory-Based Dynamic Graph Neural Networks [22.47336262812308]
Memory-based Dynamic Graph Neural Networks (MDGNNs) are a family of dynamic graph neural networks that leverage a memory module to extract, distill, and memorize long-term temporal dependencies.
This paper studies the efficient training of MDGNNs at scale, focusing on the temporal discontinuity in training MDGNNs with large temporal batch sizes.
arXiv Detail & Related papers (2024-02-06T01:34:56Z) - Delayed Memory Unit: Modelling Temporal Dependency Through Delay Gate [16.4160685571157]
Recurrent Neural Networks (RNNs) are widely recognized for their proficiency in modeling temporal dependencies.
This paper proposes a novel Delayed Memory Unit (DMU) for gated RNNs.
The DMU incorporates a delay line structure along with delay gates into a vanilla RNN, thereby enhancing temporal interaction and facilitating temporal credit assignment (see the delay-gate sketch after this list).
arXiv Detail & Related papers (2023-10-23T14:29:48Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed biophysical cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Recurrence-in-Recurrence Networks for Video Deblurring [58.49075799159015]
State-of-the-art video deblurring methods often adopt recurrent neural networks to model the temporal dependency between the frames.
In this paper, we propose a recurrence-in-recurrence network architecture to cope with the limitations of short-ranged memory.
arXiv Detail & Related papers (2022-03-12T11:58:13Z) - Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
arXiv Detail & Related papers (2021-10-26T17:35:21Z)
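The Delayed Memory Unit entry above describes routing state updates through learned delay gates. Below is a minimal PyTorch sketch of that idea as read from the summary; it is an illustrative reconstruction under stated assumptions, not the paper's implementation, and DMUSketch, max_delay, and the pending-delivery buffer are all hypothetical names and design choices.

```python
import torch
import torch.nn as nn


class DMUSketch(nn.Module):
    """Vanilla RNN cell plus learned delay gates (illustrative sketch).

    At each step, gates decide how much of the current hidden state is
    delivered 1..max_delay steps into the future, giving later steps direct
    access to earlier states for temporal credit assignment.
    """

    def __init__(self, input_dim: int, hidden_dim: int, max_delay: int = 3):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.max_delay = max_delay
        self.cell = nn.RNNCell(input_dim, hidden_dim)
        self.delay_gate = nn.Linear(input_dim + hidden_dim, max_delay)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_dim) -> (batch, time, hidden_dim)
        batch, time, _ = x.shape
        h = x.new_zeros(batch, self.hidden_dim)
        # pending[d] holds contributions scheduled to arrive d+1 steps ahead.
        pending = [x.new_zeros(batch, self.hidden_dim) for _ in range(self.max_delay)]
        outputs = []
        for t in range(time):
            h = self.cell(x[:, t], h + pending.pop(0))  # deliver due updates
            pending.append(x.new_zeros(batch, self.hidden_dim))
            g = torch.sigmoid(self.delay_gate(torch.cat([x[:, t], h], dim=-1)))
            for d in range(self.max_delay):  # schedule delayed deliveries
                pending[d] = pending[d] + g[:, d : d + 1] * h
            outputs.append(h)
        return torch.stack(outputs, dim=1)
```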
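For the resonate-and-fire entry, here is a small sketch of textbook resonate-and-fire dynamics: a damped complex oscillator that spikes when its imaginary part crosses a threshold, in the spirit of Izhikevich's classic formulation. The discretization, parameter values, and names below are illustrative assumptions and are not taken from the paper.

```python
import torch


def resonate_and_fire(inputs, omega=6.28, damping=0.5, threshold=0.5, dt=0.01):
    """Simulate a layer of resonate-and-fire neurons (illustrative sketch).

    The state z evolves as dz/dt = (-damping + i*omega) * z + input, so each
    neuron is a damped oscillator tuned to angular frequency omega; it emits
    a spike (and resets) when Im(z) crosses the threshold. This is what lets
    RF neurons pick out time-localized spectral content while spiking sparsely.

    inputs: (time, num_neurons) real drive; returns (time, num_neurons) spikes.
    """
    decay = torch.exp(torch.tensor(complex(-damping * dt, omega * dt)))
    z = torch.zeros(inputs.shape[1], dtype=torch.cfloat)
    spikes = []
    for drive in inputs:
        z = decay * z + drive * dt  # exact decay/rotation step plus input
        spike = (z.imag >= threshold).float()
        z = z * (1.0 - spike)  # reset neurons that fired
        spikes.append(spike)
    return torch.stack(spikes)


# A neuron responds most strongly when its drive oscillates near omega.
t = torch.arange(0, 2.0, 0.01)
drive = torch.sin(6.28 * t).unsqueeze(-1)  # 1 Hz drive matches omega = 2*pi
print(resonate_and_fire(drive).sum())  # total spike count
```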
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.