Diverse Neural Sequences in QIF Networks: An Analytically Tractable Framework for Synfire Chains and Hippocampal Replay
- URL: http://arxiv.org/abs/2508.06085v1
- Date: Fri, 08 Aug 2025 07:27:47 GMT
- Title: Diverse Neural Sequences in QIF Networks: An Analytically Tractable Framework for Synfire Chains and Hippocampal Replay
- Authors: Genki Shimizu, Taro Toyoizumi
- Abstract summary: We propose a parsimonious network of Quadratic Integrate-and-Fire neurons with sequences embedded via a temporally asymmetric Hebbian rule. Our findings demonstrate that this single framework robustly reproduces a spectrum of sequential activities, including persistent synfire-like chains and transient, hippocampal replay-like bursts exhibiting intra-ripple frequency accommodation (IFA). These results establish QIF networks with TAH connectivity as an analytically tractable and biologically plausible platform for investigating the emergence, stability, and diversity of sequential neural activity in the brain.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequential neural activity is fundamental to cognition, yet how diverse sequences are recalled under biological constraints remains a key question. Existing models often struggle to balance biophysical realism and analytical tractability. We address this problem by proposing a parsimonious network of Quadratic Integrate-and-Fire (QIF) neurons with sequences embedded via a temporally asymmetric Hebbian (TAH) rule. Our findings demonstrate that this single framework robustly reproduces a spectrum of sequential activities, including persistent synfire-like chains and transient, hippocampal replay-like bursts exhibiting intra-ripple frequency accommodation (IFA), all achieved without requiring specialized delay or adaptation mechanisms. Crucially, we derive exact low-dimensional firing-rate equations (FREs) that provide mechanistic insight, elucidating the bifurcation structure governing these distinct dynamical regimes and explaining their stability. The model also exhibits strong robustness to synaptic heterogeneity and memory pattern overlap. These results establish QIF networks with TAH connectivity as an analytically tractable and biologically plausible platform for investigating the emergence, stability, and diversity of sequential neural activity in the brain.
Related papers
- Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks [14.487258585834374]
Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs). We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation.
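The offline weight-decay step can be sketched as follows; the function name `sleep_phase`, the baseline `w_base`, and the decay rate are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

# Sketch of the sleep-phase regularization idea (hypothetical parameters):
# during an offline phase, each weight relaxes exponentially toward a
# homeostatic baseline w_base while external input is suppressed.
def sleep_phase(W, w_base=0.0, decay=0.1, steps=10):
    for _ in range(steps):
        W = W + decay * (w_base - W)   # one relaxation step toward baseline
    return W

W = np.array([[0.0, 2.0], [-1.5, 0.5]])
W_rested = sleep_phase(W)
# with w_base = 0, every entry shrinks by the factor (1 - decay) ** steps
```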
arXiv Detail & Related papers (2026-01-13T11:17:30Z) - A Brain-Inspired Gating Mechanism Unlocks Robust Computation in Spiking Neural Networks [5.647576619206974]
We introduce the Dynamic Gated Neuron (DGN), a novel spiking unit in which membrane conductance evolves in response to neuronal activity. Our results highlight, for the first time, biologically plausible dynamic gating as a key mechanism for robust spike-based computation.
arXiv Detail & Related papers (2025-09-03T13:00:49Z) - HetSyn: Versatile Timescale Integration in Spiking Neural Networks via Heterogeneous Synapses [3.744763853474646]
Spiking Neural Networks (SNNs) offer a biologically plausible and energy-efficient framework for temporal information processing. We introduce HetSyn, a framework that models synaptic heterogeneity with synapse-specific time constants. We demonstrate that HetSynLIF improves the performance of SNNs across a variety of tasks.
arXiv Detail & Related papers (2025-08-01T10:19:56Z) - Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
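The underdamped Langevin dynamics driving the latents can be sketched in a few lines; the quadratic potential, coefficients, and Euler-Maruyama discretization below are illustrative assumptions (the paper learns the potential and forcing terms).

```python
import numpy as np

# Sketch of underdamped Langevin dynamics for a scalar latent (illustrative):
#   dx = v dt
#   dv = (-gamma * v - U'(x)) dt + sigma * dW
rng = np.random.default_rng(2)
gamma, sigma, dt = 1.0, 0.5, 1e-2
grad_U = lambda x: x               # quadratic potential U(x) = x^2 / 2

x, v = 2.0, 0.0                    # start away from the potential minimum
xs = []
for _ in range(5000):
    x += v * dt
    v += (-gamma * v - grad_U(x)) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    xs.append(x)
# inertia plus damping yields a noise-driven relaxation toward x = 0
```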
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Generative System Dynamics in Recurrent Neural Networks [56.958984970518564]
We investigate the continuous-time dynamics of Recurrent Neural Networks (RNNs). We show that skew-symmetric weight matrices are fundamental to enabling stable limit cycles in both linear and nonlinear configurations. Numerical simulations showcase how nonlinear activation functions not only maintain limit cycles but also enhance the numerical stability of the system integration process.
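The role of skew-symmetry can be seen in a minimal linear example (illustrative, not the paper's setup): for dx/dt = Wx with W = -Wᵀ, the eigenvalues are purely imaginary, so trajectories rotate and the state norm is conserved rather than decaying or exploding.

```python
import numpy as np

# Minimal sketch: a linear system dx/dt = W x with skew-symmetric W
# sustains oscillation with conserved norm (assumed dimensions and step).
rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
W = A - A.T                      # skew-symmetric: W.T == -W

x = rng.standard_normal(n)
norm0 = np.linalg.norm(x)

dt, steps = 1e-3, 5000
for _ in range(steps):
    # midpoint (RK2) step; plain forward Euler would inflate the norm
    x_mid = x + 0.5 * dt * (W @ x)
    x = x + dt * (W @ x_mid)

print(norm0, np.linalg.norm(x))  # the two norms agree closely
```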
arXiv Detail & Related papers (2025-04-16T10:39:43Z) - Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations. In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z) - Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
It has long been known in both neuroscience and AI that "binding" between neurons leads to a form of competitive learning where representations are compressed in order to represent more abstract concepts in deeper layers of the network. We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN), which can be combined with arbitrary connectivity designs such as fully connected, convolutional, or attentive mechanisms. We show that this idea provides performance improvements across a wide spectrum of tasks such as unsupervised object discovery, adversarial robustness, uncertainty quantification, and reasoning.
arXiv Detail & Related papers (2024-10-17T17:47:54Z) - Unconditional stability of a recurrent neural circuit implementing divisive normalization [0.0]
We prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit. We show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling.
arXiv Detail & Related papers (2024-09-27T17:46:05Z) - Unbalanced Diffusion Schrödinger Bridge [71.31485908125435]
We introduce unbalanced DSBs which model the temporal evolution of marginals with arbitrary finite mass.
This is achieved by deriving the time reversal of differential equations with killing and birth terms.
We present two novel algorithmic schemes that comprise a scalable objective function for training unbalanced DSBs.
arXiv Detail & Related papers (2023-06-15T12:51:56Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Theory of coupled neuronal-synaptic dynamics [3.626013617212667]
In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics.
We study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables.
We show that adding Hebbian plasticity slows activity in chaotic networks and can induce chaos.
arXiv Detail & Related papers (2023-02-17T16:42:59Z) - Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons [0.7340017786387767]
We introduce Latent Equilibrium, a new framework for inference and learning in networks of slow components.
We derive disentangled neuron and synapse dynamics from a prospective energy function.
We show how our principle can be applied to detailed models of cortical microcircuitry.
arXiv Detail & Related papers (2021-10-27T16:15:55Z)
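The "arbitrarily fast computation with arbitrarily slow neurons" claim above rests on prospective coding, which can be sketched as follows (illustrative signal and time constant, not the paper's full derivation): a slow leaky unit lags its input, but the prospective readout u + τ du/dt reproduces the input instantly.

```python
import numpy as np

# Sketch of prospective coding (assumed signal and tau): a leaky unit
# tau * du/dt = -u + r(t) lags a fast input r(t), but reading out
# u + tau * du/dt recovers r(t) exactly, hiding the neuron's slowness.
tau, dt = 0.1, 1e-3
t = np.arange(0.0, 1.0, dt)
r = np.sin(2 * np.pi * 5 * t)        # fast-varying input signal

u, us, prospective = 0.0, [], []
for rt in r:
    du = (-u + rt) / tau
    prospective.append(u + tau * du)  # equals rt by construction
    u += dt * du                      # slow membrane integration
    us.append(u)

lag_err = np.max(np.abs(np.array(us) - r))             # large: u lags r
prosp_err = np.max(np.abs(np.array(prospective) - r))  # essentially zero
```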
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.