Egocentric Visual Navigation through Hippocampal Sequences
- URL: http://arxiv.org/abs/2510.09951v2
- Date: Wed, 15 Oct 2025 17:40:21 GMT
- Title: Egocentric Visual Navigation through Hippocampal Sequences
- Authors: Xiao-Xiong Lin, Yuk Hoi Yiu, Christian Leibold
- Abstract summary: We show that hippocampal sequences arise from intrinsic recurrent circuitry that propagates activity without readily available input. We implement a minimal sequence generator inspired by neurobiology and pair it with an actor-critic learner for egocentric visual navigation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Sequential activation of place-tuned neurons in an animal during navigation is typically interpreted as reflecting the sequence of input from adjacent positions along the trajectory. More recent theories about such place cells suggest sequences arise from abstract cognitive objectives like planning. Here, we propose a mechanistic and parsimonious interpretation to complement these ideas: hippocampal sequences arise from intrinsic recurrent circuitry that propagates activity without readily available input, acting as a temporal memory buffer for extremely sparse inputs. We implement a minimal sequence generator inspired by neurobiology and pair it with an actor-critic learner for egocentric visual navigation. Our agent reliably solves a continuous maze without explicit geometric cues, with performance depending on the length of the recurrent sequence. Crucially, the model outperforms LSTM cores under sparse input conditions (16 channels, ~2.5% activity), but not under dense input, revealing a strong interaction between representational sparsity and memory architecture. In contrast to LSTM agents, hidden sequence units develop localized place fields, distance-dependent spatial kernels, and task-dependent remapping, while inputs orthogonalize and spatial information increases across layers. These phenomena align with neurobiological data and are causal to performance. Together, our results show that sparse input synergizes with sequence-generating dynamics, providing both a mechanistic account of place cell sequences in the mammalian hippocampus and a simple inductive bias for reinforcement learning based on sparse egocentric inputs in navigation tasks.
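As a rough illustration of the mechanism the abstract describes, a chain-like recurrent core can keep a sparse input alive over time, acting as a temporal memory buffer. The numpy sketch below assumes a shift-matrix recurrence and a tanh nonlinearity; both are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

# Hedged sketch of a "sequence generator": recurrent weights form a
# cyclic shift, so unit i excites unit i + 1 and a pulse travels along
# the chain even when no further input arrives. Sizes and the tanh
# nonlinearity are assumptions for illustration only.
class SequenceGenerator:
    def __init__(self, n_units=64):
        self.W = np.roll(np.eye(n_units), 1, axis=0)  # shift matrix
        self.h = np.zeros(n_units)

    def step(self, sparse_input):
        # Intrinsic propagation plus (often all-zero) external input.
        self.h = np.tanh(self.W @ self.h + sparse_input)
        return self.h


# A one-hot pulse at t = 0 keeps traveling with no further input, so
# the hidden state encodes elapsed time since the sparse input arrived.
sg = SequenceGenerator(n_units=16)
pulse = np.zeros(16)
pulse[0] = 1.0
sg.step(pulse)
for _ in range(5):
    sg.step(np.zeros(16))
```

In the paper this kind of sequence core is paired with an actor-critic learner; here, a linear policy/value readout on `sg.h` would play that role.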
Related papers
- SYNAPSE: Empowering LLM Agents with Episodic-Semantic Memory via Spreading Activation [29.545442480332515]
We introduce Synapse, a unified memory architecture that retrieves memories via spreading activation rather than static, pre-computed links. We show that Synapse significantly outperforms state-of-the-art methods on complex temporal and multi-hop reasoning tasks. Our code and data will be made publicly available upon acceptance.
arXiv Detail & Related papers (2026-01-06T06:19:58Z) - Neuro-Vesicles: Neuromodulation Should Be a Dynamical System, Not a Tensor Decoration [16.06187991858285]
We introduce Neuro-Vesicles, a framework that augments conventional neural networks with a missing computational layer. Vesicles are mobile, discrete units that live alongside the network rather than inside its tensors. We give a complete mathematical specification of the framework, including emission, migration, docking, release, decay, and their coupling to learning.
arXiv Detail & Related papers (2025-12-07T19:19:12Z) - Unleashing Temporal Capacity of Spiking Neural Networks through Spatiotemporal Separation [67.69345363409835]
Spiking Neural Networks (SNNs) are considered naturally suited for temporal processing, with membrane potential propagation widely regarded as the core temporal modeling mechanism. We design Non-Stateful (NS) models that progressively remove membrane propagation to isolate its stage-wise role. Experiments reveal a counterintuitive phenomenon: moderate removal in shallow layers improves performance, while excessive removal causes collapse.
arXiv Detail & Related papers (2025-12-05T07:05:53Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
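For intuition, the underdamped Langevin equation this summary refers to can be discretized with a simple Euler-Maruyama step. The function below is an illustrative sketch under assumed parameter names and step sizes, not LangevinFlow's implementation.

```python
import numpy as np

# Hedged sketch of underdamped Langevin dynamics: position z driven by
# velocity v, with damping gamma, a potential gradient, and optional
# noise/forcing. The discretization and all names are assumptions.
def langevin_step(z, v, grad_U, gamma=0.5, sigma=0.0, dt=0.01,
                  force=0.0, rng=None):
    noise = 0.0
    if sigma > 0.0:
        rng = rng or np.random.default_rng()
        noise = sigma * np.sqrt(dt) * rng.standard_normal(np.shape(v))
    v = v + dt * (-gamma * v - grad_U(z) + force) + noise
    z = z + dt * v  # semi-implicit: position uses the updated velocity
    return z, v


# Deterministic demo: a quadratic potential U(z) = z**2 / 2 gives a
# damped oscillator, so the latent relaxes toward the origin.
z, v = 1.0, 0.0
for _ in range(4000):
    z, v = langevin_step(z, v, grad_U=lambda z: z)
```

The semi-implicit (symplectic) update keeps the oscillator stable at this step size, which is why it is preferred over plain forward Euler for inertial dynamics.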
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Revisiting Bi-Linear State Transitions in Recurrent Neural Networks [0.3218642352128729]
We show that bi-linear state updates constitute a natural inductive bias for representing the evolution of hidden states in state tracking tasks. We also show that bi-linear state updates form a natural hierarchy corresponding to state tracking tasks of increasing complexity, with popular linear recurrent networks such as Mamba residing at the lowest-complexity end of that hierarchy.
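A minimal sketch of a bi-linear state update, with a toy permutation-group example of the state tracking this summary mentions; the shapes and the example task are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Hedged sketch of a bi-linear recurrence: the input selects (mixes)
# among learned transition matrices A_k, so the transition applied to
# the hidden state depends multiplicatively on the input.
def bilinear_update(h, u, A):
    W = np.tensordot(u, A, axes=1)  # input-weighted sum of A_k -> (d, d)
    return W @ h


# Toy state tracking over a two-element group: token 0 applies the
# identity, token 1 swaps the two state dimensions, so the hidden
# state exactly tracks the composition of the input operations.
identity = np.eye(2)
swap = np.array([[0.0, 1.0], [1.0, 0.0]])
A = np.stack([identity, swap])  # (k=2, d=2, d=2)

h = np.array([1.0, 0.0])
for token in [1, 0, 1, 1]:      # swap, id, swap, swap -> net effect: swap
    u = np.zeros(2)
    u[token] = 1.0
    h = bilinear_update(h, u, A)
```

A purely linear (input-additive) recurrence cannot represent this input-dependent choice of transition, which is the hierarchy gap the summary alludes to.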
arXiv Detail & Related papers (2025-05-27T20:38:19Z) - 'Memory States' from Almost Nothing: Representing and Computing in a Non-associative Algebra [0.0]
This note presents a non-associative framework for the representation and computation of information items in high-dimensional space. It is consistent with the principles of spatial computing and with empirical findings in cognitive science about memory.
arXiv Detail & Related papers (2025-05-13T08:43:02Z) - Long Sequence Hopfield Memory [32.28395813801847]
Sequence memory enables agents to encode, store, and retrieve complex sequences of stimuli and actions.
We introduce a nonlinear interaction term, enhancing separation between the patterns.
We extend this model to store sequences with variable timing between states' transitions.
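The nonlinear interaction term in a sequence memory can be sketched as follows; the polynomial sharpening, shapes, and cyclic pattern ordering are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

# Hedged sketch of sequential Hopfield recall: the overlap with each
# stored pattern is sharpened by an odd polynomial (the nonlinear
# interaction term enhancing separation), then routed to the *next*
# pattern in the stored sequence.
def sequence_recall_step(x, patterns, n=3):
    P, d = patterns.shape
    overlaps = patterns @ x / d          # similarity to each stored pattern
    sharpened = overlaps ** n            # suppresses weak, spurious overlaps
    nxt = np.roll(patterns, -1, axis=0)  # maps pattern mu to pattern mu + 1
    return np.sign(nxt.T @ sharpened)


# Store a sequence of random +/-1 patterns and replay it from the first:
# one recall step should land exactly on the second pattern.
rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(5, 200))
x = sequence_recall_step(patterns[0].copy(), patterns)
```

Raising the overlap to a power is what lets such models store much longer sequences: cross-talk overlaps of order 1/sqrt(d) are cubed into negligibility while the correct overlap of 1 survives.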
arXiv Detail & Related papers (2023-06-07T15:41:03Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - Learning Sequence Representations by Non-local Recurrent Neural Memory [61.65105481899744]
We propose a Non-local Recurrent Neural Memory (NRNM) for supervised sequence representation learning.
Our model is able to capture long-range dependencies and latent high-level features can be distilled by our model.
Our model compares favorably against other state-of-the-art methods specifically designed for each of these sequence applications.
arXiv Detail & Related papers (2022-07-20T07:26:15Z) - Improving Neural Predictivity in the Visual Cortex with Gated Recurrent Connections [0.0]
We aim to shift the focus on architectures that take into account lateral recurrent connections, a ubiquitous feature of the ventral visual stream, to devise adaptive receptive fields.
In order to increase the robustness of our approach and the biological fidelity of the activations, we employ specific data augmentation techniques.
arXiv Detail & Related papers (2022-03-22T17:27:22Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Tensor Representations for Action Recognition [54.710267354274194]
Human actions in sequences are characterized by the complex interplay between spatial features and their temporal dynamics.
We propose novel tensor representations for capturing higher-order relationships between visual features for the task of action recognition.
We use higher-order tensors and so-called Eigenvalue Power Normalization (EPN), which has long been speculated to perform spectral detection of higher-order occurrences.
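Eigenvalue Power Normalization itself can be sketched in a few lines; the parameter name `gamma` and the PSD clipping below are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of Eigenvalue Power Normalization: raise the
# eigenvalues of a symmetric PSD feature matrix to a power gamma < 1,
# flattening the spectrum so weak higher-order co-occurrences are not
# drowned out by dominant ones.
def eigenvalue_power_normalization(M, gamma=0.5):
    w, V = np.linalg.eigh(M)             # M assumed symmetric
    w = np.clip(w, 0.0, None) ** gamma   # clip tiny negatives, then power
    return (V * w) @ V.T                 # reassemble V diag(w**gamma) V^T


# The dominant direction (eigenvalue 4) is compressed toward the weak
# one (eigenvalue 1): 4**0.5 = 2, 1**0.5 = 1.
out = eigenvalue_power_normalization(np.diag([4.0, 1.0]))
```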
arXiv Detail & Related papers (2020-12-28T17:27:18Z) - A Prospective Study on Sequence-Driven Temporal Sampling and Ego-Motion Compensation for Action Recognition in the EPIC-Kitchens Dataset [68.8204255655161]
Action recognition is one of the most challenging research fields in computer vision.
Sequences recorded under ego-motion have become especially relevant.
The proposed method copes with this by estimating the ego-motion, or camera motion.
arXiv Detail & Related papers (2020-08-26T14:44:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.