Exponential Dynamic Energy Network for High Capacity Sequence Memory
- URL: http://arxiv.org/abs/2510.24965v1
- Date: Tue, 28 Oct 2025 20:53:24 GMT
- Title: Exponential Dynamic Energy Network for High Capacity Sequence Memory
- Authors: Arjun Karuvally, Pichsinee Lertsaroj, Terrence J. Sejnowski, Hava T. Siegelmann
- Abstract summary: We introduce the Exponential Dynamic Energy Network (EDEN), a novel architecture that extends the energy paradigm to temporal domains. EDEN offers a scalable and interpretable model for high-capacity temporal memory in both artificial and biological systems.
- Score: 11.724565818034948
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The energy paradigm, exemplified by Hopfield networks, offers a principled framework for memory in neural systems by interpreting dynamics as descent on an energy surface. While powerful for static associative memories, it falls short in modeling sequential memory, where transitions between memories are essential. We introduce the Exponential Dynamic Energy Network (EDEN), a novel architecture that extends the energy paradigm to temporal domains by evolving the energy function over multiple timescales. EDEN combines a static high-capacity energy network with a slow, asymmetrically interacting modulatory population, enabling robust and controlled memory transitions. We formally derive short-timescale energy functions that govern local dynamics and use them to analytically compute memory escape times, revealing a phase transition between static and dynamic regimes. Analyzing capacity, defined as the number of memories that can be stored with minimal error as a function of the number of feature neurons $N$, we show that EDEN achieves exponential sequence memory capacity $O(\gamma^N)$, outperforming the linear capacity $O(N)$ of conventional models. Furthermore, EDEN's dynamics resemble the activity of time cells and ramping cells observed in the human brain during episodic memory tasks, grounding its biological relevance. By unifying static and sequential memory within a dynamic energy framework, EDEN offers a scalable and interpretable model for high-capacity temporal memory in both artificial and biological systems.
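The abstract names the ingredients, a static high-capacity (exponential) energy network tilted by a slow, asymmetrically coupled modulatory population, but not the concrete equations. The NumPy toy below is therefore only a sketch of that style of dynamics, not the paper's model: a log-sum-exp (dense Hopfield) energy over a stored sequence, plus a slow bias that routes the overlap with memory $\mu$ into memory $\mu+1$; every update rule and constant here is an assumption for illustration.

```python
# Illustrative sketch of a dynamic-energy sequence memory in the spirit of
# EDEN. The energy form, timescales, and couplings are assumptions; the
# paper's exact equations are not given in the abstract.
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 8                                  # feature neurons, sequence length
xi = rng.choice([-1.0, 1.0], size=(P, N))     # stored memory sequence xi_0..xi_{P-1}
beta = 4.0                                    # inverse temperature of the dense energy
tau_slow = 20.0                               # slow modulatory timescale

def energy(x, m):
    # Dense (exponential) associative energy, tilted by the slow bias m:
    # E(x) = -(1/beta) * log sum_mu exp(beta * (xi_mu . x + m_mu)).
    z = beta * (xi @ x + m)
    zmax = z.max()                            # log-sum-exp trick avoids overflow
    return -(zmax + np.log(np.exp(z - zmax).sum())) / beta

def step(x, m, dt=0.05):
    # Fast feature dynamics: relax toward a softmax-weighted recall,
    # i.e. approximate descent on the current (tilted) energy surface.
    z = beta * (xi @ x + m)
    p = np.exp(z - z.max()); p /= p.sum()
    x = x + dt * (np.sign(p @ xi) - x)
    # Slow modulatory population: asymmetric coupling routes the overlap
    # with memory mu into a bias on memory mu+1, slowly destabilizing the
    # occupied minimum in favor of its successor.
    m = m + (dt / tau_slow) * (np.roll(xi @ x, 1) - m)
    return x, m

x = xi[0] + 0.3 * rng.standard_normal(N)      # start near the first memory
m = np.zeros(P)
for t in range(4000):
    x, m = step(x, m)
    if t % 500 == 0:
        # Report which stored memory the state currently overlaps most.
        print(t, int(np.argmax(xi @ np.sign(x))), round(energy(x, m), 2))
```

The point of the sketch is the timescale separation: the fast dynamics descend the current energy surface while the slow bias gradually erodes the occupied minimum, a toy analogue of the escape-time behavior the abstract analyzes.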
Related papers
- The AI Hippocampus: How Far are We From Human Memory? [77.04745635827278]
Implicit memory refers to the knowledge embedded within the internal parameters of pre-trained transformers. Explicit memory involves external storage and retrieval components designed to augment model outputs with dynamic, queryable knowledge representations. Agentic memory introduces persistent, temporally extended memory structures within autonomous agents.
arXiv Detail & Related papers (2026-01-14T03:24:08Z) - MemGen: Weaving Generative Latent Memory for Self-Evolving Agents [57.1835920227202]
We propose MemGen, a dynamic generative memory framework that equips agents with a human-esque cognitive faculty. MemGen enables agents to recall and augment latent memory throughout reasoning, producing a tightly interwoven cycle of memory and cognition.
arXiv Detail & Related papers (2025-09-29T12:33:13Z) - Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Latent Structured Hopfield Network for Semantic Association and Retrieval [52.634915010996835]
Episodic memory enables humans to recall past experiences by associating semantic elements such as objects, locations, and time into coherent event representations. We propose the Latent Structured Hopfield Network (LSHN), a framework that integrates continuous Hopfield attractor dynamics into an autoencoder architecture. Unlike traditional Hopfield networks, our model is trained end-to-end with gradient descent, achieving scalable and robust memory retrieval.
arXiv Detail & Related papers (2025-06-02T04:24:36Z) - Modern Hopfield Networks with Continuous-Time Memories [19.616624959353697]
Inspired by psychological theories of continuous neural resource allocation in working memory, we propose an approach that compresses large discrete Hopfield memories into smaller, continuous-time memories.
arXiv Detail & Related papers (2025-02-14T12:41:05Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low-conductance regime, which enables stable and energy-efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Estimation of Energy-dissipation Lower-bounds for Neuromorphic Learning-in-memory [5.073292775065559]
Ideal neuromorphic, neurally-inspired architectures rely on local but parallel parameter updates to solve problems that range from quadratic programming to Ising machines. The analysis presented in this paper captures the out-of-equilibrium thermodynamics of learning, and the resulting energy-efficiency estimates are model-agnostic. To show the practical applicability of our results, we apply our analysis to estimate lower bounds on energy-to-solution metrics for large-scale AI workloads.
arXiv Detail & Related papers (2024-02-21T21:02:11Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Energy-based General Sequential Episodic Memory Networks at the Adiabatic Limit [3.5450828190071655]
We introduce a new class of General Sequential Episodic Memory Models (GSEMM).
The dynamic energy surface is enabled by newly introduced asymmetric synapses with signal propagation delays in the network's hidden layer (a minimal sketch of this delay mechanism appears after this list).
We show that DSEM has a storage capacity that grows exponentially with the number of neurons in the network.
arXiv Detail & Related papers (2022-12-11T18:09:34Z) - Slow manifolds in recurrent networks encode working memory efficiently and robustly [0.0]
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time.
We use a top-down modeling approach to examine network-level mechanisms of working memory.
arXiv Detail & Related papers (2021-01-08T18:47:02Z) - Enhancing associative memory recall and storage capacity using confocal cavity QED [15.696215759892052]
We introduce a near-term experimental platform for realizing an associative memory.
It can simultaneously store many memories by using spinful bosons coupled to a multimode optical cavity.
We show that this nonequilibrium quantum-optical scheme has significant advantages for associative memory over Glauber dynamics.
arXiv Detail & Related papers (2020-09-02T17:59:15Z)
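Returning to the GSEMM entry above (the closest predecessor of EDEN in this list), its transition mechanism, asymmetric synapses with signal propagation delays, can be illustrated with the classical delayed asymmetric-Hebbian recall rule. The sketch below implements that textbook rule, not the paper's exact model; the gain `lam` and delay `D` are assumed constants chosen for illustration.

```python
# Minimal sketch of sequence recall via asymmetric synapses with a signal
# propagation delay. The update rule (classical delayed asymmetric Hebbian
# recall) and all constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 5
xi = rng.choice([-1.0, 1.0], size=(P, N))         # stored sequence xi_0..xi_4

W_sym = (xi.T @ xi) / N                           # symmetric part: stabilizes each memory
W_asym = (np.roll(xi, -1, axis=0).T @ xi) / N     # asymmetric part: maps xi_mu -> xi_{mu+1} (cyclic)

lam, D = 1.5, 15                                  # asymmetric gain and delay (in steps)
x = xi[0].copy()
history = [x.copy()] * (D + 1)                    # buffer of past states: the delay line

for t in range(150):
    delayed = history[-(D + 1)]                   # state from D steps ago
    field = W_sym @ x + lam * (W_asym @ delayed)  # delayed asymmetric drive
    x = np.where(field >= 0, 1.0, -1.0)
    history.append(x.copy())
    if t % 15 == 0:
        print(t, int(np.argmax(xi @ x)))          # index of currently recalled memory
```

The delay line is what makes the energy landscape effectively time-dependent: the symmetric term holds the current memory in place, while the delayed asymmetric term, once it catches up, pushes the state toward its successor, so recall hops through the stored sequence roughly every `D` steps.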