Long Sequence Hopfield Memory
- URL: http://arxiv.org/abs/2306.04532v2
- Date: Thu, 2 Nov 2023 14:55:03 GMT
- Title: Long Sequence Hopfield Memory
- Authors: Hamza Tahir Chaudhry, Jacob A. Zavatone-Veth, Dmitry Krotov, Cengiz
Pehlevan
- Abstract summary: Sequence memory enables agents to encode, store, and retrieve complex sequences of stimuli and actions.
We introduce a nonlinear interaction term, enhancing separation between the patterns.
We extend this model to store sequences with variable timing between states' transitions.
- Score: 32.28395813801847
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequence memory is an essential attribute of natural and artificial
intelligence that enables agents to encode, store, and retrieve complex
sequences of stimuli and actions. Computational models of sequence memory have
been proposed where recurrent Hopfield-like neural networks are trained with
temporally asymmetric Hebbian rules. However, these networks suffer from
limited sequence capacity (maximal length of the stored sequence) due to
interference between the memories. Inspired by recent work on Dense Associative
Memories, we expand the sequence capacity of these models by introducing a
nonlinear interaction term, enhancing separation between the patterns. We
derive novel scaling laws for sequence capacity with respect to network size,
significantly outperforming existing scaling laws for models based on
traditional Hopfield networks, and verify these theoretical results with
numerical simulation. Moreover, we introduce a generalized pseudoinverse rule
to recall sequences of highly correlated patterns. Finally, we extend this
model to store sequences with variable timing between states' transitions and
describe a biologically-plausible implementation, with connections to motor
neuroscience.
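To make the storage-and-recall mechanism concrete, here is a minimal illustrative sketch (not the authors' released code): a sequence of binary patterns is stored with a temporally asymmetric rule, and recall applies a polynomial separation function to the pattern overlaps, in the spirit of Dense Associative Memories. The network size N, sequence length P, interaction degree n, the overlap normalization, and the parallel sign update are all assumptions chosen for illustration.
```python
import numpy as np

# Illustrative sketch of asymmetric sequence recall with a nonlinear
# separation function (assumed parameters, not the paper's exact setup).
rng = np.random.default_rng(0)
N, P, n = 200, 30, 3              # neurons, sequence length, interaction degree

# Random +/-1 patterns forming the stored sequence xi^1, ..., xi^P.
xi = rng.choice([-1.0, 1.0], size=(P, N))

def step(state, xi, n):
    """One parallel update toward the next pattern in the sequence.

    Each stored pattern xi^mu 'votes' for its successor xi^{mu+1},
    weighted by a nonlinear function of its overlap with the current state.
    """
    overlaps = xi[:-1] @ state / N        # m_mu = xi^mu . state / N
    drive = xi[1:].T @ overlaps**n        # successors weighted by m_mu^n
    return np.sign(drive)

# Recall: start from a noisy version of the first pattern and iterate.
state = np.sign(xi[0] + 0.4 * rng.standard_normal(N))
trajectory = []
for _ in range(P - 1):
    state = step(state, xi, n)
    trajectory.append(state.copy())

# Check which stored pattern each recalled state is closest to.
recalled = [int(np.argmax(xi @ s / N)) for s in trajectory]
print("recalled pattern indices:", recalled)   # ideally 1, 2, ..., P-1
```
With a linear separation function (n = 1) this reduces to the classical temporally asymmetric Hebbian sequence rule; raising the degree sharpens the separation between stored patterns, which is the mechanism the abstract describes for increasing sequence capacity.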
Related papers
- Dense Associative Memory Through the Lens of Random Features [48.17520168244209]
Dense Associative Memories are high storage capacity variants of the Hopfield networks.
We show that this novel network closely approximates the energy function and dynamics of conventional Dense Associative Memories.
arXiv Detail & Related papers (2024-10-31T17:10:57Z)
- Explosive neural networks via higher-order interactions in curved statistical manifolds [43.496401697112695]
We introduce curved neural networks as a class of prototypical models for studying higher-order phenomena.
We show that these curved neural networks implement a self-regulating process that can accelerate memory retrieval.
arXiv Detail & Related papers (2024-08-05T09:10:29Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Energy-based General Sequential Episodic Memory Networks at the Adiabatic Limit [3.5450828190071655]
We introduce a new class of General Sequential Episodic Memory Models (GSEMM).
The dynamic energy surface is enabled by newly introduced asymmetric synapses with signal propagation delays in the network's hidden layer.
We show that one model in this class, DSEM, has a storage capacity that grows exponentially with the number of neurons in the network.
arXiv Detail & Related papers (2022-12-11T18:09:34Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Learning Sequence Representations by Non-local Recurrent Neural Memory [61.65105481899744]
We propose a Non-local Recurrent Neural Memory (NRNM) for supervised sequence representation learning.
Our model is able to capture long-range dependencies and to distill latent high-level features.
Our model compares favorably against other state-of-the-art methods specifically designed for each of these sequence applications.
arXiv Detail & Related papers (2022-07-20T07:26:15Z)
- Quantum associative memory with a single driven-dissipative nonlinear oscillator [0.0]
We propose a realization of associative memory with a single driven-dissipative quantum oscillator.
The model can improve the storage capacity of discrete neuron-based systems in a large regime.
We show that the associative-memory capacity is inherently related to the existence of a spectral gap in the Liouvillian superoperator.
arXiv Detail & Related papers (2022-05-19T12:00:35Z)
- Neural Computing with Coherent Laser Networks [0.0]
We show that a coherent network of lasers exhibits emergent neural computing capabilities.
A novel energy-based recurrent neural network handles continuous data, in contrast to Hopfield networks and Boltzmann machines.
arXiv Detail & Related papers (2022-04-05T13:56:34Z)
- Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
arXiv Detail & Related papers (2021-10-26T17:35:21Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided (including the list above) and is not responsible for any consequences of its use.