Energy-based General Sequential Episodic Memory Networks at the
Adiabatic Limit
- URL: http://arxiv.org/abs/2212.05563v1
- Date: Sun, 11 Dec 2022 18:09:34 GMT
- Title: Energy-based General Sequential Episodic Memory Networks at the
Adiabatic Limit
- Authors: Arjun Karuvally, Terry J. Sejnowski, Hava T. Siegelmann
- Abstract summary: We introduce a new class of General Sequential Episodic Memory Models (GSEMM).
The dynamic energy surface is enabled by newly introduced asymmetric synapses with signal propagation delays in the network's hidden layer.
We show that DSEM has a storage capacity that grows exponentially with the number of neurons in the network.
- Score: 3.5450828190071655
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The General Associative Memory Model (GAMM) has a constant
state-dependent energy surface that leads the output dynamics to fixed points,
retrieving single memories from a collection of memories that can be
asynchronously preloaded. We introduce a new class of General Sequential
Episodic Memory Models (GSEMM) that, in the adiabatic limit, exhibit a
temporally changing energy surface, leading to a series of meta-stable states
that are sequential episodic memories.
memories. The dynamic energy surface is enabled by newly introduced asymmetric
synapses with signal propagation delays in the network's hidden layer. We study
the theoretical and empirical properties of two memory models from the GSEMM
class, differing in their activation functions. LISEM has non-linearities in
the feature layer, whereas DSEM has non-linearity in the hidden layer. In
principle, DSEM has a storage capacity that grows exponentially with the number
of neurons in the network. We introduce a learning rule for the synapses based
on the energy minimization principle and show it can learn single memories and
their sequential relationships online. This rule is similar to the Hebbian
learning algorithm and Spike-Timing Dependent Plasticity (STDP), which describe
conditions under which synapses between neurons change strength. Thus, GSEMM
combines the static and dynamic properties of episodic memory under a single
theoretical framework and bridges neuroscience, machine learning, and
artificial intelligence.
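For intuition only, the sketch below implements a generic version of the mechanism the abstract describes: symmetric Hebbian weights hold the network in the current memory, while an asymmetric Hebbian term, fed through a signal-propagation delay, repeatedly pushes the state toward the next memory in the episode. This is a minimal numpy sketch under assumed parameters, not the paper's LISEM or DSEM equations; the constants, the discrete sign update, and the explicit delay line are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                                  # neurons, episode length
xi = rng.choice([-1.0, 1.0], size=(P, N))      # +/-1 memory patterns forming one episode

# Hebbian-style outer products (a simple stand-in for the paper's learning rule):
W_sym = xi.T @ xi / N                          # symmetric part: stabilises each memory
W_asym = np.roll(xi, -1, axis=0).T @ xi / N    # asymmetric part: maps memory k -> k+1

lam, delay = 2.0, 10                           # strength and delay of the asymmetric pathway
x = xi[0].copy()                               # start the recall at the first memory
buffer = [x.copy() for _ in range(delay)]      # crude delay line standing in for the hidden-layer delay
history = [x.copy()]

for t in range(120):
    x_delayed = buffer.pop(0)
    drive = W_sym @ x + lam * (W_asym @ x_delayed)
    x = np.sign(drive)
    x[x == 0] = 1.0
    buffer.append(x.copy())
    history.append(x.copy())

# Overlap of the state with each stored memory: the best match dwells on one
# memory for roughly one delay period, then hops to the next one in the episode,
# i.e. a series of meta-stable states rather than a single fixed point.
overlaps = np.array(history) @ xi.T / N
print(np.argmax(overlaps, axis=1))
```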
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- B'MOJO: Hybrid State Space Realizations of Foundation Models with Eidetic and Fading Memory [91.81390121042192]
We develop a class of models called B'MOJO to seamlessly combine eidetic and fading memory within a composable module.
B'MOJO's ability to modulate eidetic and fading memory results in better inference on longer sequences tested up to 32K tokens.
arXiv Detail & Related papers (2024-07-08T18:41:01Z)
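B'MOJO's internals are not described in the summary above, so the following is only a toy illustration of the eidetic-versus-fading distinction it refers to: an exact sliding window keeps recent inputs verbatim (eidetic), while inputs that fall out of the window are folded into an exponentially decaying state (fading). The class name, mixing rule, and constants are hypothetical, not B'MOJO's design.

```python
import numpy as np

class ToyEideticFadingMemory:
    """Illustrative only: exact window (eidetic) + decayed running state (fading)."""

    def __init__(self, dim, window=4, decay=0.9):
        self.window = window            # how many recent inputs are kept verbatim
        self.decay = decay              # forgetting rate of the fading state
        self.buffer = []                # eidetic part: exact recent inputs
        self.state = np.zeros(dim)      # fading part: compressed older context

    def update(self, x):
        self.buffer.append(np.asarray(x, dtype=float))
        if len(self.buffer) > self.window:
            oldest = self.buffer.pop(0)
            # Inputs that fall out of the exact window are folded into the
            # decaying state instead of being discarded outright.
            self.state = self.decay * self.state + (1 - self.decay) * oldest

    def read(self):
        # A downstream model could attend over the exact buffer and condition
        # on the fading state; here we simply concatenate them.
        return np.concatenate([self.state] + self.buffer)

mem = ToyEideticFadingMemory(dim=3, window=2)
for t in range(5):
    mem.update(np.full(3, float(t)))
print(mem.read())   # fading summary of old steps, followed by the last two inputs
```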
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Memory-and-Anticipation Transformer for Online Action Understanding [52.24561192781971]
We propose a novel memory-anticipation-based paradigm to model an entire temporal structure, including the past, present, and future.
We present Memory-and-Anticipation Transformer (MAT), a memory-anticipation-based approach, to address the online action detection and anticipation tasks.
arXiv Detail & Related papers (2023-08-15T17:34:54Z)
- Long Sequence Hopfield Memory [32.28395813801847]
Sequence memory enables agents to encode, store, and retrieve complex sequences of stimuli and actions.
We introduce a nonlinear interaction term, enhancing separation between the patterns.
We extend this model to store sequences with variable timing between state transitions.
arXiv Detail & Related papers (2023-06-07T15:41:03Z)
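To illustrate how a nonlinear interaction term can enhance separation between stored patterns during sequence recall, the sketch below raises the pattern overlaps to an odd power before each pattern votes for its successor, in the spirit of dense associative memories. The exponent, pattern statistics, and update rule are assumptions for illustration rather than the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, power = 100, 8, 3                       # neurons, sequence length, interaction exponent
xi = rng.choice([-1.0, 1.0], size=(P, N))     # stored sequence of +/-1 patterns

def step(x):
    # Overlap of the current state with every stored pattern.
    m = xi @ x / N
    # Nonlinear interaction: raising overlaps to an odd power suppresses the
    # small, spurious overlaps relative to the dominant one (better separation).
    weights = np.sign(m) * np.abs(m) ** power
    # Each pattern votes for its successor, so the state hops along the sequence.
    drive = np.roll(xi, -1, axis=0).T @ weights
    x_next = np.sign(drive)
    x_next[x_next == 0] = 1.0
    return x_next

x = xi[0].copy()
for t in range(P):
    print("closest pattern:", int(np.argmax(xi @ x / N)))
    x = step(x)
```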
- Quantum associative memory with a single driven-dissipative nonlinear oscillator [0.0]
We propose a realization of associative memory with a single driven-dissipative quantum oscillator.
The model can improve the storage capacity of discrete neuron-based systems in a large regime.
We show that the associative-memory capacity is inherently related to the existence of a spectral gap in the Liouvillian superoperator.
arXiv Detail & Related papers (2022-05-19T12:00:35Z)
- Hierarchical Associative Memory [2.66512000865131]
Associative Memories or Modern Hopfield Networks have many appealing properties.
They can do pattern completion, store a large number of memories, and can be described using a recurrent neural network.
This paper tackles a gap and describes a fully recurrent model of associative memory with an arbitrarily large number of layers.
arXiv Detail & Related papers (2021-07-14T01:38:40Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
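The MPATH equations are not reproduced in the summary above; the sketch below only illustrates the generic homeostatic idea it points to: a leaky membrane potential paired with an activation threshold that drifts so as to hold the firing rate near a target. All constants and the adaptation rule are illustrative assumptions, not the MPATH model itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative constants (not MPATH's): leak factor, adaptation rate, target firing rate.
leak, eta, target_rate = 0.9, 0.01, 0.1

v, theta = 0.0, 1.0          # membrane potential and adaptive activation threshold
spikes = []

for t in range(5000):
    drive = rng.normal(1.0, 0.5)         # noisy external input
    v = leak * v + (1 - leak) * drive    # leaky integration of the input
    spike = float(v > theta)             # fire when the potential crosses the threshold
    if spike:
        v = 0.0                          # reset after firing
    # Homeostasis: the threshold rises after a spike and relaxes otherwise,
    # pushing the long-run firing rate toward the target.
    theta += eta * (spike - target_rate)
    spikes.append(spike)

print("empirical firing rate:", np.mean(spikes[-1000:]))  # settles near target_rate
```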
- Enhancing associative memory recall and storage capacity using confocal cavity QED [15.696215759892052]
We introduce a near-term experimental platform for realizing an associative memory.
It can simultaneously store many memories by using spinful bosons coupled to a multimode optical cavity.
We show that this nonequilibrium quantum-optical scheme has significant advantages for associative memory over Glauber dynamics.
arXiv Detail & Related papers (2020-09-02T17:59:15Z)
- Theory of gating in recurrent neural networks [5.672132510411465]
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience.
Here, we show that gating offers flexible control of two salient features of the collective dynamics.
The gate controlling timescales leads to a novel, marginally stable state, where the network functions as a flexible integrator.
arXiv Detail & Related papers (2020-07-29T13:20:58Z)
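To make the "gate controlling timescales" point above concrete, here is a generic leaky-RNN sketch in which a single gate z interpolates between rapid forgetting (z near 0) and near-perfect integration (z near 1); the parameterisation and constants are illustrative assumptions, not the model analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50
J = rng.normal(0.0, 0.5 / np.sqrt(N), size=(N, N))   # weak random recurrent coupling

def run(z, inputs):
    """One illustrative gated update: h <- z*h + (1-z)*tanh(J h) + u.

    The gate z sets the effective timescale: z -> 1 makes the network a
    near-perfect integrator, z -> 0 makes it forget quickly.
    """
    h = np.zeros(N)
    for u in inputs:
        h = z * h + (1 - z) * np.tanh(J @ h) + u
    return h

pulse = [np.ones(N)] + [np.zeros(N)] * 200    # brief input, then silence

for z in (0.2, 0.99):
    h = run(z, pulse)
    print(f"z={z}: activity remaining after the pulse = {np.linalg.norm(h):.3f}")
# A small z forgets the pulse almost immediately; z close to 1 keeps the
# network near a marginally stable regime that retains it far longer.
```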
- Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory [79.42778415729475]
We propose a novel incrementally trained recurrent architecture targeting explicitly multi-scale learning.
We show how to extend the architecture of a simple RNN by separating its hidden state into different modules.
We discuss a training algorithm where new modules are iteratively added to the model to learn progressively longer dependencies.
arXiv Detail & Related papers (2020-06-29T08:35:49Z)
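As a toy rendering of the multi-scale idea in the summary above, where the hidden state is split into modules and later-added modules update more slowly so they can capture longer dependencies, consider the sketch below; the module update rule, timescales, and class name are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

class ToyMultiScaleMemory:
    """Hidden state split into modules, each a leaky integrator with its own timescale."""

    def __init__(self, dim):
        self.dim = dim
        self.decays = []     # one leak factor per module
        self.states = []     # one state vector per module

    def add_module(self, decay):
        # Incremental training idea: new modules are appended with a slower decay
        # (longer timescale) so they can track progressively longer dependencies.
        self.decays.append(decay)
        self.states.append(np.zeros(self.dim))

    def step(self, x):
        for i, d in enumerate(self.decays):
            self.states[i] = d * self.states[i] + (1 - d) * x
        return np.concatenate(self.states)   # the readout sees all timescales at once

mem = ToyMultiScaleMemory(dim=1)
mem.add_module(decay=0.5)      # fast module: short dependencies
mem.add_module(decay=0.99)     # added later: slow module, long dependencies

for t in range(100):
    out = mem.step(np.array([1.0 if t < 10 else 0.0]))
print(out)   # the fast module has forgotten the early pulse, the slow one has not
```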
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all listed details) and is not responsible for any consequences of its use.