Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural
Networks
- URL: http://arxiv.org/abs/2204.07163v1
- Date: Tue, 5 Apr 2022 17:13:36 GMT
- Title: Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural
Networks
- Authors: Connor Bybee, Alexander Belsten, Friedrich T. Sommer
- Abstract summary: Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta - \gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
- Score: 69.42260428921436
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: An open problem in neuroscience is to explain the functional role of
oscillations in neural networks, contributing, for example, to perception,
attention, and memory. Cross-frequency coupling (CFC) is associated with
information integration across populations of neurons. Impaired CFC is linked
to neurological disease. It is unclear what role CFC has in information
processing and brain functional connectivity. We construct a model of CFC which
predicts a computational role for observed $\theta - \gamma$ oscillatory
circuits in the hippocampus and cortex. Our model predicts that the complex
dynamics in recurrent and feedforward networks of coupled oscillators perform
robust information storage and pattern retrieval. Based on phasor associative
memories (PAM), we present a novel oscillator neural network (ONN) model that
includes subharmonic injection locking (SHIL) and which reproduces experimental
observations of CFC. We show that the presence of CFC increases the memory
capacity of a population of neurons connected by plastic synapses. CFC enables
error-free pattern retrieval whereas pattern retrieval fails without CFC. In
addition, the trade-offs between sparse connectivity, capacity, and information
per connection are identified. The associative memory is based on a
complex-valued neural network, or phasor neural network (PNN). We show that for
values of $Q$ which are the same as the ratio of $\gamma$ to $\theta$
oscillations observed in the hippocampus and the cortex, the associative memory
achieves greater capacity and information storage than previous models. The
novel contributions of this work are providing a computational framework based
on oscillator dynamics which predicts the functional role of neural
oscillations and connecting concepts in neural network theory and dynamical
system theory.
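The phasor associative memory described in the abstract can be sketched as a complex-valued Hopfield-style network: patterns are vectors of unit-magnitude phasors, weights are formed by a Hebbian outer product, and subharmonic injection locking is approximated by snapping each oscillator's phase to one of $Q$ stable states. This is a minimal illustrative sketch, not the paper's implementation; all parameter choices (N, Q, the number of patterns, the corruption rate) are assumptions for demonstration.

```python
import numpy as np

# Minimal sketch of a phasor associative memory (PAM). Phase quantization
# to Q levels stands in for subharmonic injection locking (SHIL); Q plays
# the role of the gamma/theta frequency ratio. Illustrative only.

rng = np.random.default_rng(0)

N = 256           # number of oscillator neurons (assumed)
Q = 4             # number of stable phase states (assumed)
num_patterns = 10 # stored patterns (well below capacity for this N)

# Patterns: unit-magnitude phasors, each phase one of Q discrete values.
phase_idx = rng.integers(0, Q, size=(num_patterns, N))
patterns = np.exp(2j * np.pi * phase_idx / Q)

# Hebbian outer-product weights, W_ij = (1/N) sum_mu xi_i^mu conj(xi_j^mu),
# with self-connections removed.
W = patterns.T @ patterns.conj() / N
np.fill_diagonal(W, 0)

def quantize(z, Q):
    """Snap each phasor to the nearest of Q allowed phases (SHIL analogue)."""
    k = np.round(np.angle(z) * Q / (2 * np.pi))
    return np.exp(2j * np.pi * k / Q)

def retrieve(z, steps=20):
    """Iterate the phasor update; every neuron keeps unit magnitude."""
    for _ in range(steps):
        z = quantize(W @ z, Q)
    return z

# Cue: stored pattern with 20% of entries rotated to a wrong phase.
cue = patterns[0].copy()
corrupt = rng.random(N) < 0.2
cue[corrupt] *= np.exp(2j * np.pi * rng.integers(1, Q, corrupt.sum()) / Q)

out = retrieve(cue)
# Overlap with the stored pattern; 1.0 means exact retrieval (up to a
# global phase rotation, which the magnitude removes).
overlap = np.abs(np.vdot(patterns[0], out)) / N
```

Without the quantization step (i.e., normalizing to unit magnitude but allowing continuous phases), the network reduces to a plain phasor neural network; the discrete phase states introduced by the SHIL analogue are what the abstract credits with improved, error-free retrieval.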
Related papers
- NeuroPath: A Neural Pathway Transformer for Joining the Dots of Human Connectomes [4.362614418491178]
We introduce the concept of topological detour to characterize how a ubiquitous instance of FC is supported by neural pathways (detour) physically wired by SC.
In machine-learning parlance, the multi-hop detour pathway underlying SC-FC coupling allows us to devise a novel multi-head self-attention mechanism.
We propose a biologically inspired deep model, coined NeuroPath, to find putative connectomic feature representations from the unprecedented amount of neuroimages.
arXiv Detail & Related papers (2024-09-26T03:40:12Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Spiking representation learning for associative memories [0.0]
We introduce a novel artificial spiking neural network (SNN) that performs unsupervised representation learning and associative memory operations.
The architecture of our model derives from the neocortical columnar organization and combines feedforward projections for learning hidden representations and recurrent projections for forming associative memories.
arXiv Detail & Related papers (2024-06-05T08:30:11Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture compatible and scalable with deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Self-Evolutionary Reservoir Computer Based on Kuramoto Model [1.7072337666116733]
As a biologically inspired neural network, reservoir computing (RC) has unique advantages in processing information.
We propose a structurally autonomous development reservoir computing model (sad-RC), whose structure can adapt to the specific problem at hand without any human expert knowledge.
arXiv Detail & Related papers (2023-01-25T15:53:39Z) - The Predictive Forward-Forward Algorithm [79.07468367923619]
We propose the predictive forward-forward (PFF) algorithm for conducting credit assignment in neural systems.
We design a novel, dynamic recurrent neural system that learns a directed generative circuit jointly and simultaneously with a representation circuit.
PFF efficiently learns to propagate learning signals and updates synapses with forward passes only.
arXiv Detail & Related papers (2023-01-04T05:34:48Z) - Constraints on the design of neuromorphic circuits set by the properties
of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z) - A Graph Neural Network Framework for Causal Inference in Brain Networks [0.3392372796177108]
A central question in neuroscience is how self-organizing dynamic interactions in the brain emerge on their relatively static backbone.
We present a graph neural network (GNN) framework to describe functional interactions based on structural anatomical layout.
We show that GNNs are able to capture long-term dependencies in data and also scale up to the analysis of large-scale networks.
arXiv Detail & Related papers (2020-10-14T15:01:21Z) - Spatio-Temporal Graph Convolution for Resting-State fMRI Analysis [11.85489505372321]
We train a spatio-temporal graph convolutional network (ST-GCN) on short sub-sequences of the BOLD time series to model the non-stationary nature of functional connectivity.
ST-GCN is significantly more accurate than common approaches in predicting gender and age based on BOLD signals.
arXiv Detail & Related papers (2020-03-24T01:56:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.