Functional connectivity modules in recurrent neural networks: function, origin and dynamics
- URL: http://arxiv.org/abs/2310.20601v1
- Date: Tue, 31 Oct 2023 16:37:01 GMT
- Title: Functional connectivity modules in recurrent neural networks: function, origin and dynamics
- Authors: Jacob Tanner, Sina Mansour L., Ludovico Coletta, Alessandro Gozzi,
Richard F. Betzel
- Abstract summary: We show that modules form spontaneously from asymmetries in the sign and weight of projections from the input layer to the recurrent layer.
We show that modules define connections with similar roles in governing system behavior and dynamics.
- Score: 41.988864091386766
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Understanding the ubiquitous phenomenon of neural synchronization across
species and organizational levels is crucial for decoding brain function.
Despite its prevalence, the specific functional roles, origins, and dynamical
implications of modular structures in correlation-based networks remain
ambiguous. Using recurrent neural networks trained on systems neuroscience
tasks, this study investigates these important characteristics of modularity in
correlation networks. We demonstrate that modules are functionally coherent
units that contribute to specialized information processing. We show that
modules form spontaneously from asymmetries in the sign and weight of
projections from the input layer to the recurrent layer. Moreover, we show that
modules define connections with similar roles in governing system behavior and
dynamics. Collectively, our findings clarify the function, formation, and
operational significance of functional connectivity modules, offering insights
into cortical function and laying the groundwork for further studies on brain
function, development, and dynamics.
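The core measurement pipeline implied by the abstract can be sketched in a few lines: drive a small recurrent network with inputs whose projections have sign-asymmetric structure, then compute the functional connectivity matrix as pairwise correlations of unit activity. Everything below (network sizes, weight scales, the two-channel input layout, the tanh dynamics) is an illustrative assumption, not the trained setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rec, n_in, t_steps = 40, 2, 500

# Hypothetical input asymmetry: channel 0 projects only to the first
# half of recurrent units, channel 1 only to the second half.
W_in = np.zeros((n_rec, n_in))
W_in[: n_rec // 2, 0] = rng.uniform(0.5, 1.0, n_rec // 2)
W_in[n_rec // 2 :, 1] = rng.uniform(0.5, 1.0, n_rec // 2)

# Random recurrent weights scaled to roughly unit spectral radius.
W_rec = rng.normal(0.0, 1.0 / np.sqrt(n_rec), (n_rec, n_rec))

# Run the network on independent noisy input channels.
x = np.zeros(n_rec)
states = np.empty((t_steps, n_rec))
for t in range(t_steps):
    u = rng.normal(0.0, 1.0, n_in)
    x = np.tanh(W_rec @ x + W_in @ u)
    states[t] = x

# Functional connectivity = correlation matrix of unit time series.
fc = np.corrcoef(states.T)

# Units sharing an input channel should form a correlated module.
within = fc[: n_rec // 2, : n_rec // 2].mean()
between = fc[: n_rec // 2, n_rec // 2 :].mean()
print(f"mean within-module FC:  {within:.2f}")
print(f"mean between-module FC: {between:.2f}")
```

Because each half of the network receives a shared input channel, within-module correlations exceed between-module correlations, so modules in the correlation network trace back to the structure of the input projections rather than to the recurrent weights, which are unstructured here.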
Related papers
- Modular Growth of Hierarchical Networks: Efficient, General, and Robust Curriculum Learning [0.0]
We show that for a given classical, non-modular recurrent neural network (RNN), an equivalent modular network will perform better across multiple metrics.
We demonstrate that the inductive bias introduced by the modular topology is strong enough for the network to perform well even when the connectivity within modules is fixed.
Our findings suggest that gradual modular growth of RNNs could provide advantages for learning increasingly complex tasks on evolutionary timescales.
arXiv Detail & Related papers (2024-06-10T13:44:07Z)
- Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain-effective network via the dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE).
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z)
- DSAM: A Deep Learning Framework for Analyzing Temporal and Spatial Dynamics in Brain Networks [4.041732967881764]
Most rs-fMRI studies compute a single static functional connectivity matrix across brain regions of interest.
These approaches risk oversimplifying brain dynamics and fail to properly account for the goal at hand.
We propose a novel interpretable deep learning framework that learns a goal-specific functional connectivity matrix directly from time series.
arXiv Detail & Related papers (2024-05-19T23:35:06Z)
- Astrocytes as a mechanism for meta-plasticity and contextually-guided network function [2.66269503676104]
Astrocytes are a ubiquitous and enigmatic type of non-neuronal cell.
Astrocytes may play a more direct and active role in brain function and neural computation.
arXiv Detail & Related papers (2023-11-06T20:31:01Z)
- Growing Brains: Co-emergence of Anatomical and Functional Modularity in Recurrent Neural Networks [18.375521792153112]
Recurrent neural networks (RNNs) trained on compositional tasks can exhibit functional modularity.
We apply a recent machine learning method, brain-inspired modular training, to a network being trained to solve a set of compositional cognitive tasks.
We find that functional and anatomical clustering emerge together, such that functionally similar neurons also become spatially localized and interconnected.
arXiv Detail & Related papers (2023-10-11T17:58:25Z)
- Emergent Modularity in Pre-trained Transformers [127.08792763817496]
We consider two main characteristics of modularity: functional specialization of neurons and function-based neuron grouping.
We study how modularity emerges during pre-training, and find that the modular structure is stabilized at the early stage.
It suggests that Transformers first construct the modular structure and then learn fine-grained neuron functions.
arXiv Detail & Related papers (2023-05-28T11:02:32Z)
- Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of Spiking Neural Systems [73.18020682258606]
We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel.
We propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP).
Our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Functional2Structural: Cross-Modality Brain Networks Representation Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- A Graph Neural Network Framework for Causal Inference in Brain Networks [0.3392372796177108]
A central question in neuroscience is how self-organizing dynamic interactions in the brain emerge on its relatively static anatomical backbone.
We present a graph neural network (GNN) framework to describe functional interactions based on structural anatomical layout.
We show that GNNs are able to capture long-term dependencies in data and also scale up to the analysis of large-scale networks.
arXiv Detail & Related papers (2020-10-14T15:01:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.