KMS states of Information Flow in Directed Brain Synaptic Networks
- URL: http://arxiv.org/abs/2410.18222v1
- Date: Wed, 23 Oct 2024 18:53:10 GMT
- Title: KMS states of Information Flow in Directed Brain Synaptic Networks
- Authors: El-kaïoum M. Moutuou, Habib Benali
- Abstract summary: We show that the KMS states of the brain's synaptic network yield global statistical measures of neuronal interactions.
- Abstract: The brain's synaptic network, characterized by parallel connections and feedback loops, drives information flow between neurons through a large system with infinitely many degrees of freedom. This system is best modeled by the graph $C^*$-algebra of the underlying directed graph, the Toeplitz-Cuntz-Krieger algebra, which captures the diversity of potential information pathways. Coupled with the gauge action, this graph algebra defines an {\em algebraic quantum system}, and here we demonstrate that its thermodynamic properties provide a natural framework for describing the dynamic mappings of information flow within the network. Specifically, we show that the KMS states of this system yield global statistical measures of neuronal interactions, with computational illustrations based on the {\em C. elegans} synaptic network.
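For readers unfamiliar with the terminology, the KMS condition invoked in the abstract can be stated as follows. This is the standard definition for C*-dynamical systems, not a formula taken from the paper itself:

```latex
% A state \varphi on a C*-algebra A with time evolution
% (\sigma_t)_{t \in \mathbb{R}} is a KMS state at inverse
% temperature \beta if, for all a, b in a dense subalgebra
% of analytic elements,
\varphi\bigl(a\,\sigma_{i\beta}(b)\bigr) = \varphi(ba).
% For the gauge action on a graph algebra, \sigma_t acts on
% each edge generator s_e by \sigma_t(s_e) = e^{it} s_e.
```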
Related papers
- Brain functions emerge as thermal equilibrium states of the connectome
A fundamental paradigm in neuroscience is that cognitive functions are shaped by the brain's structural organization.
We propose a framework in which the functional states of a structural connectome emerge as thermal equilibrium states of an algebraic quantum system.
We apply this framework to the connectome of the nematode {\em Caenorhabditis elegans}.
arXiv Detail & Related papers (2024-08-26T12:35:16Z)
- Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation
Understanding the dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning.
Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks.
We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors.
arXiv Detail & Related papers (2024-05-27T13:36:50Z) - GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks have increasingly become the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
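As an illustration of the histogram-intersection idea, here is a minimal sketch on a toy path graph. The helper names and the binning choices are invented for illustration; this is not the GNN-LoFI implementation:

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Histogram intersection kernel: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()

def local_feature_histograms(features, neighbors, bins=8, value_range=(0.0, 1.0)):
    """For each node, histogram the scalar features of its closed neighborhood."""
    hists = []
    for node, nbrs in neighbors.items():
        vals = features[list(nbrs) + [node]]
        h, _ = np.histogram(vals, bins=bins, range=value_range)
        hists.append(h / max(h.sum(), 1))  # normalize to a distribution
    return np.array(hists)

# Toy graph: 4 nodes on a path, scalar features in [0, 1].
features = np.array([0.1, 0.2, 0.8, 0.9])
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
hists = local_feature_histograms(features, neighbors)
sim = histogram_intersection(hists[0], hists[1])  # similarity of nodes 0 and 1
```

Comparing local feature distributions rather than aggregating messages makes the node update depend on the neighborhood's statistics as a whole, which is the substitution the paper describes.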
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Information Processing by Neuron Populations in the Central Nervous
System: Mathematical Structure of Data and Operations [0.0]
In the intricate architecture of the mammalian central nervous system, neurons form populations.
The precise encoding and operations of these neuron populations have yet to be discovered.
This work illuminates the potential of matrix embeddings in advancing our understanding in fields like cognitive science and AI.
arXiv Detail & Related papers (2023-09-05T15:52:45Z) - DBGDGM: Dynamic Brain Graph Deep Generative Model [63.23390833353625]
Graphs are a natural representation of brain activity derived from functional magnetic resonance imaging (fMRI) data.
It is well known that clusters of anatomical brain regions, known as functional connectivity networks (FCNs), encode temporal relationships which can serve as useful biomarkers for understanding brain function and dysfunction.
Previous works, however, ignore the temporal dynamics of the brain and focus on static graphs.
We propose a dynamic brain graph deep generative model (DBGDGM) which simultaneously clusters brain regions into temporally evolving communities and learns dynamic unsupervised node embeddings.
arXiv Detail & Related papers (2023-01-26T20:45:30Z) - Neuro-symbolic computing with spiking neural networks [0.6035125735474387]
We extend previous work on spike-based graph algorithms by demonstrating how symbolic and multi-relational information can be encoded using spiking neurons.
The introduced framework is enabled by combining the graph embedding paradigm and the recent progress in training spiking neural networks using error backpropagation.
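To make "spiking neurons" concrete, here is a minimal discrete-time leaky integrate-and-fire update in NumPy. This is a generic textbook neuron model with illustrative parameter names; the paper's actual encoding of symbolic and multi-relational information is richer:

```python
import numpy as np

def lif_step(v, input_current, tau=0.9, v_thresh=1.0):
    """One discrete leaky integrate-and-fire step.

    v: membrane potentials (array); input_current: injected current.
    Returns the updated potentials and a binary spike vector."""
    v = tau * v + input_current            # leaky integration
    spikes = (v >= v_thresh).astype(float)
    v = v * (1.0 - spikes)                 # reset neurons that spiked
    return v, spikes

# Drive one neuron with a constant current and count spikes over 10 steps.
v = np.zeros(1)
total_spikes = 0.0
for _ in range(10):
    v, s = lif_step(v, 0.3)
    total_spikes += s.sum()
```

The threshold-and-reset nonlinearity is non-differentiable, which is why training such networks with error backpropagation requires surrogate gradients, the "recent progress" the summary alludes to.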
arXiv Detail & Related papers (2022-08-04T10:49:34Z)
- Learning Graph Structure from Convolutional Mixtures
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
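A bare-bones sketch of the unrolling idea follows. It assumes an l1-sparsity prior and, for simplicity, an identity mixing operator in place of the learned graph-convolutional mixture; all names are illustrative, not the GDN code:

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of the l1 norm (promotes sparse latent graphs)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unrolled_deconvolution(S_obs, num_layers=10, step=0.1, tau=0.01):
    """Unrolled, truncated proximal gradient descent for
    min_S 0.5 * ||S_obs - S||_F^2 + tau * ||S||_1.
    Each loop iteration corresponds to one 'layer' of the network;
    in a GDN, step and tau would be learned per layer."""
    S = np.zeros_like(S_obs)
    for _ in range(num_layers):
        grad = S - S_obs                        # gradient of the fit term
        S = soft_threshold(S - step * grad, step * tau)
        np.fill_diagonal(S, 0.0)                # no self-loops
    return S

S_obs = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
S_hat = unrolled_deconvolution(S_obs)
```

Truncating the iterations and treating the step sizes and thresholds as trainable parameters is what turns the optimizer into a parameterized, inductive architecture.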
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Spiking Graph Convolutional Networks
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of SNNs.
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
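A generic phase-amplitude coupled signal illustrates what theta-gamma CFC looks like. The frequencies and the cosine coupling function are chosen purely for illustration; this is not the paper's circuit model:

```python
import numpy as np

fs = 1000.0                               # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)           # 2 s of signal
theta_phase = 2 * np.pi * 6.0 * t         # ~6 Hz "theta" phase
gamma = np.sin(2 * np.pi * 40.0 * t)      # ~40 Hz "gamma" carrier
envelope = 0.5 * (1.0 + np.cos(theta_phase))   # gamma amplitude tracks theta
signal = np.sin(theta_phase) + envelope * gamma

# Sanity check of the coupling: gamma amplitude should be largest
# near theta peaks (phase ~ 0) and smallest near troughs (phase ~ pi).
phase = np.angle(np.exp(1j * theta_phase))     # wrap phase to (-pi, pi]
amp_at_peaks = envelope[np.abs(phase) < 0.5].mean()
amp_at_troughs = envelope[np.abs(phase) > np.pi - 0.5].mean()
```

In phase-amplitude coupling of this kind, the slow rhythm's phase gates when the fast rhythm is expressed, which is the mechanism proposed to multiplex and integrate information across neuronal populations.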
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Learning Dynamic Graph Representation of Brain Connectome with Spatio-Temporal Attention
We propose STAGIN, a method for learning dynamic graph representations of the brain connectome with spatio-temporal attention.
Experiments on the HCP-Rest and the HCP-Task datasets demonstrate exceptional performance of our proposed method.
arXiv Detail & Related papers (2021-05-27T23:06:50Z)
- Graph Structure of Neural Networks
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.