Network Representation Learning for Biophysical Neural Network Analysis
- URL: http://arxiv.org/abs/2410.11503v1
- Date: Tue, 15 Oct 2024 11:16:54 GMT
- Title: Network Representation Learning for Biophysical Neural Network Analysis
- Authors: Youngmok Ha, Yongjoo Kim, Hyun Jae Jang, Seungyeon Lee, Eunji Pak
- Abstract summary: We introduce a novel BNN analysis framework grounded in network representation learning (NRL).
Our framework integrates a new computational graph (CG)-based BNN representation, a bio-inspired graph attention network (BGAN), and an extensive BNN dataset.
BGAN reflects the compositional structure of neurons, including dendrites, somas, and axons, as well as bidirectional information flows between BNN components.
- Score: 0.7262345640500065
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The analysis of biophysical neural networks (BNNs) has been a longstanding focus in computational neuroscience. A central yet unresolved challenge in BNN analysis lies in deciphering the correlations between neuronal and synaptic dynamics, their connectivity patterns, and the learning process. To address this, we introduce a novel BNN analysis framework grounded in network representation learning (NRL), which leverages attention scores to uncover intricate correlations between network components and their features. Our framework integrates a new computational graph (CG)-based BNN representation, a bio-inspired graph attention network (BGAN) that enables multiscale correlation analysis across BNN representations, and an extensive BNN dataset. The CG-based representation captures key computational features, information flow, and structural relationships underlying neuronal and synaptic dynamics, while BGAN reflects the compositional structure of neurons, including dendrites, somas, and axons, as well as bidirectional information flows between BNN components. The dataset comprises publicly available models from ModelDB, reconstructed using Python and standardized in NeuroML format, and is augmented with data derived from canonical neuron and synapse models. To our knowledge, this study is the first to apply an NRL-based approach to the full spectrum of BNNs and their analysis.
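To make the core idea concrete, below is a minimal sketch of a graph-attention pass over a compartment-level computational graph with bidirectional edges, in the spirit of BGAN. Everything here (the node set, feature dimensions, and the additive scoring function) is an illustrative assumption, not the authors' implementation:

```python
# Minimal, illustrative graph-attention pass over a compartment-level
# computational graph (dendrite -> soma -> axon -> synapse, plus reverse
# edges for bidirectional information flow). Hypothetical shapes and
# parameters throughout; this is not the paper's BGAN.
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: three compartments of one neuron plus a downstream synapse node.
nodes = ["dendrite", "soma", "axon", "synapse"]
edges = [(0, 1), (1, 2), (2, 3)]          # forward information flow
edges += [(j, i) for (i, j) in edges]     # reverse edges: bidirectional flow

F_IN, F_OUT = 4, 8
X = rng.normal(size=(len(nodes), F_IN))   # assumed per-component features
W = rng.normal(size=(F_IN, F_OUT))        # shared linear projection
a = rng.normal(size=2 * F_OUT)            # GAT-style additive attention vector

H = X @ W                                 # projected features, one row per node

def attention(i):
    """Softmax-normalized attention of node i over its in-neighbors (+ self)."""
    nbrs = [s for (s, d) in edges if d == i] + [i]
    scores = np.array([np.tanh(a @ np.concatenate([H[i], H[s]])) for s in nbrs])
    alpha = np.exp(scores - scores.max())
    return nbrs, alpha / alpha.sum()

# One message-passing step: each component mixes neighbor features,
# weighted by its attention coefficients.
H_next = np.stack([
    sum(w * H[s] for s, w in zip(*attention(i))) for i in range(len(nodes))
])

# The attention coefficients themselves are the analysis signal: which
# neighboring component each compartment "attends to" most.
for i, name in enumerate(nodes):
    nbrs, alpha = attention(i)
    print(f"{name:8s} -> " + ", ".join(f"{nodes[s]}={w:.2f}" for s, w in zip(nbrs, alpha)))
```

In the paper's framing, the point of such a pass is precisely the normalized attention coefficients: read per edge, they quantify how strongly each component's features correlate with those of its neighbors across the BNN representation.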
Related papers
- Spiking World Model with Multi-Compartment Neurons for Model-based Reinforcement Learning [6.0483672878162515]
Brain-inspired spiking neural networks (SNNs) have garnered significant research attention in algorithm design and perception applications.
However, their potential in the decision-making domain, particularly in model-based reinforcement learning, remains underexplored.
We propose a multi-compartment neuron model capable of nonlinearly integrating information from multiple dendritic sources to dynamically process long sequential inputs (a minimal two-compartment sketch appears after this list).
arXiv Detail & Related papers (2025-03-02T03:40:10Z)
- Delay Neural Networks (DeNN) for exploiting temporal information in event-based datasets [49.1574468325115]
Delay Neural Networks (DeNN) are designed to explicitly use the exact, continuous timing of spikes in both the forward and backward passes.
Good performance is obtained, especially on datasets where temporal information is important.
arXiv Detail & Related papers (2025-01-10T14:58:15Z)
- Deep Neural Networks via Complex Network Theory: a Perspective [3.1023851130450684]
Deep Neural Networks (DNNs) can be represented as graphs whose links and vertices iteratively process data and solve tasks sub-optimally. Complex Network Theory (CNT), merging statistical physics with graph theory, provides a method for interpreting neural networks by analysing their weights and neuron structures.
In this work, we extend the existing CNT metrics with measures that sample from the DNNs' training distribution, shifting from a purely topological analysis to one that connects with the interpretability of deep learning.
arXiv Detail & Related papers (2024-04-17T08:42:42Z)
- Transferability of coVariance Neural Networks and Application to Interpretable Brain Age Prediction using Anatomical Features [119.45320143101381]
Graph convolutional networks (GCN) leverage topology-driven graph convolutional operations to combine information across the graph for inference tasks.
We have studied GCNs with covariance matrices as graphs, in the form of coVariance neural networks (VNNs).
VNNs inherit the scale-free data processing architecture of GCNs, and we show that VNNs exhibit transferability of performance across datasets whose covariance matrices converge to a limit object (a toy VNN filter sketch also appears after this list).
arXiv Detail & Related papers (2023-05-02T22:15:54Z)
- Multi-compartment Neuron and Population Encoding improved Spiking Neural Network for Deep Distributional Reinforcement Learning [3.036382664997076]
Spiking neural networks (SNNs) exhibit notably low energy consumption and are more suitable for incorporating multi-scale biological characteristics.
In this paper, we propose a brain-inspired SNN-based deep distributional reinforcement learning algorithm that combines a bio-inspired multi-compartment neuron (MCN) model with a population coding method.
arXiv Detail & Related papers (2023-01-18T02:45:38Z)
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
The study of the neural tangent kernel (NTK) has been devoted to typical neural network architectures, but it is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
- On the Study of Sample Complexity for Polynomial Neural Networks [13.265045615849099]
Among various kinds of neural network architectures, polynomial neural networks (PNNs) have recently been shown to be analyzable by spectrum analysis.
In this paper, we extend the analysis of previous literature to PNNs and obtain novel results on their sample complexity.
arXiv Detail & Related papers (2022-07-18T19:10:53Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Modeling Spatio-Temporal Dynamics in Brain Networks: A Comparison of Graph Neural Network Architectures [0.5033155053523041]
Graph neural networks (GNNs) provide a way to interpret new, structured graph signals.
We show that by learning localized functional interactions on the substrate, GNN-based approaches are able to robustly scale to large network studies.
arXiv Detail & Related papers (2021-12-08T12:57:13Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN for relational data.
The results show that KENN is capable of increasing the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)
- Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective [8.047921724008278]
We review the state-of-the-art on the use of GNNs as a model of neural-symbolic computing.
This includes the application of GNNs in several domains as well as its relationship to current developments in neural-symbolic computing.
arXiv Detail & Related papers (2020-02-29T18:55:13Z)
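As flagged in the multi-compartment neuron entry above, a toy sketch helps fix the idea of nonlinearly integrating multiple dendritic sources before somatic integration. The compartment count, time constants, coupling gain, and tanh gating below are all illustrative assumptions, not the MCN model from that paper:

```python
# Two-compartment neuron sketch: per-branch dendritic leaky integration with
# a saturating nonlinearity, coupled into a leaky integrate-and-fire soma.
# All parameters are hypothetical, chosen only so the toy neuron spikes.
import numpy as np

def simulate(inputs, dt=1.0, tau_d=10.0, tau_s=20.0, g_c=2.0, v_th=1.0):
    """inputs: array of shape (T, n_branches). Returns (soma_trace, spike_steps)."""
    n_branches = inputs.shape[1]
    v_d = np.zeros(n_branches)                    # dendritic branch potentials
    v_s = 0.0                                     # somatic potential
    trace, spikes = [], []
    for t, i_ext in enumerate(inputs):
        v_d += dt / tau_d * (-v_d + i_ext)        # leaky dendritic integration
        dend_out = np.tanh(v_d).mean()            # nonlinear per-branch gating, then pooling
        v_s += dt / tau_s * (-v_s + g_c * dend_out)
        if v_s >= v_th:                           # threshold crossing: spike and reset
            spikes.append(t)
            v_s = 0.0
        trace.append(v_s)
    return np.array(trace), spikes

rng = np.random.default_rng(1)
trace, spikes = simulate(rng.uniform(0.0, 3.0, size=(500, 3)))
print(f"{len(spikes)} spikes; first few at steps {spikes[:5]}")
```

Because each branch is squashed by tanh before pooling, two moderate inputs on different branches drive the soma more effectively than one very strong input on a single branch, which is the qualitative behavior multi-compartment models aim for.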
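Likewise, the coVariance neural network entry above admits a short sketch: a VNN layer is a polynomial graph filter whose shift operator is the sample covariance matrix. The filter order, taps, and data below are illustrative assumptions, not the cited paper's implementation:

```python
# Toy coVariance neural network (VNN) layer: y = relu( sum_k h_k C^k x ),
# where the covariance matrix C plays the role of the graph shift operator.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=(200, 6))           # 200 samples, 6 features/nodes
C = np.cov(data, rowvar=False)             # sample covariance = shift operator

def vnn_layer(x, C, taps):
    """Covariance-polynomial filter followed by a pointwise nonlinearity."""
    y = np.zeros_like(x)
    shifted = x.copy()                     # starts at C^0 x
    for h_k in taps:
        y += h_k * shifted
        shifted = C @ shifted              # apply the shift once more per tap
    return np.maximum(y, 0.0)              # ReLU

x = data[0]                                # one sample as the input signal
print(vnn_layer(x, C, taps=[0.5, 0.3, 0.1]))
```

The transferability result summarized above then says, informally, that if two datasets' covariance matrices converge to the same limit object, a filter of this form behaves consistently across them.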