Functional Connectivity Graph Neural Networks
- URL: http://arxiv.org/abs/2508.05786v1
- Date: Thu, 07 Aug 2025 18:59:02 GMT
- Title: Functional Connectivity Graph Neural Networks
- Authors: Yang Li, Luopeiwen Yi, Tananun Songdechakraiwut
- Abstract summary: We propose a graph neural network framework that generalizes this approach to other domains. Our method introduces a functional connectivity block based on persistent graph homology to capture global topological features.
- Score: 3.146587279468621
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world networks often benefit from capturing both local and global interactions. Inspired by multi-modal analysis in brain imaging, where structural and functional connectivity offer complementary views of network organization, we propose a graph neural network framework that generalizes this approach to other domains. Our method introduces a functional connectivity block based on persistent graph homology to capture global topological features. Combined with structural information, this forms a multi-modal architecture called Functional Connectivity Graph Neural Networks. Experiments show consistent performance gains over existing methods, demonstrating the value of brain-inspired representations for graph-level classification across diverse networks.
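The topological ingredient of the abstract can be sketched concretely: 0-dimensional persistent homology of a weighted-graph filtration reduces to a union-find pass over edges sorted by weight, where each merge of two connected components records a "death" time. The function below is a minimal, illustrative implementation of that general idea, not the paper's actual functional connectivity block.

```python
def zero_dim_persistence(num_nodes, weighted_edges):
    """0-dimensional persistence of a graph filtration.

    Nodes are present from the start; edge (u, v, w) enters the
    filtration at value w. Each merge of two components kills one
    component at that edge's weight. Returns the sorted death times,
    a simple global topological descriptor of the weighted graph.
    """
    parent = list(range(num_nodes))

    def find(x):
        # Find the component root, with path halving for efficiency.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    deaths = []
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:          # the edge merges two components
            parent[ru] = rv
            deaths.append(w)  # one component dies at weight w
    return deaths
```

On a triangle with edge weights 0.2, 0.5, and 0.9, the two lightest edges each merge components, so the descriptor is `[0.2, 0.5]`; the heaviest edge closes a cycle and contributes nothing in dimension 0.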
Related papers
- Brain Network Classification Based on Graph Contrastive Learning and Graph Transformer [0.6906005491572401]
This paper proposes a novel model named PHGCL-DDGformer that integrates graph contrastive learning with graph transformers. Experimental results on real-world datasets demonstrate that the PHGCL-DDGformer model outperforms existing state-of-the-art approaches in brain network classification tasks.
arXiv Detail & Related papers (2025-04-01T13:26:03Z) - Topological Neural Networks: Mitigating the Bottlenecks of Graph Neural Networks via Higher-Order Interactions [1.994307489466967]
This work starts with a theoretical framework to reveal the impact of network's width, depth, and graph topology on the over-squashing phenomena in message-passing neural networks.
The work then turns to higher-order interactions and multi-relational inductive biases via Topological Neural Networks.
Inspired by Graph Attention Networks, two topological attention networks are proposed: Simplicial and Cell Attention Networks.
arXiv Detail & Related papers (2024-02-10T08:26:06Z) - DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
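The histogram-intersection idea the title refers to can be sketched as follows: local node-feature distributions are summarized as normalized histograms and compared via bin-wise minima. This is an illustrative fragment assuming scalar node features and a fixed binning, not a reproduction of the GNN-LoFI architecture.

```python
import numpy as np

def local_feature_histogram(features, bins, range_):
    """Normalized histogram of scalar node features in a neighborhood."""
    hist, _ = np.histogram(features, bins=bins, range=range_)
    total = hist.sum()
    return hist / total if total > 0 else hist.astype(float)

def histogram_intersection(h1, h2):
    """Histogram intersection similarity: sum of bin-wise minima.

    For two normalized histograms the result lies in [0, 1], with 1
    meaning identical distributions and 0 meaning disjoint support.
    """
    return float(np.minimum(h1, h2).sum())
```

A neighborhood's feature distribution compared with itself scores 1.0, while histograms with disjoint support score 0.0, making the measure usable as a bounded similarity in place of classical message passing.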
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Joint Graph Convolution for Analyzing Brain Structural and Functional Connectome [11.016035878136034]
We propose to couple the two networks of an individual by adding inter-network edges between corresponding brain regions.
The weights of inter-network edges are learnable, reflecting non-uniform structure-function coupling strength across the brain.
We apply our Joint-GCN to predict age and sex of 662 participants from the public dataset of the National Consortium on Alcohol and Neurodevelopment in Adolescence.
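One simple way to realize inter-network edges between corresponding brain regions is a block adjacency matrix whose diagonal blocks hold the two networks and whose off-diagonal blocks hold a per-region coupling vector (the learnable weights described above). This is an illustrative construction consistent with the description, not the authors' exact Joint-GCN formulation.

```python
import numpy as np

def joint_adjacency(A_struct, A_func, coupling):
    """Couple structural and functional graphs over the same N regions.

    A_struct, A_func: (N, N) adjacency matrices of the two networks.
    coupling: length-N vector of inter-network edge weights, one per
    corresponding region pair (learnable in the original model).
    Returns a (2N, 2N) joint adjacency matrix.
    """
    C = np.diag(coupling)                  # region i <-> region i edges
    top = np.hstack([A_struct, C])
    bottom = np.hstack([C, A_func])
    return np.vstack([top, bottom])
```

Because the off-diagonal blocks are diagonal, each region is linked only to its own counterpart in the other modality, and symmetry of the inputs carries over to the joint graph.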
arXiv Detail & Related papers (2022-10-27T23:43:34Z) - Functional2Structural: Cross-Modality Brain Networks Representation Learning [55.24969686433101]
Graph mining on brain networks may facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
We propose a novel graph learning framework, known as Deep Signed Brain Networks (DSBN), with a signed graph encoder.
We validate our framework on clinical phenotype and neurodegenerative disease prediction tasks using two independent, publicly available datasets.
arXiv Detail & Related papers (2022-05-06T03:45:36Z) - Semi-Supervised Deep Learning for Multiplex Networks [20.671777884219555]
Multiplex networks are complex graph structures in which a set of entities are connected to each other via multiple types of relations.
We present a novel semi-supervised approach for structure-aware representation learning on multiplex networks.
arXiv Detail & Related papers (2021-10-05T13:37:43Z) - Hierarchical Graph Neural Networks [0.0]
This paper aims to connect the dots between the traditional Neural Network and the Graph Neural Network architectures.
A Hierarchical Graph Neural Network architecture is proposed, supplementing the original input network layer with the hierarchy of auxiliary network layers.
It enables simultaneous learning of individual node features and aggregated network features at variable resolution, using the latter to improve the convergence and stability of node feature learning.
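As an illustration of how aggregated network features at a coarser resolution can be obtained, a hard cluster assignment can pool an adjacency matrix into a smaller auxiliary graph. This is a generic, DiffPool-style pooling sketch; the paper's own hierarchy construction may differ.

```python
import numpy as np

def coarsen(A, S):
    """Pool an adjacency matrix with a hard cluster assignment.

    A: (n, n) adjacency matrix of the input network layer.
    S: (n, k) assignment matrix with S[i, c] = 1 if node i belongs
    to cluster c, else 0. Returns the (k, k) adjacency of the
    auxiliary (coarsened) network layer, S^T A S, whose entries
    count edge weight within and between clusters.
    """
    return S.T @ A @ S
```

For a 4-node path graph split into two clusters of two nodes each, the coarsened graph has intra-cluster weight 2 (both directions of one edge) and inter-cluster weight 1, so each auxiliary layer summarizes connectivity at its own resolution.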
arXiv Detail & Related papers (2021-05-07T16:47:18Z) - Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive, and flexible neighborhood-selection-guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embedding with enhanced explainability due to the recognition of individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z) - Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we prove that dynamically adapting network architectures tailored to each domain task, together with weight finetuning, benefits both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z) - Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.