Heterogeneous Graph Convolutional Neural Network via Hodge-Laplacian for
Brain Functional Data
- URL: http://arxiv.org/abs/2302.09323v1
- Date: Sat, 18 Feb 2023 12:58:50 GMT
- Title: Heterogeneous Graph Convolutional Neural Network via Hodge-Laplacian for
Brain Functional Data
- Authors: Jinghan Huang, Moo K. Chung, Anqi Qiu
- Abstract summary: This study proposes a novel heterogeneous graph convolutional neural network (HGCNN) to handle complex brain fMRI data.
We introduce a generic formulation of spectral filters on heterogeneous graphs via the $k$-th Hodge-Laplacian (HL) operator.
We design HL-node, HL-edge, and HL-HGCNN neural networks to learn signal representations at the graph node level, the edge level, and both, respectively.
- Score: 4.80657982213439
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study proposes a novel heterogeneous graph convolutional neural network
(HGCNN) to handle complex brain fMRI data at regional and across-region levels.
We introduce a generic formulation of spectral filters on heterogeneous graphs
by introducing the $k$-th Hodge-Laplacian (HL) operator. In particular, we
propose Laguerre polynomial approximations of HL spectral filters and prove
that their spatial localization on graphs is related to the polynomial order.
Furthermore, based on the bijection property of boundary operators on simplex
graphs, we introduce a generic topological graph pooling (TGPool) method that
can be applied to simplices of any dimension. This study designs HL-node, HL-edge,
and HL-HGCNN neural networks to learn signal representations at the graph node
level, the edge level, and both, respectively. Our experiments employ fMRI from the
Adolescent Brain Cognitive Development (ABCD; n=7693) to predict general
intelligence. Our results demonstrate the advantage of the HL-edge network over
the HL-node network when functional brain connectivity is considered as
features. The HL-HGCNN outperforms state-of-the-art graph neural network (GNN)
approaches such as GAT, BrainGNN, dGCN, BrainNetCNN, and Hypergraph NN.
The functional connectivity features learned from the HL-HGCNN are meaningful
in interpreting neural circuits related to general intelligence.
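The Laguerre-polynomial approximation of spectral filters described in the abstract can be sketched roughly as follows. This is a minimal illustration of the general idea, not the authors' implementation; the function name `laguerre_filter` and the coefficient vector `theta` are hypothetical, and any square Laplacian (a node-level graph Laplacian or a Hodge-Laplacian acting on edge signals) may be passed in. The sketch evaluates the filter via the Laguerre three-term recurrence, so it never diagonalizes the Laplacian:

```python
import numpy as np

def laguerre_filter(lap, x, theta):
    """Apply a spectral filter approximated by Laguerre polynomials.

    lap: (n, n) Laplacian (node-level or Hodge-Laplacian); x: (n,) signal;
    theta: length-K filter coefficients. Returns sum_k theta[k] * P_k(lap) @ x,
    where P_k is the k-th Laguerre polynomial, evaluated with the recurrence
    (k+1) P_{k+1}(t) = (2k + 1 - t) P_k(t) - k P_{k-1}(t).
    """
    p_prev = x.copy()                  # P_0(lap) x = x
    out = theta[0] * p_prev
    if len(theta) == 1:
        return out
    p_curr = x - lap @ x               # P_1(lap) x = (I - lap) x
    out = out + theta[1] * p_curr
    for k in range(1, len(theta) - 1):
        p_next = ((2 * k + 1) * p_curr - lap @ p_curr - k * p_prev) / (k + 1)
        out = out + theta[k + 1] * p_next
        p_prev, p_curr = p_curr, p_next
    return out
```

Because each step costs one sparse matrix-vector product, a filter of polynomial order K touches only K-hop neighborhoods, which is the spatial-localization property the paper proves for its HL filters.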
Related papers
- Compact & Capable: Harnessing Graph Neural Networks and Edge Convolution
for Medical Image Classification [0.0]
We introduce a novel model that combines GNNs and edge convolution, leveraging the interconnectedness of RGB channel feature values to strongly represent connections between crucial graph nodes.
Our proposed model performs on par with state-of-the-art Deep Neural Networks (DNNs) but does so with 1000 times fewer parameters, resulting in reduced training time and data requirements.
arXiv Detail & Related papers (2023-07-24T13:39:21Z) - Graph Neural Networks Provably Benefit from Structural Information: A
Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Parameter Convex Neural Networks [13.42851919291587]
We propose the exponential multilayer neural network (EMLP), which is convex with respect to the network parameters under some conditions.
In later experiments, we use the same architecture to build the exponential graph convolutional network (EGCN) and evaluate it on a graph classification dataset.
arXiv Detail & Related papers (2022-06-11T16:44:59Z) - Walking Out of the Weisfeiler Leman Hierarchy: Graph Learning Beyond
Message Passing [4.272016212825404]
We propose CRaWl, a novel neural network architecture for graph learning.
CRaWl operates fundamentally differently from message-passing graph neural networks.
We prove that the expressiveness of CRaWl is incomparable with that of the Weisfeiler Leman algorithm.
arXiv Detail & Related papers (2021-02-17T14:28:41Z) - Enhance Information Propagation for Graph Neural Network by
Heterogeneous Aggregations [7.3136594018091134]
Graph neural networks are emerging as a continuation of deep learning's success on graph data.
We propose to enhance information propagation among GNN layers by combining heterogeneous aggregations.
We empirically validate the effectiveness of HAG-Net on a number of graph classification benchmarks.
arXiv Detail & Related papers (2021-02-08T08:57:56Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph
Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
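The permutation-equivariance property claimed above can be checked numerically for a polynomial graph filter. This is a hedged sketch under the standard graph-signal-processing setup (not code from the paper); `graph_filter` is a hypothetical name, and the filter is H(S) = sum_k h_k S^k for a symmetric graph shift operator S:

```python
import numpy as np

def graph_filter(s, x, h):
    """Polynomial graph filter: returns sum_k h[k] * S^k @ x."""
    out = np.zeros_like(x)
    p = x.copy()                       # S^0 x
    for coeff in h:
        out = out + coeff * p
        p = s @ p                      # advance to the next power of S
    return out

rng = np.random.default_rng(0)
n = 6
a = rng.random((n, n))
s = np.triu(a, 1)
s = s + s.T                            # symmetric adjacency as shift operator

x = rng.random(n)
h = [0.5, -0.2, 0.1]

perm = rng.permutation(n)
p_mat = np.eye(n)[perm]                # permutation matrix

# Equivariance: relabeling the graph and the signal, then filtering,
# equals filtering first and relabeling the output.
lhs = graph_filter(p_mat @ s @ p_mat.T, p_mat @ x, h)
rhs = p_mat @ graph_filter(s, x, h)
assert np.allclose(lhs, rhs)
```

The identity follows because H(P S P^T) = P H(S) P^T for any polynomial H and permutation matrix P, which is the fundamental property the paper attributes to all architectures built from graph convolutional filters.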
arXiv Detail & Related papers (2020-03-08T13:02:15Z) - Understanding Graph Isomorphism Network for rs-fMRI Functional
Connectivity Analysis [49.05541693243502]
We develop a framework for analyzing fMRI data using the Graph Isomorphism Network (GIN).
One of the important contributions of this paper is the observation that the GIN is a dual representation of convolutional neural network (CNN) in the graph space.
We exploit CNN-based saliency map techniques for the GNN, which we tailor to the proposed GIN with one-hot encoding.
arXiv Detail & Related papers (2020-01-10T23:40:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.