Extended Graph Assessment Metrics for Graph Neural Networks
- URL: http://arxiv.org/abs/2307.10112v2
- Date: Tue, 19 Sep 2023 08:29:02 GMT
- Title: Extended Graph Assessment Metrics for Graph Neural Networks
- Authors: Tamara T. Mueller, Sophie Starck, Leonhard F. Feiner,
Kyriaki-Margarita Bintsi, Daniel Rueckert, Georgios Kaissis
- Abstract summary: We introduce extended graph assessment metrics (GAMs) for regression tasks and continuous adjacency matrices.
We show the correlation of these metrics with model performance on different medical population graphs and under different learning settings.
- Score: 13.49677006107642
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: When re-structuring patient cohorts into so-called population graphs,
initially independent data points can be incorporated into one interconnected
graph structure. This population graph can then be used for medical downstream
tasks using graph neural networks (GNNs). The construction of a suitable graph
structure is a challenging step in the learning pipeline that can have severe
impact on model performance. To this end, different graph assessment metrics
have been introduced to evaluate graph structures. However, these metrics are
limited to classification tasks and discrete adjacency matrices, only covering
a small subset of real-world applications. In this work, we introduce extended
graph assessment metrics (GAMs) for regression tasks and continuous adjacency
matrices. We focus on two GAMs in particular: homophily and
cross-class neighbourhood similarity (CCNS). We extend the notion of
GAMs to more than one hop, define homophily for regression tasks as well as for
continuous adjacency matrices, and propose a light-weight CCNS distance for
discrete and continuous adjacency matrices. We show the correlation of these
metrics with model performance on different medical population graphs and under
different learning settings.
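To make the homophily notion concrete, below is a minimal sketch of edge homophily for a discrete graph, plus a weighted variant for a continuous adjacency matrix in which edge weights act as soft edge indicators. This is an illustrative assumption, not the paper's exact definitions, which also cover multi-hop neighbourhoods and regression labels.

```python
import numpy as np

def homophily_discrete(adj, labels):
    """Classic edge homophily: fraction of edges joining same-label nodes.

    `adj` is a symmetric 0/1 adjacency matrix; `labels` holds one class
    label per node.
    """
    src, dst = np.nonzero(np.triu(adj, k=1))  # each undirected edge once
    if len(src) == 0:
        return 0.0
    return float(np.mean(labels[src] == labels[dst]))

def homophily_continuous(adj, labels):
    """Weighted variant for a continuous adjacency matrix: each edge
    contributes its weight, so strong edges between same-label nodes
    raise the score (an assumed generalization, for illustration only)."""
    w = np.triu(adj, k=1)
    total = w.sum()
    if total == 0:
        return 0.0
    src, dst = np.nonzero(w)
    same = (labels[src] == labels[dst]).astype(float)
    return float((w[src, dst] * same).sum() / total)
```

For a binary adjacency matrix the two functions coincide; as edge weights between differently labelled nodes shrink toward zero, the continuous score approaches 1 even when the discrete score stays fixed.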
Related papers
- Graph Neural Networks with a Distribution of Parametrized Graphs [27.40566674759208]
We introduce latent variables to parameterize and generate multiple graphs.
We obtain the maximum likelihood estimate of the network parameters in an Expectation-Maximization framework.
arXiv Detail & Related papers (2023-10-25T06:38:24Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Graph-in-Graph (GiG): Learning interpretable latent graphs in
non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph
Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS)
BGCN-NRWS uses a Markov Chain Monte Carlo (MCMC) graph sampling algorithm that exploits graph structure and reduces overfitting via a variational inference layer. It yields consistently competitive results compared to the state of the art in semi-supervised node classification.
arXiv Detail & Related papers (2021-12-14T20:58:27Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method recovers a representative graph with a clear cluster structure, which can then be used for downstream clustering.
arXiv Detail & Related papers (2020-10-29T09:58:02Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z) - Latent-Graph Learning for Disease Prediction [44.26665239213658]
We show that it is possible to learn a single, optimal graph towards the GCN's downstream task of disease classification.
Unlike commonly employed spectral GCN approaches, our GCN is spatial and inductive, and can thus be applied to previously unseen patients as well.
arXiv Detail & Related papers (2020-03-27T08:18:01Z) - Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods, unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE) are proposed.
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z) - Differentiable Graph Module (DGM) for Graph Convolutional Networks [44.26665239213658]
Differentiable Graph Module (DGM) is a learnable function that predicts edge probabilities in the graph which are optimal for the downstream task.
We provide an extensive evaluation of applications from the domains of healthcare (disease prediction), brain imaging (age prediction), computer graphics (3D point cloud segmentation), and computer vision (zero-shot learning).
arXiv Detail & Related papers (2020-02-11T12:59:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.