Graph-in-Graph (GiG): Learning interpretable latent graphs in
non-Euclidean domain for biological and healthcare applications
- URL: http://arxiv.org/abs/2204.00323v1
- Date: Fri, 1 Apr 2022 10:01:37 GMT
- Authors: Kamilia Mullakaeva, Luca Cosmo, Anees Kazi, Seyed-Ahmad Ahmadi, Nassir
Navab and Michael M. Bronstein
- Abstract summary: Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
- Score: 52.65389473899139
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs are a powerful tool for representing and analyzing unstructured,
non-Euclidean data ubiquitous in the healthcare domain. Two prominent examples
are molecule property prediction and brain connectome analysis. Importantly,
recent works have shown that considering relationships between input data
samples has a positive regularizing effect on the downstream task in
healthcare applications. These relationships are naturally modeled by a
(possibly unknown) graph structure between input samples. In this work, we
propose Graph-in-Graph (GiG), a neural network architecture for protein
classification and brain imaging applications that exploits the graph
representation of the input data samples and their latent relation. We assume
an initially unknown latent-graph structure between graph-valued input data and
propose to learn end-to-end a parametric model for message passing within and
across input graph samples, along with the latent structure connecting the
input graphs. Further, we introduce a degree distribution loss that helps
regularize the predicted latent relationship structure. This regularization
can significantly improve downstream task performance. Moreover, the obtained latent
graph can represent patient population models or networks of molecule clusters,
providing a level of interpretability and knowledge discovery in the input
domain of particular value in healthcare.
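The core mechanism, learning a latent population graph over graph-valued samples and regularizing its degree distribution, can be sketched in a few lines. The Gaussian-kernel edge model, the mean-aggregation step, and the fixed scalar target degree below are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class LatentGraphLearner(nn.Module):
    """Minimal sketch of GiG-style latent-graph learning.

    Each row of `h` is the embedding of one input graph (e.g., produced by
    a node-level GNN followed by pooling). Edge weights between samples are
    read off a learnable kernel, and a degree penalty keeps the predicted
    structure from collapsing or saturating.
    """

    def __init__(self, dim: int, target_degree: float = 5.0):
        super().__init__()
        self.proj = nn.Linear(dim, dim)                 # learnable latent space
        self.log_sigma = nn.Parameter(torch.zeros(()))  # kernel bandwidth
        self.target_degree = target_degree              # assumed fixed target

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        z = self.proj(h)                                # (N, dim)
        d2 = torch.cdist(z, z).pow(2)                   # pairwise sq. distances
        adj = torch.exp(-d2 / self.log_sigma.exp())     # soft adjacency in (0, 1]
        eye = torch.eye(adj.size(0), device=adj.device)
        return adj * (1.0 - eye)                        # drop self-loops

    def message_pass(self, adj: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return adj @ h / deg                            # aggregate across samples

    def degree_loss(self, adj: torch.Tensor) -> torch.Tensor:
        """Penalize expected node degrees that drift from the target degree."""
        deg = adj.sum(dim=1)
        return (deg - self.target_degree).pow(2).mean()
```

In training, the degree loss would be added to the task loss with a weighting coefficient, so the latent graph stays both useful for prediction and sparse enough to read as a population model.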
Related papers
- A Comparative Study of Population-Graph Construction Methods and Graph
Neural Networks for Brain Age Regression [48.97251676778599]
In medical imaging, population graphs have demonstrated promising results, mostly for classification tasks.
Extracting a population graph is a non-trivial task and can significantly impact the performance of Graph Neural Networks (GNNs).
In this work, we highlight the importance of a meaningful graph construction and experiment with different population-graph construction methods.
arXiv Detail & Related papers (2023-09-26T10:30:45Z)
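As an illustration of what population-graph construction means in practice for the entry above, here is one common baseline, a k-nearest-neighbour graph over subject feature vectors; the paper compares several such schemes, and this particular construction and its parameters are only an assumed example.

```python
import numpy as np

def knn_population_graph(features: np.ndarray, k: int = 10) -> np.ndarray:
    """Connect each subject to its k nearest neighbours in feature space.

    features: (n_subjects, n_features), e.g. imaging-derived descriptors.
    Returns a symmetric boolean adjacency matrix. A common baseline
    construction, not necessarily the one the paper recommends.
    """
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # a subject is not its own neighbour
    nn_idx = np.argsort(d, axis=1)[:, :k]      # indices of the k closest subjects
    adj = np.zeros(d.shape, dtype=bool)
    np.put_along_axis(adj, nn_idx, True, axis=1)
    return adj | adj.T                         # symmetrize the directed k-NN graph
```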
- Extended Graph Assessment Metrics for Graph Neural Networks [13.49677006107642]
We introduce extended graph assessment metrics (GAMs) for regression tasks and continuous adjacency matrices.
We show the correlation of these metrics with model performance on different medical population graphs and under different learning settings.
arXiv Detail & Related papers (2023-07-13T13:55:57Z)
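To make "assessment metrics for continuous adjacency matrices" concrete, a homophily-style score for regression targets can be written as an edge-weighted label disagreement; this particular definition is an illustrative stand-in, not necessarily the paper's exact metric.

```python
import numpy as np

def weighted_label_disagreement(adj: np.ndarray, y: np.ndarray) -> float:
    """Average squared target difference across edges, weighted by a
    continuous adjacency matrix. Lower values mean neighbours carry
    similar regression targets, i.e., the graph is more homophilous.
    """
    diff2 = (y[:, None] - y[None, :]) ** 2     # pairwise label disagreement
    return float((adj * diff2).sum() / (adj.sum() + 1e-12))
```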
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks, including comparison, regression, and classification.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z)
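The distribution-matching idea in the condensation entry above can be approximated with a maximum mean discrepancy (MMD) between multi-hop node summaries of the original and the synthetic graph; both the mean-aggregation "receptive field" summary and the Gaussian kernel below are simplifying assumptions, not the paper's exact objective.

```python
import torch

def receptive_field_summaries(adj: torch.Tensor, x: torch.Tensor,
                              hops: int = 2) -> torch.Tensor:
    """Crude stand-in for per-node receptive fields: repeated mean
    aggregation of node features over the graph."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    h = x
    for _ in range(hops):
        h = adj @ h / deg                      # one hop of mean aggregation
    return h

def mmd(a: torch.Tensor, b: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Gaussian-kernel maximum mean discrepancy between two embedding sets."""
    def k(u, v):
        return torch.exp(-torch.cdist(u, v).pow(2) / (2 * sigma ** 2))
    return k(a, a).mean() + k(b, b).mean() - 2 * k(a, b).mean()
```

Minimizing the MMD between the summaries of the original graph and those of a small learnable synthetic graph, with respect to the synthetic graph's features and adjacency, matches the two receptive-field distributions in the sense described above.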
- Implications of Topological Imbalance for Representation Learning on Biomedical Knowledge Graphs [16.566710222582618]
We show how knowledge graph embedding models can be affected by structural imbalance.
We show how the graph topology can be perturbed to artificially alter the rank of a gene via random, biologically meaningless information.
arXiv Detail & Related papers (2021-12-13T11:20:36Z)
- OOD-GNN: Out-of-Distribution Generalized Graph Neural Network [73.67049248445277]
Graph neural networks (GNNs) have achieved impressive performance when testing and training graph data come from the same distribution.
Existing GNNs lack out-of-distribution generalization ability, so their performance substantially degrades when there are distribution shifts between testing and training graph data.
We propose an out-of-distribution generalized graph neural network (OOD-GNN) for achieving satisfactory performance on unseen testing graphs whose distributions differ from those of the training graphs.
arXiv Detail & Related papers (2021-12-07T16:29:10Z)
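The OOD-GNN entry above reweights training samples so that dimensions of the graph representation become statistically independent; a linear version of that decorrelation penalty looks like the sketch below (the paper additionally uses random Fourier features to remove nonlinear dependence, which this sketch omits).

```python
import torch

def decorrelation_loss(z: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
    """Weighted covariance penalty between all pairs of embedding dimensions.

    z: (N, D) graph-level representations; w: (N,) nonnegative sample
    weights summing to one. Driving the off-diagonal weighted covariances
    to zero decorrelates the representation dimensions under the
    reweighted training distribution. Linear illustration only.
    """
    mu = (w[:, None] * z).sum(dim=0)           # weighted per-dimension mean
    zc = z - mu
    cov = (w[:, None] * zc).T @ zc             # weighted covariance matrix
    off_diag = cov - torch.diag(torch.diag(cov))
    return off_diag.pow(2).sum()
```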
- Generating a Doppelganger Graph: Resembling but Distinct [5.618335078130568]
We propose an approach to generating a doppelganger graph that resembles a given one in many graph properties.
The approach is an orchestration of graph representation learning, generative adversarial networks, and graph realization algorithms.
arXiv Detail & Related papers (2021-01-23T22:08:27Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximate framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
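Dyadic independence means each potential edge is an independent Bernoulli variable, which is what makes the block-approximated ERGs above tractable at scale. Given a fitted edge-probability matrix P, sampling a graph is one pass over the dyads; the dense version below is for illustration, since at millions of nodes one would sample block by block instead.

```python
import numpy as np

def sample_edge_independent(p: np.ndarray, rng=None) -> np.ndarray:
    """Sample an undirected graph from an edge-independent distribution:
    each dyad (i, j) is an independent Bernoulli draw with probability
    p[i, j]. p is assumed symmetric with zeros on the diagonal.
    """
    rng = np.random.default_rng(rng)
    u = rng.random(p.shape)
    upper = np.triu(u < p, k=1)                # decide each dyad exactly once
    return upper | upper.T                     # mirror to the lower triangle
```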
- Differentiable Graph Module (DGM) for Graph Convolutional Networks [44.26665239213658]
Differentiable Graph Module (DGM) is a learnable function that predicts edge probabilities in the graph that are optimal for the downstream task.
We provide an extensive evaluation of applications from the domains of healthcare (disease prediction), brain imaging (age prediction), computer graphics (3D point cloud segmentation), and computer vision (zero-shot learning).
arXiv Detail & Related papers (2020-02-11T12:59:35Z)
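In the spirit of the DGM entry above, a minimal differentiable graph module maps node features to a latent space and reads edge probabilities off pairwise distances with a learnable temperature, keeping graph construction trainable end to end; the exact kernel and the sparsification used in the paper are not reproduced here.

```python
import torch
import torch.nn as nn

class EdgeProbabilityModule(nn.Module):
    """Predict a dense matrix of edge probabilities from node features.

    Closer nodes in the learned latent space receive higher probabilities;
    the temperature t controls how sharply probability decays with
    distance. A sketch in the spirit of DGM, not the paper's exact module.
    """

    def __init__(self, in_dim: int, lat_dim: int):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(in_dim, lat_dim), nn.ReLU(),
            nn.Linear(lat_dim, lat_dim),
        )
        self.t = nn.Parameter(torch.tensor(1.0))  # learnable temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.f(x)                             # (N, lat_dim)
        logits = -self.t * torch.cdist(z, z)      # nearer nodes, larger logit
        return torch.sigmoid(logits)              # edge probabilities in (0, 1)
```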