Diff2Dist: Learning Spectrally Distinct Edge Functions, with
Applications to Cell Morphology Analysis
- URL: http://arxiv.org/abs/2106.15716v1
- Date: Tue, 29 Jun 2021 20:40:22 GMT
- Title: Diff2Dist: Learning Spectrally Distinct Edge Functions, with
Applications to Cell Morphology Analysis
- Authors: Cory Braker Scott, Eric Mjolsness, Diane Oyen, Chie Kodera, David
Bouchez, and Magalie Uyttewaal
- Abstract summary: We present a method for learning "spectrally descriptive" edge weights for graphs.
We generalize a previously known distance measure on graphs (Graph Diffusion Distance).
We also demonstrate a further application of this method to biological image analysis.
- Score: 4.133143218285944
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a method for learning "spectrally descriptive" edge weights for
graphs. We generalize a previously known distance measure on graphs (Graph
Diffusion Distance), thereby allowing it to be tuned to minimize an arbitrary
loss function. Because all steps involved in calculating this modified GDD are
differentiable, we demonstrate that it is possible for a small neural network
model to learn edge weights which minimize loss. GDD alone does not effectively
discriminate between graphs constructed from shoot apical meristem images of
wild-type vs. mutant Arabidopsis thaliana specimens. However, training
edge weights and kernel parameters with contrastive loss produces a learned
distance metric with large margins between these graph categories. We
demonstrate this by showing improved performance of a simple
k-nearest-neighbors classifier on the learned distance matrix. We also
demonstrate a further application of this method to biological image analysis:
once trained, we use our model to compute the distance between the biological
graphs and a set of graphs output by a cell division simulator. This allows us
to identify simulation parameter regimes which are similar to each class of
graph in our original dataset.
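The base (non-learned) Graph Diffusion Distance compares the heat kernels exp(-tL) of two same-size graphs and takes the largest discrepancy over diffusion times t; the paper's contribution is to make this pipeline differentiable and place learnable edge weights inside the Laplacian. A minimal numpy sketch of the base distance, under the assumption of equal-size graphs (function names are illustrative, not the authors' code):

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

def heat_kernel(adj, t):
    """exp(-t L), computed via the Laplacian eigendecomposition."""
    w, v = np.linalg.eigh(laplacian(adj))
    return v @ np.diag(np.exp(-t * w)) @ v.T

def gdd(adj1, adj2, ts=np.linspace(0.01, 5.0, 100)):
    """Graph diffusion distance: the largest Frobenius-norm gap
    between the two heat kernels over a grid of diffusion times."""
    return max(
        np.linalg.norm(heat_kernel(adj1, t) - heat_kernel(adj2, t), "fro")
        for t in ts
    )
```

Because every step (eigendecomposition, matrix exponential, norm) is differentiable, the same computation written in an autodiff framework lets gradients flow back to per-edge weights, which is what the contrastive training described above exploits.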
Related papers
- Learning Cartesian Product Graphs with Laplacian Constraints [10.15283812819547]
We study the problem of learning Cartesian product graphs under Laplacian constraints.
We establish statistical consistency for the penalized maximum likelihood estimation.
We also extend our method for efficient joint graph learning and imputation in the presence of structural missing values.
arXiv Detail & Related papers (2024-02-12T22:48:30Z)
- Signal Processing in the Retina: Interpretable Graph Classifier to Predict Ganglion Cell Responses [26.403303281303092]
We learn an interpretable graph-based classifier to predict the firings of ganglion cells in response to visual stimuli.
Our framework can be applied to other biological systems with pre-chosen features that require interpretation.
arXiv Detail & Related papers (2024-01-03T16:15:22Z)
- Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z)
- Towards Accurate Subgraph Similarity Computation via Neural Graph Pruning [22.307526272085024]
In this work, we convert graph pruning to a problem of node relabeling and then relax it to a differentiable problem.
Based on this idea, we further design a novel neural network to approximate a type of subgraph distance: the subgraph edit distance (SED).
In the design of the model, we propose an attention mechanism to leverage the information about the query graph and guide the pruning of the target graph.
arXiv Detail & Related papers (2022-10-19T15:16:28Z)
- Template based Graph Neural Network with Optimal Transport Distances [11.56532171513328]
Current Graph Neural Networks (GNN) architectures rely on two important components: node features embedding through message passing, and aggregation with a specialized form of pooling.
In this work, we propose a novel point of view which places distances to some learnable graph templates at the core of the graph representation.
This distance embedding is constructed thanks to an optimal transport distance: the Fused Gromov-Wasserstein (FGW) distance.
arXiv Detail & Related papers (2022-05-31T12:24:01Z)
- Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined Discrepancy-based Self-supervised LeArning (D-SLA).
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
arXiv Detail & Related papers (2022-02-07T08:04:59Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Learnable Graph Matching: Incorporating Graph Partitioning with Deep Feature Learning for Multiple Object Tracking [58.30147362745852]
Data association across frames is at the core of Multiple Object Tracking (MOT) task.
Existing methods mostly ignore the context information among tracklets and intra-frame detections.
We propose a novel learnable graph matching method to address these issues.
arXiv Detail & Related papers (2021-03-30T08:58:45Z)
- Detection of Alzheimer's Disease Using Graph-Regularized Convolutional Neural Network Based on Structural Similarity Learning of Brain Magnetic Resonance Images [3.478478232710667]
This paper presents an Alzheimer's disease (AD) detection method based on learning structural similarity between Magnetic Resonance Images (MRIs).
We construct the similarity graph using embedded features of the input images, which fall into four classes: Non-Demented (ND), Very Mild Demented (VMD), Mild Demented (MD), and Moderate Demented (MDTD).
arXiv Detail & Related papers (2021-02-25T14:49:50Z)
- Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to clusters that are representative with respect to the observed layers, and that the learned graph improves performance on clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
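The line-graph construction behind that last entry is simple to sketch. The illustrative helper below (not from the paper) builds the line graph of an edge list, so that predicting a link in the original graph becomes classifying the corresponding node in the line graph:

```python
from itertools import combinations

def line_graph(edges):
    """One line-graph node per original edge; two line-graph nodes
    are adjacent iff their original edges share an endpoint."""
    nodes = [frozenset(e) for e in edges]
    lg_edges = {
        frozenset({i, j})
        for (i, a), (j, b) in combinations(enumerate(nodes), 2)
        if a & b  # nonempty intersection = shared endpoint
    }
    return nodes, lg_edges
```

For example, the triangle (three edges, all pairwise incident) maps to a line graph that is again a triangle, while a two-edge path maps to a single line-graph edge.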
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.