Signal Processing in the Retina: Interpretable Graph Classifier to
Predict Ganglion Cell Responses
- URL: http://arxiv.org/abs/2401.01813v1
- Date: Wed, 3 Jan 2024 16:15:22 GMT
- Title: Signal Processing in the Retina: Interpretable Graph Classifier to
Predict Ganglion Cell Responses
- Authors: Yasaman Parhizkar, Gene Cheung, Andrew W. Eckford
- Abstract summary: We learn an interpretable graph-based classifier to predict the firings of ganglion cells in response to visual stimuli.
Our framework can be applied to other biological systems with pre-chosen features that require interpretation.
- Score: 26.403303281303092
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: It is a popular hypothesis in neuroscience that ganglion cells in the retina
are activated by selectively detecting visual features in an observed scene.
While ganglion cell firings can be predicted via data-trained deep neural nets,
the networks remain indecipherable, thus providing little understanding of the
cells' underlying operations. To extract knowledge from the cell firings, in
this paper we learn an interpretable graph-based classifier from data to
predict the firings of ganglion cells in response to visual stimuli.
Specifically, we learn a positive semi-definite (PSD) metric matrix $\mathbf{M}
\succeq 0$ that defines Mahalanobis distances between graph nodes (visual
events) endowed with pre-computed feature vectors; the computed inter-node
distances lead to edge weights and a combinatorial graph that is amenable to
binary classification. Mathematically, we define the objective of metric matrix
$\mathbf{M}$ optimization using a graph adaptation of large margin nearest
neighbor (LMNN), which is rewritten as a semi-definite programming (SDP)
problem. We solve it efficiently via a fast approximation called Gershgorin
disc perfect alignment (GDPA) linearization. The learned metric matrix
$\mathbf{M}$ provides interpretability: important features are identified along
$\mathbf{M}$'s diagonal, and their mutual relationships are inferred from
off-diagonal terms. Our fast metric learning framework can be applied to other
biological systems with pre-chosen features that require interpretation.
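To make the pipeline above concrete, below is a minimal, hypothetical Python sketch (not the authors' code) of the graph construction and the interpretability readout: it computes Mahalanobis distances $(\mathbf{f}_i - \mathbf{f}_j)^\top \mathbf{M} (\mathbf{f}_i - \mathbf{f}_j)$ between node feature vectors, turns them into edge weights via an exponential kernel (the abstract only states that distances lead to edge weights, so this kernel choice is an assumption), ranks features by $\mathbf{M}$'s diagonal, and includes a sufficient PSD test based on Gershgorin discs, the objects that GDPA linearization works with. All function names, the kernel, and the toy data are illustrative assumptions.

```python
import numpy as np

def mahalanobis_dist(f_i, f_j, M):
    """Squared Mahalanobis distance (f_i - f_j)^T M (f_i - f_j) for a PSD matrix M."""
    d = f_i - f_j
    return float(d @ M @ d)

def build_edge_weights(F, M):
    """Turn pairwise Mahalanobis distances into edge weights of a combinatorial graph.

    An exponential kernel w_ij = exp(-d_M(f_i, f_j)) is assumed here; the abstract
    only states that inter-node distances lead to edge weights.
    """
    N = F.shape[0]
    W = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            W[i, j] = W[j, i] = np.exp(-mahalanobis_dist(F[i], F[j], M))
    return W

def gershgorin_psd_sufficient(M):
    """Sufficient (not necessary) PSD test for symmetric M: every Gershgorin disc
    left-end is non-negative, i.e. M[i, i] >= sum_{j != i} |M[i, j]| for all rows i."""
    diag = np.diag(M)
    radii = np.abs(M).sum(axis=1) - np.abs(diag)
    return bool(np.all(diag >= radii))

def rank_features(M, feature_names):
    """Rank features by the diagonal of M; larger diagonal entries indicate
    features that contribute more to the learned distance."""
    order = np.argsort(-np.diag(M))
    return [(feature_names[k], float(M[k, k])) for k in order]

# Toy usage with random data (purely illustrative).
rng = np.random.default_rng(0)
F = rng.normal(size=(5, 3))        # 5 visual events, 3 pre-computed features each
A = rng.normal(size=(3, 3))
M = A @ A.T                        # A A^T is PSD by construction, so M >= 0
W = build_edge_weights(F, M)
print(gershgorin_psd_sufficient(M))
print(rank_features(M, ["feat_a", "feat_b", "feat_c"]))
```

Because the Gershgorin condition is only sufficient, a PSD matrix such as the toy $\mathbf{M}$ above may still fail this simple test; roughly speaking, the GDPA linearization cited in the abstract tightens that gap by applying a similarity transform that aligns the disc left-ends before checking them.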
Related papers
- Cell Graph Transformer for Nuclei Classification [78.47566396839628]
We develop a cell graph transformer (CGT) that treats nodes and edges as input tokens to enable learnable adjacency and information exchange among all nodes.
Poorly initialized features can lead to noisy self-attention scores and inferior convergence.
We propose a novel topology-aware pretraining method that leverages a graph convolutional network (GCN) to learn a feature extractor.
arXiv Detail & Related papers (2024-02-20T12:01:30Z)
- Extended Graph Assessment Metrics for Graph Neural Networks [13.49677006107642]
We introduce extended graph assessment metrics (GAMs) for regression tasks and continuous adjacency matrices.
We show the correlation of these metrics with model performance on different medical population graphs and under different learning settings.
arXiv Detail & Related papers (2023-07-13T13:55:57Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose Graph Fourier MMD (GFMMD), a novel distance between distributions and signals on graphs.
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes the difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z)
- From Latent Graph to Latent Topology Inference: Differentiable Cell Complex Module [21.383018558790674]
Differentiable Cell Complex Module (DCM) is a novel learnable function that computes cell probabilities in the complex to improve the downstream task.
We show how to integrate DCM with cell complex message passing network layers and train it in an end-to-end fashion.
Our model is tested on several homophilic and heterophilic graph datasets and it is shown to outperform other state-of-the-art techniques.
arXiv Detail & Related papers (2023-05-25T15:33:19Z)
- Tree Mover's Distance: Bridging Graph Metrics and Stability of Graph Neural Networks [54.225220638606814]
We propose a pseudometric for attributed graphs, the Tree Mover's Distance (TMD), and study its relation to generalization.
First, we show that TMD captures properties relevant to graph classification; a simple TMD-SVM performs competitively with standard GNNs.
Second, we relate TMD to generalization of GNNs under distribution shifts, and show that it correlates well with performance drop under such shifts.
arXiv Detail & Related papers (2022-10-04T21:03:52Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Graph Neural Network for Cell Tracking in Microscopy Videos [0.0]
We present a novel graph neural network (GNN) approach for cell tracking in microscopy videos.
By modeling the entire time-lapse sequence as a directed graph, we extract the entire set of cell trajectories.
We exploit a deep metric learning algorithm to extract cell feature vectors that distinguish between instances of different biological cells.
arXiv Detail & Related papers (2022-02-09T21:21:48Z)
- Diff2Dist: Learning Spectrally Distinct Edge Functions, with Applications to Cell Morphology Analysis [4.133143218285944]
We present a method for learning "spectrally descriptive" edge weights for graphs.
We generalize a previously known distance measure on graphs, the Graph Diffusion Distance.
We also demonstrate a further application of this method to biological image analysis.
arXiv Detail & Related papers (2021-06-29T20:40:22Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.