Hypergraph Neural Networks Reveal Spatial Domains from Single-cell Transcriptomics Data
- URL: http://arxiv.org/abs/2410.19868v1
- Date: Wed, 23 Oct 2024 23:32:50 GMT
- Title: Hypergraph Neural Networks Reveal Spatial Domains from Single-cell Transcriptomics Data
- Authors: Mehrad Soltani, Luis Rueda
- Abstract summary: Spatial clustering is important for the classification of tissue samples into diverse subpopulations of cells.
Our model has demonstrated exceptional performance, achieving the highest iLISI score of 1.843 compared to other methods.
Our model outperforms other methods in downstream clustering, achieving the highest ARI value of 0.51 and Leiden score of 0.60.
- Score: 0.0
- License:
- Abstract: The task of spatial clustering of transcriptomics data is of paramount importance. It enables the classification of tissue samples into diverse subpopulations of cells, which, in turn, facilitates the analysis of the biological functions of clusters, tissue reconstruction, and cell-cell interactions. Many approaches leverage gene expression, spatial locations, and histological images to detect spatial domains; however, Graph Neural Networks (GNNs), the current state-of-the-art models, are limited by their assumption of pairwise connections between nodes. In spatial-domain detection, some cells that are not directly related are nevertheless grouped into the same domain, which shows the inability of GNNs to capture implicit connections among cells. Whereas a graph edge connects exactly two nodes, a hyperedge connects an arbitrary number of nodes, which lets Hypergraph Neural Networks (HGNNs) capture and exploit richer and more complex structural information than traditional GNNs. Because ground-truth labels are unavailable, we use autoencoders, which are well suited for unsupervised learning. Our model has demonstrated exceptional performance, achieving the highest iLISI score of 1.843 compared to other methods; this score indicates the greatest diversity of cell types identified by our method. Furthermore, our model outperforms other methods in downstream clustering, achieving the highest ARI value of 0.51 and Leiden score of 0.60.
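As a rough, hedged illustration of the two ingredients the abstract describes (hyperedges that join an arbitrary number of cells, and an autoencoder trained without labels), the sketch below pairs an HGNN-style hypergraph convolution, built from the standard normalized incidence-matrix operator, with a reconstruction loss. This is not the authors' implementation; the incidence matrix, layer sizes, and toy data are assumptions made only for the example.

```python
# Minimal sketch (not the paper's code): hypergraph convolution + autoencoder.
import torch
import torch.nn as nn
import torch.nn.functional as F


def hypergraph_laplacian(H: torch.Tensor) -> torch.Tensor:
    """Normalized hypergraph operator D_v^{-1/2} H D_e^{-1} H^T D_v^{-1/2}.

    H is a dense |V| x |E| incidence matrix (1 if a cell belongs to a
    hyperedge, 0 otherwise); unit hyperedge weights are assumed here.
    """
    dv = H.sum(dim=1).clamp(min=1)          # vertex degrees
    de = H.sum(dim=0).clamp(min=1)          # hyperedge degrees
    Dv_inv_sqrt = torch.diag(dv.pow(-0.5))
    De_inv = torch.diag(de.pow(-1.0))
    return Dv_inv_sqrt @ H @ De_inv @ H.t() @ Dv_inv_sqrt


class HypergraphAutoencoder(nn.Module):
    """Two hypergraph-conv layers encode cells; a linear decoder reconstructs
    gene expression, so no cluster labels are needed during training."""

    def __init__(self, in_dim: int, hid_dim: int, z_dim: int):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, z_dim)
        self.decoder = nn.Linear(z_dim, in_dim)

    def forward(self, X, L):
        h = F.relu(L @ self.w1(X))   # hyperedge-aware message passing
        z = L @ self.w2(h)           # latent embedding per cell
        return z, self.decoder(z)    # embedding + reconstructed expression


# Toy usage: 100 "cells", 50 "genes", and a random stand-in incidence matrix;
# in practice, each hyperedge would group spatially neighboring cells.
X = torch.randn(100, 50)
H = (torch.rand(100, 20) < 0.15).float()
L = hypergraph_laplacian(H)
model = HypergraphAutoencoder(50, 64, 16)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(10):                          # short demo loop
    z, X_hat = model(X, L)
    loss = F.mse_loss(X_hat, X)              # reconstruction objective
    opt.zero_grad(); loss.backward(); opt.step()
```

In a real pipeline, the learned embeddings z would then be clustered (e.g., with Leiden) and evaluated with iLISI and ARI (sklearn.metrics.adjusted_rand_score) against a reference annotation, as the abstract reports.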
Related papers
- Cell Graph Transformer for Nuclei Classification [78.47566396839628]
We develop a cell graph transformer (CGT) that treats nodes and edges as input tokens to enable learnable adjacency and information exchange among all nodes.
Poor features can lead to noisy self-attention scores and inferior convergence.
We propose a novel topology-aware pretraining method that leverages a graph convolutional network (GCN) to learn a feature extractor.
arXiv Detail & Related papers (2024-02-20T12:01:30Z) - Population Graph Cross-Network Node Classification for Autism Detection Across Sample Groups [10.699937593876669]
Cross-network node classification extends GNN techniques to account for domain drift.
We present OTGCN, a powerful, novel approach to cross-network node classification.
We demonstrate the effectiveness of this approach at classifying Autism Spectrum Disorder subjects.
arXiv Detail & Related papers (2024-01-10T18:04:12Z) - Compact & Capable: Harnessing Graph Neural Networks and Edge Convolution for Medical Image Classification [0.0]
We introduce a novel model that combines GNNs and edge convolution, leveraging the interconnectedness of RGB channel feature values to strongly represent connections between crucial graph nodes.
Our proposed model performs on par with state-of-the-art Deep Neural Networks (DNNs) but does so with 1000 times fewer parameters, resulting in reduced training time and data requirements.
arXiv Detail & Related papers (2023-07-24T13:39:21Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Zero-shot Domain Adaptation of Heterogeneous Graphs via Knowledge Transfer Networks [72.82524864001691]
Heterogeneous graph neural networks (HGNNs) have shown superior performance as powerful representation learning techniques.
There is no direct way to learn using labels rooted at different node types.
In this work, we propose a novel domain adaptation method, Knowledge Transfer Networks for HGNNs (HGNN-KTN).
arXiv Detail & Related papers (2022-03-03T21:00:23Z) - Graph Neural Network for Cell Tracking in Microscopy Videos [0.0]
We present a novel graph neural network (GNN) approach for cell tracking in microscopy videos.
By modeling the entire time-lapse sequence as a directed graph, we extract the entire set of cell trajectories.
We exploit a deep metric learning algorithm to extract cell feature vectors that distinguish between instances of different biological cells.
arXiv Detail & Related papers (2022-02-09T21:21:48Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN); a minimal sketch of a related over-smoothing proxy follows this entry.
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
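The DGN entry above mentions two over-smoothing metrics. As a hedged illustration (not the metrics from that paper), the snippet below computes a common over-smoothing proxy, the mean average cosine distance (MAD) between node embeddings, which drops toward zero as deeper layers collapse node representations toward each other; the toy embeddings are made up for the example.

```python
# Illustrative over-smoothing proxy (MAD-style), not the DGN paper's metrics.
import torch
import torch.nn.functional as F


def mean_average_distance(X: torch.Tensor) -> float:
    """Average pairwise cosine distance over all node pairs in X (n x d)."""
    Xn = F.normalize(X, dim=1)               # unit-norm rows
    cos = Xn @ Xn.t()                        # pairwise cosine similarities
    n = X.size(0)
    off_diag = cos.sum() - cos.diagonal().sum()
    return float(1.0 - off_diag / (n * (n - 1)))


# Example: embeddings from a shallow layer vs. a collapsed ("over-smoothed") one.
shallow = torch.randn(200, 32)
deep = shallow.mean(dim=0, keepdim=True).repeat(200, 1) + 0.01 * torch.randn(200, 32)
print(mean_average_distance(shallow))        # relatively large
print(mean_average_distance(deep))           # close to 0: representations collapsed
```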
This list is automatically generated from the titles and abstracts of the papers in this site.