Recovering Time-Varying Networks From Single-Cell Data
- URL: http://arxiv.org/abs/2410.01853v1
- Date: Tue, 1 Oct 2024 19:18:51 GMT
- Title: Recovering Time-Varying Networks From Single-Cell Data
- Authors: Euxhen Hasanaj, Barnabás Póczos, Ziv Bar-Joseph
- Abstract summary: We develop a deep neural network, Marlene, to infer dynamic graphs from time series single-cell gene expression data.
Marlene can identify gene interactions relevant to specific biological responses, including COVID-19 immune response, fibrosis, and aging.
- Score: 11.04189396013616
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gene regulation is a dynamic process that underlies all aspects of human development, disease response, and other key biological processes. The reconstruction of temporal gene regulatory networks has conventionally relied on regression analysis, graphical models, or other types of relevance networks. With the large increase in time series single-cell data, new approaches are needed to address the unique scale and nature of this data for reconstructing such networks. Here, we develop a deep neural network, Marlene, to infer dynamic graphs from time series single-cell gene expression data. Marlene constructs directed gene networks using a self-attention mechanism where the weights evolve over time using recurrent units. By employing meta learning, the model is able to recover accurate temporal networks even for rare cell types. In addition, Marlene can identify gene interactions relevant to specific biological responses, including COVID-19 immune response, fibrosis, and aging.
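As a rough illustration of the mechanism described in the abstract, the sketch below computes a self-attention matrix over genes at each time point while a GRU cell evolves the query/key projections across time. All module and parameter names are ours, not Marlene's, dimensions are arbitrary, and the meta-learning loop used to handle rare cell types is omitted.
```python
import torch
import torch.nn as nn


class RecurrentGeneAttention(nn.Module):
    """Illustrative module: per-time-point self-attention over genes whose
    query/key projections are re-parameterised at every step by a GRU cell."""

    def __init__(self, d_model: int = 16):
        super().__init__()
        self.d_model = d_model
        self.embed = nn.Linear(1, d_model)                   # per-gene expression -> embedding
        self.gru = nn.GRUCell(d_model, 2 * d_model * d_model)
        self.register_buffer("h0", torch.zeros(1, 2 * d_model * d_model))

    def forward(self, expr: torch.Tensor) -> torch.Tensor:
        """expr: (T, n_genes) expression per time point. Returns a
        (T, n_genes, n_genes) stack of attention matrices, read as
        directed, time-varying gene-gene edge weights."""
        h = self.h0                                          # recurrent state carrying the projections
        networks = []
        for t in range(expr.shape[0]):
            x = self.embed(expr[t].unsqueeze(-1))            # (n_genes, d_model)
            h = self.gru(x.mean(dim=0, keepdim=True), h)     # evolve projection parameters over time
            Wq, Wk = h.squeeze(0).view(2, self.d_model, self.d_model)
            q, k = x @ Wq, x @ Wk
            attn = torch.softmax(q @ k.T / self.d_model ** 0.5, dim=-1)
            networks.append(attn)                            # directed network at time t
        return torch.stack(networks)
```
In the paper, a meta-learning outer loop trains shared parameters across many cell-type-specific tasks so that networks can be recovered even for rare cell types; that wrapper is not shown in this sketch.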
Related papers
- Comparative Analysis of Multi-Omics Integration Using Advanced Graph Neural Networks for Cancer Classification [40.45049709820343]
Multi-omics data integration poses significant challenges due to the high dimensionality, data complexity, and distinct characteristics of various omics types.
This study evaluates three graph neural network architectures for multi-omics (MO) integration: graph-convolutional networks (GCN), graph-attention networks (GAT), and graph-transformer networks (GTN).
arXiv Detail & Related papers (2024-10-05T16:17:44Z)
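For context on the building blocks compared in the study above, here is a toy single-head graph-attention (GAT-style) layer. It is a generic illustration only; the layer sizes and wiring are invented for the example and do not reproduce the paper's models or its omics-specific inputs.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyGATLayer(nn.Module):
    """Single-head graph-attention layer; adj is assumed to include self-loops."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)      # shared feature projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)       # attention scoring vector

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        """x: (N, in_dim) node features, adj: (N, N) binary adjacency."""
        h = self.W(x)                                        # (N, out_dim)
        N = h.size(0)
        hi = h.unsqueeze(1).expand(N, N, -1)                 # h_i broadcast over node pairs
        hj = h.unsqueeze(0).expand(N, N, -1)                 # h_j broadcast over node pairs
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))           # attend only to graph neighbours
        alpha = torch.softmax(e, dim=-1)                     # normalised attention coefficients
        return F.elu(alpha @ h)                              # attention-weighted aggregation
```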
- Cell reprogramming design by transfer learning of functional transcriptional networks [0.0]
We develop a transfer learning approach to control cell behavior that is pre-trained on transcriptomic data associated with human cell fates.
We show that the number of gene perturbations required to steer from one fate to another increases with decreasing developmental relatedness.
arXiv Detail & Related papers (2024-03-07T19:00:02Z)
- Single-Cell Deep Clustering Method Assisted by Exogenous Gene Information: A Novel Approach to Identifying Cell Types [50.55583697209676]
We develop an attention-enhanced graph autoencoder, which is designed to efficiently capture the topological features between cells.
During the clustering process, we integrated both sets of information and reconstructed the features of both cells and genes to generate a discriminative representation.
This research offers enhanced insights into the characteristics and distribution of cells, thereby laying the groundwork for early diagnosis and treatment of diseases.
arXiv Detail & Related papers (2023-11-28T09:14:55Z)
- DDeMON: Ontology-based function prediction by Deep Learning from Dynamic Multiplex Networks [0.7349727826230864]
The goal of this work is to explore how the fusion of systems-level information with the temporal dynamics of gene expression can be used to predict novel gene functions.
We propose DDeMON, an approach for scalable, systems-level inference of function annotation using time-dependent multiscale biological information.
arXiv Detail & Related papers (2023-02-08T06:53:02Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Physically constrained neural networks to solve the inverse problem for neuron models [0.29005223064604074]
Systems biology and systems neurophysiology are powerful tools for a number of key applications in the biomedical sciences.
Recent developments in the field of deep neural networks have demonstrated the possibility of formulating nonlinear, universal approximators.
arXiv Detail & Related papers (2022-09-24T12:51:15Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Towards an Automatic Analysis of CHO-K1 Suspension Growth in Microfluidic Single-cell Cultivation [63.94623495501023]
We propose a novel machine learning architecture that allows us to infuse a deep neural network with human-powered abstraction at the level of the data.
Specifically, we train a generative model simultaneously on natural and synthetic data, so that it learns a shared representation, from which a target variable, such as the cell count, can be reliably estimated.
arXiv Detail & Related papers (2020-10-20T08:36:51Z)
- Select-ProtoNet: Learning to Select for Few-Shot Disease Subtype Prediction [55.94378672172967]
We focus on the few-shot disease subtype prediction problem, identifying subgroups of similar patients.
We introduce meta learning techniques to develop a new model, which can extract the common experience or knowledge from interrelated clinical tasks.
Our new model is built upon a carefully designed meta-learner, called Prototypical Network, that is a simple yet effective meta learning machine for few-shot image classification.
arXiv Detail & Related papers (2020-09-02T02:50:30Z)
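Since the entry above builds on the Prototypical Network, a minimal sketch of that classification step may help: class prototypes are the mean embeddings of the support examples, and queries are scored by distance to those prototypes. The encoder, dimensions, and data below are placeholders, and Select-ProtoNet's task-selection components are not shown.
```python
import torch
import torch.nn as nn


def prototypical_logits(encoder: nn.Module,
                        support_x: torch.Tensor, support_y: torch.Tensor,
                        query_x: torch.Tensor, n_classes: int) -> torch.Tensor:
    """Return (n_query, n_classes) logits as negative squared distances
    from each query embedding to the per-class prototype."""
    z_support = encoder(support_x)                           # (n_support, d)
    z_query = encoder(query_x)                               # (n_query, d)
    prototypes = torch.stack([
        z_support[support_y == c].mean(dim=0)                # prototype = mean support embedding
        for c in range(n_classes)
    ])                                                       # (n_classes, d)
    return -torch.cdist(z_query, prototypes) ** 2            # closer prototype -> higher logit


# Toy usage: a placeholder encoder with random 2-way, 5-shot data.
encoder = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 16))
support_x, support_y = torch.randn(10, 20), torch.arange(10) % 2
query_x = torch.randn(6, 20)
logits = prototypical_logits(encoder, support_x, support_y, query_x, n_classes=2)
```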
- Neural Cellular Automata Manifold [84.08170531451006]
We show that the neural network architecture of the Neural Cellular Automata can be encapsulated in a larger NN.
This allows us to propose a new model that encodes a manifold of NCA, each of them capable of generating a distinct image.
In biological terms, our approach would play the role of the transcription factors, modulating the mapping of genes into specific proteins that drive cellular differentiation.
arXiv Detail & Related papers (2020-06-22T11:41:57Z)
- A Graph Feature Auto-Encoder for the Prediction of Unobserved Node Features on Biological Networks [3.132875765271743]
We studied the representation of biological interaction networks in E. coli and mouse using graph neural networks.
We proposed a new end-to-end graph feature auto-encoder which is trained on the feature reconstruction task.
Our graph feature auto-encoder outperformed a state-of-the-art imputation method that does not use protein interaction information.
arXiv Detail & Related papers (2020-05-08T11:23:04Z)
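A hedged sketch of the general recipe named in the entry above: propagate node features over the interaction graph and train on reconstructing them. The layer sizes, normalisation, and masking strategy are illustrative assumptions, not the paper's exact architecture.
```python
import torch
import torch.nn as nn


class GraphFeatureAutoEncoder(nn.Module):
    """Encode node features through the interaction graph, then reconstruct them."""

    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.enc = nn.Linear(n_features, hidden)
        self.dec = nn.Linear(hidden, n_features)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        """x: (N, n_features) node features, adj: (N, N) adjacency with self-loops."""
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        a_norm = adj / deg                                   # simple row-normalised propagation
        z = torch.relu(self.enc(a_norm @ x))                 # graph-smoothed encoding
        return self.dec(a_norm @ z)                          # reconstructed node features


# Training would mask a subset of feature entries and minimise reconstruction
# error (e.g. nn.MSELoss()) on the held-out entries to impute unobserved features.
```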
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information provided and is not responsible for any consequences of its use.