Bayesian Topological Learning for Classifying the Structure of
Biological Networks
- URL: http://arxiv.org/abs/2009.11974v1
- Date: Thu, 24 Sep 2020 22:43:03 GMT
- Title: Bayesian Topological Learning for Classifying the Structure of
Biological Networks
- Authors: Vasileios Maroulas, Cassie Putman Micucci, and Farzana Nasrin
- Abstract summary: Actin cytoskeleton networks generate local topological signatures due to the natural variations in the number, size, and shape of holes of the networks.
Persistent homology is a method that explores these topological properties of data and summarizes them as persistence diagrams.
We implement a Bayes factor algorithm to classify the actin filament networks and benchmark it against several state-of-the-art classification methods.
- Score: 1.644043499620662
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Actin cytoskeleton networks generate local topological signatures due to the
natural variations in the number, size, and shape of holes of the networks.
Persistent homology is a method that explores these topological properties of
data and summarizes them as persistence diagrams. In this work, we analyze and
classify these filament networks by transforming them into persistence diagrams
whose variability is quantified via a Bayesian framework on the space of
persistence diagrams. The proposed generalized Bayesian framework adopts an
independent and identically distributed cluster point process characterization
of persistence diagrams and relies on a substitution likelihood argument. This
framework provides the flexibility to estimate the posterior cardinality
distribution of points in a persistence diagram and the posterior spatial
distribution simultaneously. We present a closed form of the posteriors under
the assumption of Gaussian mixtures and binomials for prior intensity and
cardinality respectively. Using this posterior calculation, we implement a
Bayes factor algorithm to classify the actin filament networks and benchmark it
against several state-of-the-art classification methods.
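To make the classification step concrete, below is a minimal sketch (not the authors' implementation) of Bayes-factor classification over persistence diagrams, written in Python with NumPy/SciPy only. It assumes each filament network has already been reduced to a persistence diagram stored as an (n, 2) array of (birth, persistence) points (such diagrams could be computed with, e.g., GUDHI's cubical complexes), and it models only the spatial part of the posterior via a simple Gaussian-Gaussian conjugate update of a Gaussian-mixture prior intensity; the paper's closed-form posterior additionally handles the cardinality distribution (binomial prior) and is more involved. All function and variable names below are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def posterior_intensity(train_diagrams, prior_means, prior_weights,
                        prior_var=0.05, obs_var=0.05):
    """Gaussian-mixture posterior intensity from a Gaussian-mixture prior.

    Each (prior component, observed point) pair contributes one posterior
    component via the standard Gaussian-Gaussian conjugate update; this is a
    simplified stand-in for the paper's closed-form posterior.
    """
    obs = np.vstack(train_diagrams)                 # pool all training points
    means, weights, variances = [], [], []
    for mu, w in zip(prior_means, prior_weights):
        for x in obs:
            gain = prior_var / (prior_var + obs_var)
            means.append(mu + gain * (x - mu))      # shrink the prior mean toward the data
            variances.append((1.0 - gain) * prior_var)
            # Component weight: prior weight times the marginal likelihood of x.
            weights.append(w * multivariate_normal.pdf(
                x, mean=mu, cov=(prior_var + obs_var) * np.eye(2)))
    weights = np.asarray(weights)
    return np.asarray(means), weights / weights.sum(), np.asarray(variances)

def log_score(diagram, means, weights, variances):
    """Sum of log mixture densities over the points of one diagram."""
    density = np.zeros(len(diagram))
    for mu, w, v in zip(means, weights, variances):
        density += w * multivariate_normal.pdf(diagram, mean=mu, cov=v * np.eye(2))
    return np.log(np.clip(density, 1e-300, None)).sum()

def log_bayes_factor(test_diagram, posterior_a, posterior_b):
    """Positive values favour class A, negative values favour class B."""
    return log_score(test_diagram, *posterior_a) - log_score(test_diagram, *posterior_b)
```

Under these assumptions, classification amounts to fitting one posterior intensity per class from its training diagrams and assigning a test diagram to class A whenever log_bayes_factor(test_diagram, post_a, post_b) > 0.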
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
First, we introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes.
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
- A rank decomposition for the topological classification of neural representations [0.0]
In this work, we leverage the fact that neural networks are equivalent to continuous piecewise-affine maps.
We study the homology groups of the quotient of a manifold $\mathcal{M}$ and a subset $A$, assuming some minimal properties on these spaces.
We show that in randomly narrow networks, there will be regions in which the (co)homology groups of a data manifold can change.
arXiv Detail & Related papers (2024-04-30T17:01:20Z)
- Bayesian Unsupervised Disentanglement of Anatomy and Geometry for Deep Groupwise Image Registration [50.62725807357586]
This article presents a general Bayesian learning framework for multi-modal groupwise image registration.
We propose a novel hierarchical variational auto-encoding architecture to realise the inference procedure of the latent variables.
Experiments were conducted to validate the proposed framework, including four different datasets from cardiac, brain, and abdominal medical images.
arXiv Detail & Related papers (2024-01-04T08:46:39Z)
- DANI: Fast Diffusion Aware Network Inference with Preserving Topological Structure Property [2.8948274245812327]
We propose a novel method called DANI to infer the underlying network while preserving its structural properties.
DANI achieves higher accuracy and lower run time than competing network inference methods while maintaining structural properties, including modular structure, degree distribution, connected components, density, and clustering coefficients.
arXiv Detail & Related papers (2023-10-02T23:23:00Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose to approximate the joint posterior over both the structure and the parameters of a Bayesian network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Bayesian Networks for Named Entity Prediction in Programming Community Question Answering [0.0]
We propose a new approach for natural language processing using Bayesian networks to predict and analyze the context.
We compare the Bayesian networks with different score metrics, such as the BIC, BDeu, K2 and Chow-Liu trees.
In addition, we examine the visualization of directed acyclic graphs to analyze semantic relationships.
arXiv Detail & Related papers (2023-02-26T07:26:36Z)
- $k$-Means Clustering for Persistent Homology [0.0]
We prove convergence of the $k$-means clustering algorithm on persistence diagram space.
We also establish theoretical properties of the solution to the optimization problem in the Karush--Kuhn--Tucker framework.
arXiv Detail & Related papers (2022-10-18T17:18:51Z)
- Entangled Residual Mappings [59.02488598557491]
We introduce entangled residual mappings to generalize the structure of the residual connections.
An entangled residual mapping replaces the identity skip connections with specialized entangled mappings.
We show that while entangled mappings can preserve the iterative refinement of features across various deep models, they influence the representation learning process in convolutional networks.
arXiv Detail & Related papers (2022-06-02T19:36:03Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over directed acyclic graphs (DAGs) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.