Fast Topological Clustering with Wasserstein Distance
- URL: http://arxiv.org/abs/2112.00101v1
- Date: Tue, 30 Nov 2021 21:02:53 GMT
- Title: Fast Topological Clustering with Wasserstein Distance
- Authors: Tananun Songdechakraiwut, Bryan M. Krause, Matthew I. Banks, Kirill V. Nourski and Barry D. Van Veen
- Abstract summary: We propose a novel and computationally practical topological clustering method that clusters complex networks with intricate topology.
Such networks are aggregated into clusters through a centroid-based clustering strategy based on both their topological and geometric structure.
The proposed method is demonstrated to be effective using both simulated networks and measured functional brain networks.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The topological patterns exhibited by many real-world networks motivate the
development of topology-based methods for assessing the similarity of networks.
However, extracting topological structure is difficult, especially for large
and dense networks whose node degrees range over multiple orders of magnitude.
In this paper, we propose a novel and computationally practical topological
clustering method that clusters complex networks with intricate topology using
principled theory from persistent homology and optimal transport. Such networks
are aggregated into clusters through a centroid-based clustering strategy based
on both their topological and geometric structure, preserving correspondence
between nodes in different networks. The notions of topological proximity and
centroid are characterized using a novel and efficient approach to computation
of the Wasserstein distance and barycenter for persistence barcodes associated
with connected components and cycles. The proposed method is demonstrated to be
effective using both simulated networks and measured functional brain networks.
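A minimal sketch can make the closed-form computation concrete. The Python code below is an illustration under stated assumptions, not the authors' implementation: each network is a weighted graph on a common node set with the same number of edges, births of connected components are taken from the maximum spanning tree and deaths of cycles from the remaining edges, and for barcodes of equal size the 2-Wasserstein distance is assumed to reduce to squared differences of sorted values, with the barycenter given by their element-wise mean. The function names (barcode, wasserstein_sq, barycenter, topological_kmeans) are hypothetical.

```python
# Hedged sketch of centroid-based topological clustering of weighted networks.
# Assumptions (not verbatim from the paper): connected-component births come
# from the maximum spanning tree, cycle deaths from the remaining edges, and
# all networks share the same node set and edge count so barcodes align.
import numpy as np
import networkx as nx


def barcode(adj):
    """Sorted birth set (components) and death set (cycles) of one network."""
    G = nx.from_numpy_array(np.asarray(adj))
    mst_edges = {frozenset(e) for e in nx.maximum_spanning_tree(G).edges()}
    births, deaths = [], []
    for u, v, w in G.edges(data="weight"):
        (births if frozenset((u, v)) in mst_edges else deaths).append(w)
    return np.sort(births), np.sort(deaths)


def wasserstein_sq(b1, b2):
    """Closed-form squared 2-Wasserstein distance between equal-size barcodes:
    sorted births match sorted births, sorted deaths match sorted deaths."""
    return np.sum((b1[0] - b2[0]) ** 2) + np.sum((b1[1] - b2[1]) ** 2)


def barycenter(barcodes):
    """Barycenter under the same matching: element-wise mean of sorted sets."""
    return (np.mean([b for b, _ in barcodes], axis=0),
            np.mean([d for _, d in barcodes], axis=0))


def topological_kmeans(adjs, k, n_iter=50, seed=0):
    """k-means-style clustering of networks by topological proximity."""
    rng = np.random.default_rng(seed)
    bcs = [barcode(a) for a in adjs]
    centroids = [bcs[i] for i in rng.choice(len(bcs), size=k, replace=False)]
    labels = np.zeros(len(bcs), dtype=int)
    for _ in range(n_iter):
        # Assign each network to the topologically nearest centroid.
        labels = np.array([np.argmin([wasserstein_sq(b, c) for c in centroids])
                           for b in bcs])
        # Update each centroid as the barycenter of its members (keep the old
        # centroid if a cluster happens to be empty).
        centroids = [barycenter([bcs[i] for i in np.flatnonzero(labels == j)])
                     if np.any(labels == j) else centroids[j]
                     for j in range(k)]
    return labels
```

Under these assumptions both the distance and the barycenter require little more than sorting the edge weights, which is what makes a centroid-update loop over large, dense networks computationally practical.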
Related papers
- Defining Neural Network Architecture through Polytope Structures of Dataset [53.512432492636236]
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
arXiv Detail & Related papers (2024-02-04T08:57:42Z) - DANI: Fast Diffusion Aware Network Inference with Preserving Topological
Structure Property [2.8948274245812327]
We propose a novel method called DANI to infer the underlying network while preserving its structural properties.
DANI has higher accuracy and lower run time while maintaining structural properties, including modular structure, degree distribution, connected components, density, and clustering coefficients.
arXiv Detail & Related papers (2023-10-02T23:23:00Z) - Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
arXiv Detail & Related papers (2023-05-25T14:17:15Z) - A pseudo-likelihood approach to community detection in weighted networks [4.111899441919165]
We propose a pseudo-likelihood community estimation algorithm for networks with normally distributed edge weights.
We prove that the estimates obtained by the proposed method are consistent under the assumption of homogeneous networks.
We illustrate the method on simulated networks and on an fMRI dataset, where edge weights represent connectivity between brain regions.
arXiv Detail & Related papers (2023-03-10T13:36:10Z) - Topological Classification in a Wasserstein Distance Based Vector Space [0.0]
The proposed vector space is based on the Wasserstein distance between persistence barcodes.
The effectiveness of the proposed vector space is demonstrated using support vector machines to classify simulated networks and measured functional brain networks (see the sketch after this list).
arXiv Detail & Related papers (2022-02-02T20:40:57Z) - A Modular Framework for Centrality and Clustering in Complex Networks [0.6423239719448168]
In this paper, we study two such important network analysis techniques, namely centrality and clustering.
An information-flow based model is adopted for clustering, which itself builds upon an information theoretic measure for computing centrality.
Our clustering naturally inherits the flexibility to accommodate edge directionality, as well as different interpretations and interplay between edge weights and node degrees.
arXiv Detail & Related papers (2021-11-23T03:01:29Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - On the use of local structural properties for improving the efficiency
of hierarchical community detection methods [77.34726150561087]
We study how local structural network properties can be used as proxies to improve the efficiency of hierarchical community detection.
We also check the performance impact of network prunings as an ancillary tactic to make hierarchical community detection more efficient.
arXiv Detail & Related papers (2020-09-15T00:16:12Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - Detecting Communities in Heterogeneous Multi-Relational Networks: A
Message Passing based Approach [89.19237792558687]
Community structure is a common characteristic of networks, including social networks, biological networks, and computer and information networks.
We propose an efficient message-passing-based algorithm that simultaneously detects communities across all the constituent homogeneous networks.
arXiv Detail & Related papers (2020-04-06T17:36:24Z) - Network Clustering Via Kernel-ARMA Modeling and the Grassmannian The
Brain-Network Case [6.78543866474958]
This paper introduces a clustering framework for networks with nodes annotated with time-series data.
The framework addresses all types of network-clustering problems: state clustering, node clustering within states, and even subnetwork-state-sequence identification/tracking.
arXiv Detail & Related papers (2020-02-18T19:48:38Z)
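The "Topological Classification in a Wasserstein Distance Based Vector Space" entry above suggests a short companion sketch: represent each network by a vector whose Euclidean geometry mirrors the closed-form Wasserstein distance between barcodes, then classify the vectors with a support vector machine. The feature map below (concatenated sorted maximum-spanning-tree and non-tree edge weights) and the scikit-learn linear SVM are assumptions made for illustration, not that paper's exact construction.

```python
# Hedged sketch: classify networks in a vector space whose Euclidean distances
# mirror the closed-form Wasserstein distance between persistence barcodes.
import numpy as np
import networkx as nx
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def barcode_vector(adj):
    """Concatenate sorted component births (maximum-spanning-tree weights) and
    cycle deaths (remaining edge weights) into one feature vector."""
    G = nx.from_numpy_array(np.asarray(adj))
    mst = {frozenset(e) for e in nx.maximum_spanning_tree(G).edges()}
    flags = [(frozenset((u, v)) in mst, w) for u, v, w in G.edges(data="weight")]
    births = sorted(w for in_mst, w in flags if in_mst)
    deaths = sorted(w for in_mst, w in flags if not in_mst)
    return np.array(births + deaths)


def classify_networks(adjs, labels):
    """Cross-validated SVM accuracy on the barcode-vector representation."""
    X = np.vstack([barcode_vector(a) for a in adjs])
    clf = SVC(kernel="linear")  # kernel choice is an assumption of this sketch
    return cross_val_score(clf, X, np.asarray(labels), cv=5)
```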
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.