Topological Learning in Multi-Class Data Sets
- URL: http://arxiv.org/abs/2301.09734v4
- Date: Thu, 8 Feb 2024 17:16:39 GMT
- Title: Topological Learning in Multi-Class Data Sets
- Authors: Christopher Griffin and Trevor Karn and Benjamin Apple
- Abstract summary: We study the impact of topological complexity on learning in feedforward deep neural networks (DNNs).
We evaluate our topological classification algorithm on multiple constructed and open source data sets.
- Score: 0.3050152425444477
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We specialize techniques from topological data analysis to the problem of
characterizing the topological complexity (as defined in the body of the paper)
of a multi-class data set. As a by-product, a topological classifier is defined
that uses an open sub-covering of the data set. This sub-covering can be used
to construct a simplicial complex whose topological features (e.g., Betti
numbers) provide information about the classification problem. We use these
topological constructs to study the impact of topological complexity on
learning in feedforward deep neural networks (DNNs). We hypothesize that
topological complexity is negatively correlated with the ability of a fully
connected feedforward deep neural network to learn to classify data correctly.
We evaluate our topological classification algorithm on multiple constructed
and open source data sets. We also validate our hypothesis regarding the
relationship between topological complexity and learning in DNNs on multiple
data sets.
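The abstract states that Betti numbers of a simplicial complex built from the data carry information about the classification problem. As an illustrative sketch (not the paper's construction — the paper builds its complex from an open sub-covering of the data set), Betti numbers over GF(2) can be computed from the ranks of boundary matrices:

```python
from itertools import combinations

def gf2_rank(rows):
    """Rank over GF(2) of a binary matrix whose rows are int bitmasks."""
    pivots = {}  # leading-bit position -> pivot row
    for row in rows:
        while row:
            lead = row.bit_length() - 1
            if lead in pivots:
                row ^= pivots[lead]   # eliminate the leading bit
            else:
                pivots[lead] = row
                break
    return len(pivots)

def boundary_rank(k_simplices, face_index):
    """Rank of the k-th boundary map; columns indexed by (k-1)-faces."""
    rows = []
    for s in k_simplices:
        mask = 0
        for face in combinations(s, len(s) - 1):
            mask |= 1 << face_index[face]
        rows.append(mask)
    return gf2_rank(rows)

def betti_numbers(simplices, max_dim):
    """Betti numbers b_0..b_max_dim of a simplicial complex over GF(2).
    `simplices` is an iterable of sorted vertex tuples; all faces of each
    simplex up to dimension max_dim must be listed explicitly."""
    by_dim = {d: sorted({tuple(sorted(s)) for s in simplices if len(s) == d + 1})
              for d in range(max_dim + 2)}
    index = {d: {s: i for i, s in enumerate(by_dim[d])} for d in by_dim}
    betti = []
    for d in range(max_dim + 1):
        n = len(by_dim[d])  # dimension of the chain group C_d
        r_d = boundary_rank(by_dim[d], index[d - 1]) if d > 0 else 0
        r_d1 = boundary_rank(by_dim[d + 1], index[d]) if by_dim[d + 1] else 0
        betti.append(n - r_d - r_d1)  # b_d = dim C_d - rank d_d - rank d_{d+1}
    return betti
```

For the hollow triangle (a combinatorial circle) this gives b_0 = b_1 = 1; adding the 2-simplex fills the hole and kills the 1-cycle.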
Related papers
- Defining Neural Network Architecture through Polytope Structures of Dataset [53.512432492636236]
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
arXiv Detail & Related papers (2024-02-04T08:57:42Z)
- The Computational Complexity of Concise Hypersphere Classification [49.57441416941195]
This paper is the first complexity-theoretic study of the hypersphere classification problem for binary data.
We use the fine-grained parameterized complexity paradigm to analyze the impact of structural properties that may be present in the input data.
arXiv Detail & Related papers (2023-12-12T09:33:03Z)
- Image Classification using Combination of Topological Features and Neural Networks [1.0323063834827417]
We use the persistent homology method, a technique in topological data analysis (TDA), to extract essential topological features from the data space.
This was carried out with the aim of classifying images from multiple classes in the MNIST dataset.
Our approach inserts topological features into deep learning pipelines composed of single- and two-stream neural networks.
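The summary above refers to persistent homology; in practice it is computed with libraries such as GUDHI or Ripser, but the 0-dimensional part (connected-component persistence of a Vietoris-Rips filtration) is simple enough to sketch in pure Python with a union-find structure. This is a sketch for intuition, not the feature extractor used in the paper:

```python
from itertools import combinations
from math import dist

def persistence_h0(points):
    """0-dimensional persistence pairs of the Vietoris-Rips filtration:
    every point is born at scale 0; a component dies at the edge length
    that merges it into an older one."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted((dist(points[i], points[j]), i, j)
                   for i, j in combinations(range(len(points)), 2))
    bars = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri
            bars.append((0.0, d))  # one component dies at distance d
    bars.append((0.0, float('inf')))  # the surviving component
    return bars
```

Two tight clusters produce two short bars plus one long one, so long finite bars signal well-separated components in the data.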
arXiv Detail & Related papers (2023-11-10T20:05:40Z)
- On Characterizing the Evolution of Embedding Space of Neural Networks using Algebraic Topology [9.537910170141467]
We study how the topology of feature embedding space changes as it passes through the layers of a well-trained deep neural network (DNN) through Betti numbers.
We demonstrate that as depth increases, a topologically complicated dataset is transformed into a simple one, resulting in Betti numbers attaining their lowest possible value.
arXiv Detail & Related papers (2023-11-08T10:45:12Z)
- Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set.
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
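To make the indicator-approximation step concrete: in one dimension, a small hand-built ReLU layer already approximates the indicator of an interval [a, b]. This 1-D sketch is only for intuition; the paper's three-layer construction is what handles general compact sets in higher dimensions:

```python
def relu(x):
    return max(0.0, x)

def indicator_approx(x, a, b, k=1000.0):
    """One hidden ReLU layer (4 units) plus a linear readout that
    approximates the indicator of [a, b]: a ramp of width 1/k rises
    at a and an identical ramp is subtracted at b."""
    return (relu(k * (x - a)) - relu(k * (x - a) - 1.0)
            - relu(k * (x - b)) + relu(k * (x - b) - 1.0))
```

The slope parameter k controls the width (1/k) of the two transition ramps, so the approximation error is confined to arbitrarily small neighborhoods of a and b.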
arXiv Detail & Related papers (2023-05-25T14:17:15Z)
- Unsupervised hierarchical clustering using the learning dynamics of RBMs [0.0]
We present a new and general method for building relational data trees by exploiting the learning dynamics of the Restricted Boltzmann Machine (RBM).
Our method is based on the mean-field approach, derived from the Plefka expansion, and developed in the context of disordered systems.
We tested our method on an artificial hierarchical dataset and on three real-world datasets (images of digits, mutations in the human genome, and a family of proteins).
arXiv Detail & Related papers (2023-02-03T16:53:32Z)
- Rethinking Persistent Homology for Visual Recognition [27.625893409863295]
This paper performs a detailed analysis of the effectiveness of topological properties for image classification in various training scenarios.
We identify the scenarios that benefit the most from topological features, e.g., training simple networks on small datasets.
arXiv Detail & Related papers (2022-07-09T08:01:11Z)
- Topological Deep Learning: Going Beyond Graph Data [26.325857542512047]
We present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of topological domain.
We then develop a class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs.
arXiv Detail & Related papers (2022-06-01T16:21:28Z)
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Dive into Layers: Neural Network Capacity Bounding using Algebraic Geometry [55.57953219617467]
We show that the learnability of a neural network is directly related to its size.
We use Betti numbers to measure the topological geometric complexity of input data and the neural network.
We perform experiments on the real-world MNIST dataset, and the results verify our analysis and conclusions.
arXiv Detail & Related papers (2021-09-03T11:45:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.