Image Classification using Combination of Topological Features and
Neural Networks
- URL: http://arxiv.org/abs/2311.06375v1
- Date: Fri, 10 Nov 2023 20:05:40 GMT
- Title: Image Classification using Combination of Topological Features and
Neural Networks
- Authors: Mariana Dória Prata Lima, Gilson Antonio Giraldi, Gastão Florêncio Miranda Junior
- Abstract summary: We use the persistent homology method, a technique in topological data analysis (TDA), to extract essential topological features from the data space.
This was carried out with the aim of classifying images from multiple classes in the MNIST dataset.
Our approach inserts topological features into deep learning architectures composed of single- and two-stream neural networks.
- Score: 1.0323063834827417
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work we use the persistent homology method, a technique in
topological data analysis (TDA), to extract essential topological features from
the data space and combine them with deep learning features for classification
tasks. In TDA, the concepts of complexes and filtration are building blocks.
Firstly, a filtration is constructed from some complex. Then, persistent
homology classes are computed, and their evolution along the filtration is
visualized through the persistence diagram. Additionally, we applied
vectorization techniques to the persistence diagram to make this topological
information compatible with machine learning algorithms. This was carried out
with the aim of classifying images from multiple classes in the MNIST dataset.
Our approach inserts topological features into deep learning architectures
composed of single- and two-stream neural networks based on a multi-layer
perceptron (MLP) and a convolutional neural network (CNN), tailored for
multi-class classification on the MNIST dataset. In our analysis, we
evaluated the obtained results and compared them with the outcomes achieved
through the baselines that are available in the TensorFlow library. The main
conclusion is that topological information may increase neural network accuracy
in multi-class classification tasks, at the price of the computational complexity
of the persistent homology calculation. To the best of our knowledge, this is the
first work that combines deep learning features with topological features for
multi-class classification tasks.
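The sketch below makes the described pipeline concrete; it is not the authors' implementation. It computes a sublevel-set cubical filtration of an MNIST digit with GUDHI (an assumed TDA library), vectorizes the resulting persistence diagram with a simple top-k-lifetime scheme (one of several possible vectorizations; the abstract does not specify which one is used), and fuses the result with image features in a hypothetical two-stream Keras model. Layer sizes, the 16-dimensional feature vector, and all names are illustrative choices.

```python
# Minimal sketch (assumptions noted inline): persistent homology features for an
# MNIST digit, a simple vectorization, and a two-stream Keras model.
import numpy as np
import gudhi                      # assumed TDA library; any cubical-complex tool works
import tensorflow as tf

def persistence_diagrams(img: np.ndarray):
    """Sublevel-set filtration of pixel intensities via a cubical complex."""
    cc = gudhi.CubicalComplex(dimensions=list(img.shape),
                              top_dimensional_cells=img.flatten())
    cc.persistence()              # compute all persistence pairs
    # Birth/death pairs for connected components (H0) and holes (H1)
    return (cc.persistence_intervals_in_dimension(0),
            cc.persistence_intervals_in_dimension(1))

def vectorize(diagram: np.ndarray, k: int = 8) -> np.ndarray:
    """One possible vectorization: the k largest finite lifetimes, zero-padded."""
    if len(diagram) == 0:
        return np.zeros(k, dtype=np.float32)
    life = diagram[:, 1] - diagram[:, 0]
    life = np.sort(life[np.isfinite(life)])[::-1][:k]
    return np.pad(life, (0, k - len(life))).astype(np.float32)

def topological_features(img: np.ndarray) -> np.ndarray:
    h0, h1 = persistence_diagrams(img.astype(np.float64))
    return np.concatenate([vectorize(h0), vectorize(h1)])   # 16-dim vector

def two_stream_model(topo_dim: int = 16) -> tf.keras.Model:
    """Hypothetical two-stream architecture: CNN on pixels + MLP on TDA features."""
    img_in = tf.keras.Input(shape=(28, 28, 1), name="image")
    x = tf.keras.layers.Conv2D(32, 3, activation="relu")(img_in)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Flatten()(x)

    topo_in = tf.keras.Input(shape=(topo_dim,), name="topology")
    t = tf.keras.layers.Dense(32, activation="relu")(topo_in)

    merged = tf.keras.layers.Concatenate()([x, t])
    out = tf.keras.layers.Dense(10, activation="softmax")(merged)
    model = tf.keras.Model(inputs=[img_in, topo_in], outputs=out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage: precompute a topological vector per image, then
#   model = two_stream_model()
#   model.fit([images[..., None] / 255.0, topo_vectors], labels, epochs=5)
```

Dropping the image stream (or the topological stream) from the model above yields the corresponding single-stream baseline for comparison.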
Related papers
- Informed deep hierarchical classification: a non-standard analysis inspired approach [0.0]
It consists of a multi-output deep neural network equipped with specific projection operators placed before each output layer.
The design of such an architecture, called lexicographic hybrid deep neural network (LH-DNN), has been possible by combining tools from different and quite distant research fields.
To assess the efficacy of the approach, the resulting network is compared against the B-CNN, a convolutional neural network tailored for hierarchical classification tasks.
arXiv Detail & Related papers (2024-09-25T14:12:50Z) - Defining Neural Network Architecture through Polytope Structures of Dataset [53.512432492636236]
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
arXiv Detail & Related papers (2024-02-04T08:57:42Z) - On Characterizing the Evolution of Embedding Space of Neural Networks
using Algebraic Topology [9.537910170141467]
We use Betti numbers to study how the topology of the feature embedding space changes as it passes through the layers of a well-trained deep neural network (DNN).
We demonstrate that as depth increases, a topologically complicated dataset is transformed into a simple one, resulting in Betti numbers attaining their lowest possible value.
arXiv Detail & Related papers (2023-11-08T10:45:12Z) - Topological Learning in Multi-Class Data Sets [0.3050152425444477]
We study the impact of topological complexity on learning in feedforward deep neural networks (DNNs).
We evaluate our topological classification algorithm on multiple constructed and open source data sets.
arXiv Detail & Related papers (2023-01-23T21:54:25Z) - Experimental Observations of the Topology of Convolutional Neural
Network Activations [2.4235626091331737]
Topological data analysis provides compact, noise-robust representations of complex structures.
Deep neural networks (DNNs) learn millions of parameters associated with a series of transformations defined by the model architecture.
In this paper, we apply cutting-edge techniques from TDA with the goal of gaining insight into the interpretability of convolutional neural networks used for image classification.
arXiv Detail & Related papers (2022-12-01T02:05:44Z) - Rethinking Persistent Homology for Visual Recognition [27.625893409863295]
This paper performs a detailed analysis of the effectiveness of topological properties for image classification in various training scenarios.
We identify the scenarios that benefit the most from topological features, e.g., training simple networks on small datasets.
arXiv Detail & Related papers (2022-07-09T08:01:11Z) - Do We Really Need a Learnable Classifier at the End of Deep Neural
Network? [118.18554882199676]
We study the potential of training a neural network for classification with the classifier randomly initialized as an equiangular tight frame (ETF) and fixed during training.
Our experimental results show that our method achieves similar performance on image classification for balanced datasets.
arXiv Detail & Related papers (2022-03-17T04:34:28Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Connecting Weighted Automata, Tensor Networks and Recurrent Neural
Networks through Spectral Learning [58.14930566993063]
We present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks.
We introduce the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors.
arXiv Detail & Related papers (2020-10-19T15:28:00Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - Neural networks adapting to datasets: learning network size and topology [77.34726150561087]
We introduce a flexible setup that allows a neural network to learn both its size and topology during the course of gradient-based training.
The resulting network has the structure of a graph tailored to the particular learning task and dataset.
arXiv Detail & Related papers (2020-06-22T12:46:44Z)