Topological Deep Learning: Classification Neural Networks
- URL: http://arxiv.org/abs/2102.08354v1
- Date: Tue, 16 Feb 2021 18:41:09 GMT
- Title: Topological Deep Learning: Classification Neural Networks
- Authors: Mustafa Hajij, Kyle Istvan
- Abstract summary: Topological deep learning is a formalism aimed at introducing topological language into deep learning.
We show when the classification problem is or is not possible in the context of neural networks.
- Score: 0.913755431537592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Topological deep learning is a formalism aimed at introducing topological language into deep learning, so that minimal mathematical structures can be used to formalize problems that arise in generic deep learning settings. This is the first of a sequence of articles introducing and studying this formalism. In this article, we define and study the classification problem of machine learning in a topological setting. Using this topological framework, we show when the classification problem is or is not possible in the context of neural networks. Finally, we demonstrate how our topological setting immediately illuminates aspects of this problem that are not as readily apparent with traditional tools.
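To make the flavor of such possibility results concrete, here is a minimal sketch (an illustration of ours, not an example from the paper; the two-concentric-circles dataset and the use of scikit-learn are our choices) of a classification problem that is topologically obstructed for linear maps but becomes possible once a coordinate change alters the topology of the class images:

```python
# A minimal sketch (not from the paper): two concentric circles are a
# classic example where linear classification is topologically obstructed,
# but a change of coordinates that untangles the classes makes it trivial.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def circle(radius, n=500, noise=0.05):
    angles = rng.uniform(0.0, 2.0 * np.pi, n)
    points = radius * np.c_[np.cos(angles), np.sin(angles)]
    return points + rng.normal(scale=noise, size=points.shape)

X = np.vstack([circle(1.0), circle(3.0)])   # inner class 0, outer class 1
y = np.r_[np.zeros(500), np.ones(500)]

# Raw coordinates: the inner circle is enclosed by the outer one, so no
# hyperplane separates the classes; accuracy hovers near chance.
linear = LogisticRegression().fit(X, y)
print("linear accuracy:", linear.score(X, y))

# Appending the radius as a feature maps the two classes onto disjoint
# "bands", and a hyperplane now suffices.
radius = np.linalg.norm(X, axis=1, keepdims=True)
lifted = LogisticRegression().fit(np.hstack([X, radius]), y)
print("lifted accuracy:", lifted.score(np.hstack([X, radius]), y))
```

The lifted map sends the two classes to disjoint regions that a hyperplane can separate; this possible-versus-impossible distinction is exactly the kind of statement the topological formalism is designed to make precise.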
Related papers
- Structure of Artificial Neural Networks -- Empirical Investigations [0.0]
Within a decade, deep learning overtook the dominant solution methods for countless problems in artificial intelligence.
With a formal definition of neural network structures, neural architecture search problems and solution methods can be formulated under a common framework.
Does structure make a difference or can it be chosen arbitrarily?
arXiv Detail & Related papers (2024-10-12T16:13:28Z)
- Ontology Embedding: A Survey of Methods, Applications and Resources [54.3453925775069]
Ontologies are widely used for representing domain knowledge and metadata.
One straightforward solution is to integrate statistical analysis and machine learning.
Numerous papers have been published on ontology embedding, but a lack of systematic reviews hinders researchers from gaining a comprehensive understanding of this field.
arXiv Detail & Related papers (2024-06-16T14:49:19Z)
- Learning Topological Representations for Deep Image Understanding [8.698159165261542]
We propose novel representations of topological structures in a deep learning framework.
We leverage the mathematical tools from topological data analysis to develop principled methods for better segmentation and uncertainty estimation.
arXiv Detail & Related papers (2024-03-22T17:23:37Z)
- Topological Expressivity of ReLU Neural Networks [0.0]
We study the expressivity of ReLU neural networks in the setting of a binary classification problem from a topological perspective.
Results show that deep ReLU neural networks are exponentially more powerful than shallow ones in terms of topological simplification.
arXiv Detail & Related papers (2023-10-17T10:28:00Z)
- Topologically Regularized Data Embeddings [15.001598256750619]
We introduce a generic approach based on algebraic topology to incorporate topological prior knowledge into low-dimensional embeddings.
We show that jointly optimizing an embedding loss with such a topological loss function as a regularizer yields embeddings that reflect not only local proximities but also the desired topological structure.
We empirically evaluate the proposed approach on computational efficiency, robustness, and versatility in combination with linear and non-linear dimensionality reduction and graph embedding methods. (A minimal sketch of such joint optimization appears after this list.)
arXiv Detail & Related papers (2023-01-09T13:49:47Z)
- Rank Diminishing in Deep Neural Networks [71.03777954670323]
The rank of a neural network measures the information flowing across its layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains poorly understood. (A rank-probing sketch appears after this list.)
arXiv Detail & Related papers (2022-06-13T12:03:32Z)
- A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z)
- Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges [50.22269760171131]
The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods.
This text is concerned with exposing pre-defined regularities through unified geometric principles.
It provides a common mathematical framework to study the most successful neural network architectures, such as CNNs, RNNs, GNNs, and Transformers.
arXiv Detail & Related papers (2021-04-27T21:09:51Z)
- Developing Constrained Neural Units Over Time [81.19349325749037]
This paper focuses on an alternative way of defining neural networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by means of a special class of constraints that also extend to the interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)
- A Topological Framework for Deep Learning [0.7310043452300736]
We show that the classification problem in machine learning is always solvable under very mild conditions.
In particular, we show that a softmax classification network acts on an input topological space by a finite sequence of topological moves to achieve the classification task.
arXiv Detail & Related papers (2020-08-31T15:56:42Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges, reflecting the magnitudes of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks. (A minimal sketch of such edge gating appears after this list.)
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
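As referenced above, here is a minimal sketch of the connectivity-learning idea from "Learning Connectivity of Neural Networks from a Topological Perspective": a densely connected stack with one learnable, sigmoid-gated scalar per edge. The PyTorch setting, the module name, and the gating choice are our assumptions for illustration, not the paper's implementation.

```python
# A minimal sketch (assumptions: PyTorch, sigmoid gating; not the paper's
# actual implementation) of learning connectivity by attaching a learnable
# scalar to every edge of a densely connected stack of layers.
import torch
import torch.nn as nn

class DenselyGatedStack(nn.Module):
    def __init__(self, dim: int, depth: int):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(depth))
        # One learnable logit per edge (i -> j) with i < j, as in a complete DAG.
        self.edge_logits = nn.Parameter(torch.zeros(depth + 1, depth + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outputs = [x]  # node 0 is the input
        for j, layer in enumerate(self.layers, start=1):
            # Aggregate all earlier nodes, each scaled by a gate in (0, 1);
            # training the logits makes the wiring itself differentiable.
            gates = torch.sigmoid(self.edge_logits[:j, j])
            aggregated = sum(g * h for g, h in zip(gates, outputs))
            outputs.append(torch.relu(layer(aggregated)))
        return outputs[-1]

model = DenselyGatedStack(dim=16, depth=4)
print(model(torch.randn(8, 16)).shape)  # torch.Size([8, 16])
```

Edges whose gates are driven toward zero during training can be pruned afterwards, recovering a learned sparse wiring that remains compatible with an ordinary backbone.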
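The rank-diminishing entry can likewise be made operational. The sketch below (our illustration, not the paper's analysis; the random ReLU network and the tolerance are our choices) probes the numerical rank of feature matrices as they pass through successive layers:

```python
# A minimal sketch (illustrative, not the paper's analysis) probing how the
# numerical rank of feature matrices evolves through a random ReLU network.
import numpy as np

rng = np.random.default_rng(0)
features = rng.normal(size=(256, 64))   # batch of inputs, full rank 64

for depth in range(1, 9):
    weights = rng.normal(size=(64, 64)) * np.sqrt(2.0 / 64)  # He-style scaling
    features = np.maximum(features @ weights, 0.0)           # ReLU layer
    # Numerical rank: singular values above a tolerance relative to the largest.
    singular_values = np.linalg.svd(features, compute_uv=False)
    rank = int((singular_values > 1e-3 * singular_values[0]).sum())
    print(f"layer {depth}: numerical rank = {rank}")
```

Scale-invariant rank measures of this kind are the quantities whose depthwise decay that paper investigates.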
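Finally, the joint optimization described in "Topologically Regularized Data Embeddings" follows a simple pattern: an embedding loss plus a topological prior as regularizer. The sketch below assumes a toy circular prior (points pulled toward a circle of learnable radius) standing in for the persistent-homology-based losses the paper actually develops:

```python
# A minimal sketch (assumption: a simple circular prior stands in for the
# persistent-homology losses used in the actual paper) of jointly optimizing
# an embedding loss with a topological regularizer.
import torch

torch.manual_seed(0)
X = torch.randn(200, 10)                      # high-dimensional input
Z = torch.randn(200, 2, requires_grad=True)   # free 2-D embedding
log_r = torch.zeros(1, requires_grad=True)    # learnable circle radius

optimizer = torch.optim.Adam([Z, log_r], lr=1e-2)
D_X = torch.cdist(X, X)                       # target pairwise distances

for step in range(500):
    optimizer.zero_grad()
    # Embedding term: preserve pairwise distances (MDS-style stress).
    stress = ((torch.cdist(Z, Z) - D_X) ** 2).mean()
    # Topological term: pull points toward a circle of learnable radius,
    # expressing the prior that the data lie on a 1-D cycle.
    ring = ((Z.norm(dim=1) - log_r.exp()) ** 2).mean()
    loss = stress + 0.5 * ring
    loss.backward()
    optimizer.step()
```

Jointly minimizing the two terms yields an embedding that reflects both local proximities and the imposed topological structure, matching the claim in that abstract.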