Localized Persistent Homologies for more Effective Deep Learning
- URL: http://arxiv.org/abs/2110.06295v1
- Date: Tue, 12 Oct 2021 19:28:39 GMT
- Title: Localized Persistent Homologies for more Effective Deep Learning
- Authors: Doruk Oner, Adélie Garin, Mateusz Koziński, Kathryn Hess, Pascal Fua
- Abstract summary: We introduce an approach that relies on a new filtration function to account for location during network training.
We demonstrate experimentally on 2D images of roads and 3D image stacks of neuronal processes that networks trained in this manner are better at recovering the topology of the curvilinear structures they extract.
- Score: 60.78456721890412
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Persistent Homologies have been successfully used to increase the performance
of deep networks trained to detect curvilinear structures and to improve the
topological quality of the results. However, existing methods are very global
and ignore the location of topological features. In this paper, we introduce an
approach that relies on a new filtration function to account for location
during network training. We demonstrate experimentally on 2D images of roads
and 3D image stacks of neuronal processes that networks trained in this manner
are better at recovering the topology of the curvilinear structures they
extract.
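The key idea in the abstract, a topological comparison computed over local windows rather than globally over the whole image, can be illustrated with a short sketch. The code below is not the authors' filtration function or training loss; it is a minimal illustration assuming GUDHI cubical complexes, a sublevel-set filtration on the negated likelihood map, non-overlapping 64-pixel windows, and a bottleneck-distance penalty.

```python
# Minimal illustrative sketch (not the paper's method): compare persistence diagrams
# of a predicted road-likelihood map and its ground truth inside local windows, so that
# the topological comparison accounts for where features occur. The window size, the
# sublevel-set filtration on the negated likelihood, and the bottleneck penalty are
# assumptions made for this example.
import numpy as np
import gudhi


def window_diagram(values, dim=0):
    """Persistence diagram of a 2D window under sublevel-set filtration."""
    cc = gudhi.CubicalComplex(top_dimensional_cells=values)
    cc.persistence()
    diag = cc.persistence_intervals_in_dimension(dim)
    # Drop intervals with infinite death so the bottleneck distance stays finite.
    return diag[np.isfinite(diag).all(axis=1)] if len(diag) else diag


def localized_topology_penalty(pred, gt, window=64, dim=0):
    """Sum of per-window bottleneck distances between prediction and ground truth."""
    penalty = 0.0
    h, w = pred.shape
    for i in range(0, h - window + 1, window):
        for j in range(0, w - window + 1, window):
            d_pred = window_diagram(1.0 - pred[i:i + window, j:j + window], dim)
            d_gt = window_diagram(1.0 - gt[i:i + window, j:j + window], dim)
            penalty += gudhi.bottleneck_distance(d_pred, d_gt)
    return penalty


# Usage with random stand-ins for a network's output and a binary annotation.
rng = np.random.default_rng(0)
pred = rng.random((128, 128))
gt = (rng.random((128, 128)) > 0.9).astype(float)
print(localized_topology_penalty(pred, gt))
```

In an actual training loop the persistence pairs would have to be matched back to pixel locations so that gradients can flow into the network; this sketch only scores the topological discrepancy between prediction and annotation.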
Related papers
- Active Learning of Deep Neural Networks via Gradient-Free Cutting Planes [40.68266398473983]
In this work, we investigate an active learning scheme via a novel cutting-plane method for ReLU networks of arbitrary depth.
We demonstrate that these algorithms can be extended to deep neural networks despite their nonlinearity.
We illustrate the effectiveness of our proposed active learning method against popular deep active learning baselines through data experiments and classification tasks on real datasets.
arXiv Detail & Related papers (2024-10-03T02:11:35Z) - Provable Guarantees for Nonlinear Feature Learning in Three-Layer Neural
Networks [49.808194368781095]
We show that three-layer neural networks have provably richer feature learning capabilities than two-layer networks.
This work makes progress towards understanding the provable benefit of three-layer neural networks over two-layer networks in the feature learning regime.
arXiv Detail & Related papers (2023-05-11T17:19:30Z) - When Deep Learning Meets Polyhedral Theory: A Survey [6.899761345257773]
In the past decade, deep learning became the prevalent methodology for predictive modeling, thanks to the remarkable accuracy of deep neural networks.
Meanwhile, the structure of neural networks converged back to simpler representations based on piecewise linear functions.
arXiv Detail & Related papers (2023-04-29T11:46:53Z) - GraphCSPN: Geometry-Aware Depth Completion via Dynamic GCNs [49.55919802779889]
We propose a Graph Convolution based Spatial Propagation Network (GraphCSPN) as a general approach for depth completion.
In this work, we leverage convolutional neural networks as well as graph neural networks in a complementary way for geometric representation learning.
Our method achieves state-of-the-art performance, especially when only a few propagation steps are used.
arXiv Detail & Related papers (2022-10-19T17:56:03Z) - Optimization-Based Separations for Neural Networks [57.875347246373956]
We show that gradient descent can efficiently learn ball indicator functions using a depth 2 neural network with two layers of sigmoidal activations.
This is the first optimization-based separation result where the approximation benefits of the stronger architecture provably manifest in practice.
arXiv Detail & Related papers (2021-12-04T18:07:47Z) - Dynamic Analysis of Nonlinear Civil Engineering Structures using
Artificial Neural Network with Adaptive Training [2.1202971527014287]
In this study, artificial neural networks are developed with adaptive training algorithms.
The networks can successfully predict the time-history response of the shear frame and the rock structure to real ground motion records.
arXiv Detail & Related papers (2021-11-21T21:14:48Z) - Adversarial Domain Feature Adaptation for Bronchoscopic Depth Estimation [111.89519571205778]
In this work, we propose an alternative domain-adaptive approach to depth estimation.
Our novel two-step structure first trains a depth estimation network with labeled synthetic images in a supervised manner.
The results of our experiments show that the proposed method improves the network's performance on real images by a considerable margin.
arXiv Detail & Related papers (2021-09-24T08:11:34Z) - Advances in the training, pruning and enforcement of shape constraints
of Morphological Neural Networks using Tropical Algebra [40.327435646554115]
We study neural networks based on the morphological operators of dilation and erosion.
Our contributions include the training of morphological networks via Difference-of-Convex programming methods and the extension of binary morphological networks to multiclass tasks.
arXiv Detail & Related papers (2020-11-15T22:44:25Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and offers adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
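The mechanism described in the last entry, attaching a learnable weight to every edge of a complete graph so that connectivity becomes a differentiable quantity, can be sketched in a few lines. The module below is a hypothetical PyTorch illustration rather than the paper's implementation; the sigmoid gating, the shared 3x3 convolutions, and the gated-sum aggregation are assumptions chosen to keep the example small.

```python
# Hypothetical sketch of differentiable connectivity learning: every ordered pair of
# nodes (i -> j, i <= j) in a small DAG gets a learnable edge weight, and node j
# aggregates the gated outputs of all earlier nodes. Illustration only, not the paper's code.
import torch
import torch.nn as nn


class LearnableConnectivityBlock(nn.Module):
    def __init__(self, num_nodes, channels):
        super().__init__()
        self.num_nodes = num_nodes
        # One operation per node; identical 3x3 convolutions keep the example small.
        self.ops = nn.ModuleList(
            [nn.Conv2d(channels, channels, kernel_size=3, padding=1)
             for _ in range(num_nodes)]
        )
        # Learnable weight for every forward edge of the complete DAG.
        self.edge_logits = nn.Parameter(torch.zeros(num_nodes, num_nodes))

    def forward(self, x):
        outputs = [x]
        for j in range(self.num_nodes):
            # Gate each earlier node's output by its (sigmoid-squashed) edge weight.
            gates = torch.sigmoid(self.edge_logits[: j + 1, j])
            agg = sum(g * out for g, out in zip(gates, outputs))
            outputs.append(self.ops[j](agg))
        return outputs[-1]


# Usage: the edge weights receive gradients like any other parameter.
block = LearnableConnectivityBlock(num_nodes=4, channels=8)
y = block(torch.randn(1, 8, 32, 32))
y.mean().backward()
print(block.edge_logits.grad.shape)  # torch.Size([4, 4])
```

The edge parameters receive gradients like any other weight, so the network can learn to strengthen or suppress individual connections during ordinary training.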
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.