A singular Riemannian geometry approach to Deep Neural Networks II.
Reconstruction of 1-D equivalence classes
- URL: http://arxiv.org/abs/2112.10583v1
- Date: Fri, 17 Dec 2021 11:47:45 GMT
- Title: A singular Riemannian geometry approach to Deep Neural Networks II.
Reconstruction of 1-D equivalence classes
- Authors: Alessandro Benfenati and Alessio Marta
- Abstract summary: We build the preimage of a point in the output manifold in the input space.
We focus for simplicity on the case of neural network maps from n-dimensional real spaces to (n - 1)-dimensional real spaces.
- Score: 78.120734120667
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In a previous work, we proposed a geometric framework to study a deep
neural network, seen as a sequence of maps between manifolds, employing singular
Riemannian geometry. In this paper, we present an application of this framework,
proposing a way to build the equivalence class of an input point: such a class is
defined as the set of points on the input manifold mapped to the same output by
the neural network. In other words, we build the preimage of a point in the
output manifold in the input space. In particular, we focus for simplicity on the
case of neural network maps from n-dimensional real spaces to (n - 1)-dimensional
real spaces, and we propose an algorithm that builds the set of points lying in
the same equivalence class. This approach leads to two main applications: the
generation of new synthetic data, and it may provide some insight into how a
classifier can be confused by small perturbations of the input data (e.g. a
penguin image classified as an image containing a chihuahua). In addition, for
neural networks from 2D to 1D real spaces, we also discuss how to find the
preimages of closed intervals of the real line. We also present some numerical
experiments with several neural networks trained to perform non-linear
regression tasks, including the case of a binary classifier.
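To make the construction concrete, here is a minimal numerical sketch of the 2D-to-1D case: for a map f: R^2 -> R, the Jacobian at a point is a 1x2 row vector whose kernel spans the tangent direction of the equivalence class, so Euler steps along that kernel trace the preimage of f(x0). The toy tanh network, the finite-difference gradient, and the step sizes below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' algorithm): tracing the 1-D equivalence
# class of a map f: R^2 -> R by following the kernel of its Jacobian.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def f(x):
    """Toy smooth network R^2 -> R (tanh MLP); a stand-in for a trained model."""
    return (W2 @ np.tanh(W1 @ x + b1) + b2)[0]

def grad_f(x, eps=1e-6):
    """Central finite-difference gradient, i.e. the 1x2 Jacobian of f."""
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2); e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def trace_equivalence_class(x0, step=1e-2, n_steps=2000):
    """Trace the level set {x : f(x) = f(x0)} through x0 by stepping along
    the kernel of the Jacobian (in 2D: the gradient rotated by 90 degrees).
    Near singular points the kernel dimension jumps and tracing stops."""
    pts, x = [x0.copy()], x0.copy()
    for _ in range(n_steps):
        g = grad_f(x)
        norm = np.linalg.norm(g)
        if norm < 1e-10:          # singular point: the 1-D picture breaks down
            break
        x = x + step * np.array([-g[1], g[0]]) / norm
        pts.append(x.copy())
    return np.array(pts)

curve = trace_equivalence_class(np.array([0.3, -0.5]))
vals = np.array([f(p) for p in curve])
print("max drift of f along the class:", np.abs(vals - vals[0]).max())
```

For the preimage of a closed interval [a, b] in the 2D-to-1D setting mentioned above, one could, under the same assumptions, trace the level curves of the two endpoint values and take the region bounded between them.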
Related papers
- Geometric Inductive Biases of Deep Networks: The Role of Data and Architecture [22.225213114532533]
We argue that when training a neural network, the input space curvature remains invariant under transformations determined by its architecture.
We show that in cases where the average geometry is low-rank (such as in a ResNet), the geometry only changes in a subset of the input space.
arXiv Detail & Related papers (2024-10-15T19:46:09Z)
- Neural networks learn to magnify areas near decision boundaries [32.84188052937496]
We study how training shapes the geometry induced by unconstrained neural network feature maps.
We first show that at infinite width, neural networks with random parameters induce highly symmetric metrics on input space.
This symmetry is broken by feature learning: networks trained to perform classification tasks learn to magnify local areas along decision boundaries.
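As a toy illustration of this magnification effect, the sketch below computes the volume element sqrt(det(J^T J)) of the metric that a feature map pulls back to input space; larger values mean the map locally expands areas. The random feature map phi is a hypothetical stand-in, not a trained network from the paper.

```python
# Hedged sketch of local "magnification": the volume element of the
# pullback metric G = J^T J induced on input space by a feature map phi.
import numpy as np

rng = np.random.default_rng(1)
A, b = rng.normal(size=(16, 2)), rng.normal(size=16)

def phi(x):
    """Toy feature map R^2 -> R^16 (placeholder, not a trained network)."""
    return np.tanh(A @ x + b)

def jacobian(x, eps=1e-6):
    """Central finite-difference Jacobian of phi, shape (16, 2)."""
    J = np.zeros((16, 2))
    for i in range(2):
        e = np.zeros(2); e[i] = eps
        J[:, i] = (phi(x + e) - phi(x - e)) / (2 * eps)
    return J

def magnification(x):
    """sqrt(det G) with G = J^T J: how phi locally scales input areas."""
    J = jacobian(x)
    G = J.T @ J              # induced (pullback) metric, 2x2
    return np.sqrt(np.linalg.det(G))

print(magnification(np.zeros(2)))
```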
arXiv Detail & Related papers (2023-01-26T19:43:16Z)
- An example of use of Variational Methods in Quantum Machine Learning [0.0]
This paper introduces a quantum neural network for the binary classification of points of a specific geometric pattern on a plane.
The intention was to produce a quantum deep neural network with the minimum number of trainable parameters capable of correctly recognising and classifying points.
arXiv Detail & Related papers (2022-08-07T03:52:42Z)
- Sheaf Neural Networks with Connection Laplacians [3.3414557160889076]
Sheaf Neural Network (SNN) is a type of Graph Neural Network (GNN) that operates on a sheaf, an object that equips a graph with vector spaces over its nodes and edges and linear maps between these spaces.
Previous works proposed two diametrically opposed approaches: manually constructing the sheaf based on domain knowledge and learning the sheaf end-to-end using gradient-based methods.
In this work, we propose a novel way of computing sheaves, drawing inspiration from Riemannian geometry.
We show that this approach achieves promising results with less computational overhead when compared to previous SNN models.
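For readers unfamiliar with the sheaf structure described in this entry, the following minimal sketch assembles a sheaf Laplacian from per-incidence restriction maps on a toy triangle graph. The random restriction maps are placeholders, not the connection-Laplacian construction the paper proposes.

```python
# Minimal sketch of a cellular sheaf on a graph: d-dimensional stalks on
# nodes and edges, one linear restriction map per (node, edge) incidence,
# and the sheaf Laplacian L = delta^T delta built from the coboundary.
import numpy as np

rng = np.random.default_rng(2)
d = 3                                    # stalk dimension
edges = [(0, 1), (1, 2), (0, 2)]         # a triangle graph
n = 3                                    # number of nodes

# One restriction map per (node, edge) incidence (random placeholders).
F = {(v, e): rng.normal(size=(d, d))
     for e in range(len(edges)) for v in edges[e]}

# Coboundary delta: (delta x)_e = F_{u->e} x_u - F_{v->e} x_v for e = (u, v).
delta = np.zeros((len(edges) * d, n * d))
for e, (u, v) in enumerate(edges):
    delta[e*d:(e+1)*d, u*d:(u+1)*d] = F[(u, e)]
    delta[e*d:(e+1)*d, v*d:(v+1)*d] = -F[(v, e)]

L = delta.T @ delta                      # sheaf Laplacian, PSD by construction
print(np.allclose(L, L.T), np.linalg.eigvalsh(L).min() >= -1e-9)
```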
arXiv Detail & Related papers (2022-06-17T11:39:52Z)
- Side-effects of Learning from Low Dimensional Data Embedded in an Euclidean Space [3.093890460224435]
We study the potential regularization effects associated with the network's depth and noise in the codimension of the data manifold.
We also present additional side effects in training due to the presence of noise.
arXiv Detail & Related papers (2022-03-01T16:55:51Z)
- TSGCNet: Discriminative Geometric Feature Learning with Two-Stream Graph Convolutional Network for 3D Dental Model Segmentation [141.2690520327948]
We propose a two-stream graph convolutional network (TSGCNet) to learn multi-view information from different geometric attributes.
We evaluate our proposed TSGCNet on a real-patient dataset of dental models acquired by 3D intraoral scanners.
arXiv Detail & Related papers (2020-12-26T08:02:56Z)
- Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We apply a primal-dual framework drawn from the graph-neural-network literature to triangle meshes.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights into our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z)
- Refinement of Predicted Missing Parts Enhance Point Cloud Completion [62.997667081978825]
Point cloud completion is the task of predicting complete geometry from partial observations using a point set representation for a 3D shape.
Previous approaches propose neural networks to directly estimate the whole point cloud through encoder-decoder models fed by the incomplete point set.
This paper proposes an end-to-end neural network architecture that focuses on computing the missing geometry and merging the known input and the predicted point cloud.
arXiv Detail & Related papers (2020-10-08T22:01:23Z)
- Neural Subdivision [58.97214948753937]
This paper introduces Neural Subdivision, a novel framework for data-driven coarse-to-fine geometry modeling.
We optimize for the same set of network weights across all local mesh patches, thus providing an architecture that is not constrained to a specific input mesh, fixed genus, or category.
We demonstrate that even when trained on a single high-resolution mesh our method generates reasonable subdivisions for novel shapes.
arXiv Detail & Related papers (2020-05-04T20:03:21Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
The goal of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate the approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to the state of the art solvers.
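As a rough illustration of the integral-operator composition described in this entry, the sketch below applies one kernel integral layer to a function sampled on a 1-D grid; the fixed random kernel K is an assumed stand-in for the learned, graph-based kernel of the paper.

```python
# Hedged sketch of one kernel integral layer:
#   v(x_i) = sigma( W u(x_i) + (1/N) * sum_j K(x_i, x_j) u(x_j) )
# K here is a fixed random function of the coordinates; in the paper the
# kernel is parameterized by a neural network and learned from data.
import numpy as np

rng = np.random.default_rng(3)
N, d = 64, 4                              # grid points, channel width
xs = np.linspace(0, 1, N)[:, None]        # 1-D spatial grid
u = rng.normal(size=(N, d))               # input function sampled on grid
W = rng.normal(size=(d, d)) / np.sqrt(d)  # pointwise linear term
P = rng.normal(size=(2, d * d)) / 10      # placeholder kernel parameters

def K(xi, xj):
    """Toy stand-in kernel: maps a coordinate pair to a d x d matrix."""
    feats = np.array([xi - xj, xi * xj])
    return (feats @ P).reshape(d, d)

def kernel_layer(u):
    """One layer: pointwise linear term plus quadrature of the integral."""
    v = np.empty_like(u)
    for i in range(N):
        integral = sum(K(xs[i, 0], xs[j, 0]) @ u[j] for j in range(N)) / N
        v[i] = np.tanh(W @ u[i] + integral)
    return v

print(kernel_layer(u).shape)   # (N, d): a new function on the same grid
```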
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.