Tangent Bundle Filters and Neural Networks: from Manifolds to Cellular
Sheaves and Back
- URL: http://arxiv.org/abs/2210.15058v1
- Date: Wed, 26 Oct 2022 21:55:45 GMT
- Title: Tangent Bundle Filters and Neural Networks: from Manifolds to Cellular
Sheaves and Back
- Authors: Claudio Battiloro, Zhiyang Wang, Hans Riess, Paolo Di Lorenzo,
Alejandro Ribeiro
- Abstract summary: We use the convolution to define tangent bundle filters and tangent bundle neural networks (TNNs).
We discretize TNNs both in space and time domains, showing that their discrete counterpart is a principled variant of the recently introduced Sheaf Neural Networks.
We numerically evaluate the effectiveness of the proposed architecture on a denoising task of a tangent vector field over the unit 2-sphere.
- Score: 114.01902073621577
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work we introduce a convolution operation over the tangent bundle of
Riemannian manifolds exploiting the Connection Laplacian operator. We use the
convolution to define tangent bundle filters and tangent bundle neural networks
(TNNs), novel continuous architectures operating on tangent bundle signals,
i.e. vector fields over manifolds. We discretize TNNs both in space and time
domains, showing that their discrete counterpart is a principled variant of the
recently introduced Sheaf Neural Networks. We formally prove that this discrete
architecture converges to the underlying continuous TNN. We numerically
evaluate the effectiveness of the proposed architecture on a denoising task of
a tangent vector field over the unit 2-sphere.
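The discretization described above can be sketched concretely: on a graph whose edges carry orthogonal maps (a discrete stand-in for the parallel transport encoded by the Connection Laplacian), a tangent bundle filter becomes a polynomial in the resulting sheaf Laplacian. The cycle graph, rotation angle, and filter taps below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def sheaf_laplacian(n, d, maps):
    """Assemble the block sheaf Laplacian. maps[(u, v)] = (F_u, F_v) are the
    restriction maps from the node stalks into the shared edge stalk."""
    L = np.zeros((n * d, n * d))
    for (u, v), (Fu, Fv) in maps.items():
        L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu
        L[v*d:(v+1)*d, v*d:(v+1)*d] += Fv.T @ Fv
        L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv
        L[v*d:(v+1)*d, u*d:(u+1)*d] -= Fv.T @ Fu
    return L

# Small cycle graph; each edge transports vectors by a small rotation,
# mimicking parallel transport between neighbouring tangent planes.
n, d = 6, 2
edges = [(i, (i + 1) % n) for i in range(n)]
maps = {e: (np.eye(d), rotation(0.3)) for e in edges}
L = sheaf_laplacian(n, d, maps)

# A discretized tangent bundle filter: a polynomial in L, h(L) = sum_k h_k L^k.
coeffs = [1.0, -0.4, 0.05]                        # illustrative taps (assumed)
x = np.random.default_rng(0).normal(size=n * d)   # discretized vector field
y = sum(h * np.linalg.matrix_power(L, k) @ x for k, h in enumerate(coeffs))
```

The Laplacian assembled this way is symmetric positive semidefinite, so the filter is well defined in both the spatial (polynomial) and spectral senses.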
Related papers
- Bundle Neural Networks for message diffusion on graphs [10.018379001231356]
We prove that Bundle Neural Networks (BuNNs) can approximate any feature transformation over the nodes of any family of graphs given injective positional encodings, resulting in universal node-level expressivity.
arXiv Detail & Related papers (2024-05-24T13:28:48Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
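A minimal illustration of the graph-from-manifold-samples setup: points sampled from the unit circle, a Gaussian-kernel graph built on them, and a polynomial graph filter applied to a signal sampled on the nodes. The kernel bandwidth and filter taps are assumed for illustration; as more points are sampled, filters of this form approach their manifold counterparts in the sense the error bounds describe.

```python
import numpy as np

rng = np.random.default_rng(1)

def geometric_graph_laplacian(points, eps):
    """Dense graph Laplacian from a Gaussian kernel on sampled points."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / eps)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

# Sample n points from the unit circle (a simple 1-manifold).
n = 64
t = np.sort(rng.uniform(0, 2 * np.pi, n))
pts = np.stack([np.cos(t), np.sin(t)], axis=1)
L = geometric_graph_laplacian(pts, eps=0.1)

# A convolutional graph filter: a polynomial in L acting on a node signal.
x = np.sin(3 * t)                      # smooth signal sampled on the manifold
coeffs = [1.0, -0.1]                   # illustrative taps (assumed)
y = sum(h * np.linalg.matrix_power(L, k) @ x for k, h in enumerate(coeffs))
```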
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Resolution-Invariant Image Classification based on Fourier Neural Operators [1.3190581566723918]
We investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
We derive the FNO architecture as an example of continuous and Fréchet-differentiable neural operators on Lebesgue spaces.
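A minimal sketch of the resolution invariance in question: a single 1-D Fourier layer keeps a fixed number of low-frequency modes and scales them with frequency-indexed weights, so the same weights act on inputs of any length. The weights and mode count below are assumed for illustration:

```python
import numpy as np

def fourier_layer(x, weights):
    """One 1-D Fourier layer: transform, scale the lowest modes by
    frequency-indexed weights, discard the rest, transform back. Because
    the weights are indexed by frequency, the layer is resolution-agnostic."""
    n = x.shape[-1]
    xf = np.fft.rfft(x)
    k = weights.shape[-1]              # number of retained Fourier modes
    out = np.zeros_like(xf)
    out[:k] = weights * xf[:k]
    return np.fft.irfft(out, n=n)

rng = np.random.default_rng(0)
w = rng.normal(size=8) + 1j * rng.normal(size=8)   # illustrative complex weights

# The same layer applied to the same signal sampled at two resolutions.
coarse = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
fine = np.sin(np.linspace(0, 2 * np.pi, 256, endpoint=False))
y64 = fourier_layer(coarse, w)
y256 = fourier_layer(fine, w)
```

Since `rfft`/`irfft` normalize by the input length, the two outputs agree on the shared sample points, which is the sense in which the layer is resolution-invariant.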
arXiv Detail & Related papers (2023-04-02T10:23:36Z)
- Tangent Bundle Convolutional Learning: from Manifolds to Cellular Sheaves and Back [84.61160272624262]
We define tangent bundle filters and tangent bundle neural networks (TNNs) based on this convolution operation.
Tangent bundle filters admit a spectral representation that generalizes the ones of scalar manifold filters, graph filters and standard convolutional filters in continuous time.
We numerically evaluate the effectiveness of the proposed architecture on various learning tasks.
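The spectral representation mentioned above can be sketched for any symmetric Laplacian-type operator: a filter with frequency response h acts on a signal by rescaling its projection onto each eigenvector by h(λ), which coincides with applying the same polynomial to the operator itself. The path-graph operator and polynomial response below are illustrative assumptions:

```python
import numpy as np

# Path-graph Laplacian as a stand-in for the symmetric operators the spectral
# representation covers (connection, sheaf, or graph Laplacians).
n = 8
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

# An illustrative polynomial frequency response h(lambda) (assumed taps).
coeffs = [1.0, -0.3, 0.02]
h = lambda lam: sum(c * lam ** k for k, c in enumerate(coeffs))

x = np.random.default_rng(2).normal(size=n)

# Spatial form: the polynomial applied to the operator L itself.
y_spatial = sum(c * np.linalg.matrix_power(L, k) @ x for k, c in enumerate(coeffs))

# Spectral form: rescale each eigen-component of x by h(lambda).
lam, U = np.linalg.eigh(L)
y_spectral = U @ (h(lam) * (U.T @ x))
```

The two forms agree exactly, which is what makes the spectral viewpoint a faithful generalization of scalar manifold, graph, and standard convolutional filters.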
arXiv Detail & Related papers (2023-03-20T17:57:15Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS [63.011641517977644]
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights.
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In sum, we treat the manifold model as the limit of large graphs in order to construct MNNs, and recover graph neural networks by discretizing the MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
There is currently a promising trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Sheaf Neural Networks with Connection Laplacians [3.3414557160889076]
A Sheaf Neural Network (SNN) is a type of Graph Neural Network (GNN) that operates on a sheaf, an object that equips a graph with vector spaces over its nodes and edges and linear maps between these spaces.
Previous works proposed two diametrically opposed approaches: manually constructing the sheaf based on domain knowledge and learning the sheaf end-to-end using gradient-based methods.
In this work, we propose a novel way of computing sheaves drawing inspiration from Riemannian geometry.
We show that this approach achieves promising results with less computational overhead when compared to previous SNN models.
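One Riemannian-geometry-inspired route to such sheaves (an illustrative sketch, not necessarily the paper's exact procedure) is to align locally estimated tangent bases at neighbouring nodes with the orthogonal Procrustes solution, which yields orthogonal restriction maps for the sheaf Laplacian without any gradient-based learning:

```python
import numpy as np

rng = np.random.default_rng(3)

def procrustes_rotation(Bu, Bv):
    """Orthogonal k-by-k map Omega minimizing ||Bu - Bv @ Omega||_F, via the
    SVD solution of the orthogonal Procrustes problem."""
    U, _, Vt = np.linalg.svd(Bv.T @ Bu)
    return U @ Vt

def random_basis(d, k):
    # Orthonormal basis of a random k-dim subspace of R^d, standing in for a
    # PCA-estimated tangent space at a node's neighbourhood.
    Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
    return Q

Bu, Bv = random_basis(5, 2), random_basis(5, 2)
O = procrustes_rotation(Bu, Bv)   # edge restriction map; O is orthogonal

# O can then populate the off-diagonal blocks of a sheaf Laplacian, giving an
# SNN whose sheaf is computed rather than learned end-to-end.
```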
arXiv Detail & Related papers (2022-06-17T11:39:52Z)
- Two-layer neural networks with values in a Banach space [1.90365714903665]
We study two-layer neural networks whose domain and range are Banach spaces with separable preduals.
As the nonlinearity we choose the lattice operation of taking the positive part; in the case of $\mathbb{R}^d$-valued neural networks this corresponds to the ReLU activation function.
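The positive-part nonlinearity can be sketched in the finite-dimensional case: on R^d it is componentwise ReLU, and a two-layer network built on it looks as follows (the dimensions and weights are illustrative assumptions):

```python
import numpy as np

def positive_part(x):
    """The lattice operation x -> x v 0; on R^d this is componentwise ReLU."""
    return np.maximum(x, 0.0)

def two_layer(x, W1, b1, W2, b2):
    """Two-layer network R^m -> R^d with the positive-part nonlinearity."""
    return W2 @ positive_part(W1 @ x + b1) + b2

rng = np.random.default_rng(4)
m, hdim, d = 3, 5, 2
W1, b1 = rng.normal(size=(hdim, m)), rng.normal(size=hdim)
W2, b2 = rng.normal(size=(d, hdim)), rng.normal(size=d)
y = two_layer(rng.normal(size=m), W1, b1, W2, b2)
```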
arXiv Detail & Related papers (2021-05-05T14:54:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.