Manifold Filter-Combine Networks
- URL: http://arxiv.org/abs/2307.04056v3
- Date: Tue, 5 Sep 2023 18:14:55 GMT
- Title: Manifold Filter-Combine Networks
- Authors: Joyce Chew, Edward De Brouwer, Smita Krishnaswamy, Deanna Needell, and Michael Perlmutter
- Abstract summary: We introduce a class of manifold neural networks (MNNs) that we call Manifold Filter-Combine Networks (MFCNs).
This class includes a wide variety of subclasses that can be thought of as the manifold analog of various popular graph neural networks (GNNs).
- Score: 22.19399386945317
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a class of manifold neural networks (MNNs) that we call Manifold
Filter-Combine Networks (MFCNs), which aims to further our understanding of
MNNs, analogous to how the aggregate-combine framework helps with the
understanding of graph neural networks (GNNs). This class includes a wide
variety of subclasses that can be thought of as the manifold analog of various
popular GNNs. We then consider a method, based on building a data-driven graph,
for implementing such networks when one does not have global knowledge of the
manifold, but merely has access to finitely many sample points. We provide
sufficient conditions for the network to provably converge to its continuum
limit as the number of sample points tends to infinity. Unlike previous work
(which focused on specific graph constructions), our rate of convergence does
not directly depend on the number of filters used. Moreover, it exhibits linear
dependence on the depth of the network rather than the exponential dependence
obtained previously. Additionally, we provide several examples of interesting
subclasses of MFCNs and of the rates of convergence that are obtained under
specific graph constructions.
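As a concrete illustration of the filter-combine recipe described above, the sketch below implements one layer on a graph built from sample points: each input channel is passed through a bank of spectral filters defined on Laplacian eigenpairs, and the filtered channels are then mixed by a combine matrix followed by a pointwise nonlinearity. This is a minimal sketch under assumed design choices (eigendecomposition-based filtering, ReLU, and the names mfcn_layer and W_combine are illustrative), not the paper's reference implementation.

```python
import numpy as np

def mfcn_layer(X, evals, evecs, spectral_fns, W_combine):
    """One illustrative filter-combine layer (hypothetical helper).

    X            : (n, d) signal with d channels on n sample points
    evals, evecs : eigenpairs of a graph Laplacian approximating the manifold
    spectral_fns : J functions w_j(lambda), the spectral filter bank
    W_combine    : (d * J, d_out) matrix mixing the filtered channels
    """
    coeffs = evecs.T @ X  # spectral coefficients of each channel
    # Filter step: h_j(L) X = V diag(w_j(evals)) V^T X for each filter j
    filtered = [evecs @ (w(evals)[:, None] * coeffs) for w in spectral_fns]
    H = np.concatenate(filtered, axis=1)  # (n, d * J) filtered channels
    # Combine step: mix channels, then apply a pointwise nonlinearity
    return np.maximum(H @ W_combine, 0.0)  # ReLU
```

Stacking such layers, and varying how the filter and combine steps are parameterized, yields the different subclasses of MFCNs mentioned in the abstract.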
Related papers
- Convergence of Manifold Filter-Combine Networks [18.590886216749528]
We introduce Manifold Filter-Combine Networks (MFCNs) to better understand manifold neural networks (MNNs).
We propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating the manifold by a sparse graph.
We prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity.
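A generic way to realize such a sparse-graph approximation is an epsilon-neighborhood graph on the sample points. The helper below is a sketch of this idea; the radius eps and the unnormalized Laplacian L = D - A are assumed choices, not necessarily the paper's exact construction.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse import csr_matrix

def epsilon_graph_laplacian(points, eps):
    """Epsilon-neighborhood graph on a point cloud and its unnormalized
    Laplacian L = D - A. Illustrative: kernel-weighted edges or a
    normalized Laplacian are common variants in the literature."""
    dists = cdist(points, points)    # (n, n) pairwise Euclidean distances
    A = (dists < eps).astype(float)  # connect points within radius eps
    np.fill_diagonal(A, 0.0)         # drop self-loops
    L = np.diag(A.sum(axis=1)) - A
    return csr_matrix(L)             # sparse when eps is small
```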
arXiv Detail & Related papers (2024-10-18T17:40:58Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- A Convergence Rate for Manifold Neural Networks [6.428026202398116]
We introduce a method for constructing manifold neural networks using the spectral decomposition of the Laplace Beltrami operator.
We build upon this result by establishing a rate of convergence that depends on the intrinsic dimension of the manifold.
We also discuss how the rate of convergence depends on the depth of the network and the number of filters used in each layer.
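In the discrete setting, this spectral construction amounts to filtering through a truncated eigendecomposition. The sketch below stands in for the Laplace-Beltrami decomposition using a graph Laplacian L and a truncation to K eigenpairs; both the filter w and the truncation level K are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def spectral_filter(L, x, w, K=64):
    """Apply a spectral filter w(lambda) through the K smallest
    eigenpairs of L: V_K w(Lambda_K) V_K^T x. A discrete stand-in
    for filtering via the Laplace-Beltrami spectral decomposition."""
    evals, evecs = eigsh(L, k=K, which="SM")   # K smallest eigenpairs
    return evecs @ (w(evals) * (evecs.T @ x))  # filter a single channel x
```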
arXiv Detail & Related papers (2022-12-23T22:44:25Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify and justify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
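The defining feature of an implicit layer is a fixed-point equation in place of a finite stack of propagation steps. The toy forward pass below illustrates that idea only; MGNNI's multiscale propagation operators and implicit-differentiation training are not reproduced here, and convergence assumes the update is a contraction.

```python
import numpy as np

def implicit_gnn_forward(A_hat, X, W, n_iter=100, tol=1e-6):
    """Iterate Z <- tanh(A_hat @ Z @ W + X) toward a fixed point.
    A_hat: (n, n) normalized adjacency; W: (d, d) weights. A sketch
    of a generic implicit GNN layer, not MGNNI itself; small spectral
    norms of A_hat and W are assumed so the iteration converges."""
    Z = np.zeros_like(X)
    for _ in range(n_iter):
        Z_next = np.tanh(A_hat @ Z @ W + X)
        if np.linalg.norm(Z_next - Z) < tol:
            break
        Z = Z_next
    return Z
```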
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In summary, we treat the manifold model as the limit of large graphs, construct MNNs on it, and recover graph neural networks by discretizing the MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by filtering out "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Understanding the Basis of Graph Convolutional Neural Networks via an Intuitive Matched Filtering Approach [7.826806223782053]
Graph Convolutional Neural Networks (GCNNs) are becoming a preferred model for data processing on irregular domains.
We show that their convolution layers effectively perform matched filtering of input data with the chosen patterns.
A numerical example guides the reader through the various steps of GCNN operation and learning both visually and numerically.
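The matched-filtering reading is easiest to see for a polynomial graph filter: each node's output is the correlation of its successively shifted inputs with the filter taps, i.e., the response to a template. A minimal sketch follows; the polynomial filter form is an assumption about the layer type, not a claim about the paper's exact model.

```python
import numpy as np

def graph_matched_filter(A, x, taps):
    """Polynomial graph filter sum_k taps[k] * A^k x, read as matched
    filtering: each node correlates the shifted inputs [x, Ax, A^2 x, ...]
    against the template given by the filter taps."""
    shifted = [x]
    for _ in range(len(taps) - 1):
        shifted.append(A @ shifted[-1])  # successive graph shifts
    S = np.stack(shifted, axis=1)        # (n, K) shifted copies per node
    return S @ np.asarray(taps)          # inner product with the template
```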
arXiv Detail & Related papers (2021-08-23T12:41:06Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.