Convergence of Manifold Filter-Combine Networks
- URL: http://arxiv.org/abs/2410.14639v1
- Date: Fri, 18 Oct 2024 17:40:58 GMT
- Title: Convergence of Manifold Filter-Combine Networks
- Authors: David R. Johnson, Joyce Chew, Siddharth Viswanath, Edward De Brouwer, Deanna Needell, Smita Krishnaswamy, Michael Perlmutter
- Abstract summary: We introduce Manifold Filter-Combine Networks (MFCNs) to better understand manifold neural networks (MNNs).
We propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating the manifold by a sparse graph.
We prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity.
- Score: 18.590886216749528
- Abstract: In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). The filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests many interesting families of MNNs which can be interpreted as the manifold analog of various popular GNNs. We then propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating the manifold by a sparse graph. We prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity.
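To make the pipeline concrete, here is a minimal sketch of one MFCN-style layer on a point cloud, assuming a kNN graph as the sparse manifold approximation and a heat-kernel spectral filter; the function names, the choice of filter, and the ReLU combine step are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def knn_graph(X, k=8):
    """Sparse kNN adjacency approximating the manifold underlying points X (n x d)."""
    n = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # nearest neighbors, skipping self
        A[i, nbrs] = 1.0
    return np.maximum(A, A.T)             # symmetrize

def filter_combine_layer(A, F, W, t=1.0):
    """One MFCN-style layer: filter each feature channel with a spectral
    graph filter, then combine channels with a matrix W (would be learned)."""
    L = np.diag(A.sum(1)) - A                   # unnormalized graph Laplacian
    lam, V = np.linalg.eigh(L)                  # spectral decomposition
    H = V @ np.diag(np.exp(-t * lam)) @ V.T     # low-pass heat filter h(L) = exp(-tL)
    return np.maximum(H @ F @ W, 0.0)           # filter, combine, pointwise ReLU

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # point cloud sampled near a manifold
F = rng.normal(size=(200, 4))   # input signals (4 channels)
W = rng.normal(size=(4, 4))     # combine matrix
print(filter_combine_layer(knn_graph(X), F, W).shape)  # (200, 4)
```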
Related papers
- Manifold Filter-Combine Networks [22.19399386945317]
We introduce a class of manifold neural networks (MNNs) that we call Manifold Filter-Combine Networks (MFCNs).
This class includes a wide variety of subclasses that can be thought of as the manifold analog of various popular graph neural networks (GNNs).
arXiv Detail & Related papers (2023-07-08T23:19:53Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
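As a hedged illustration of this setup, the sketch below samples points from the unit circle (a simple one-dimensional manifold), builds an epsilon-neighborhood graph, and applies a heat-type spectral filter on the resulting Laplacian; as n grows, the filtered output stabilizes, which is the discrete behavior the convergence bounds describe. All parameter choices here are assumptions.

```python
import numpy as np

def epsilon_graph_filter(n, eps=0.3, t=0.5, seed=0):
    """Sample n points from the unit circle, build an epsilon-graph,
    and apply the heat filter exp(-t L) to a smooth test signal."""
    rng = np.random.default_rng(seed)
    theta = np.sort(rng.uniform(0, 2 * np.pi, n))
    P = np.column_stack([np.cos(theta), np.sin(theta)])  # points on the manifold
    D = np.linalg.norm(P[:, None] - P[None, :], axis=-1)
    A = (D < eps).astype(float) - np.eye(n)              # epsilon-neighborhood graph
    L = np.diag(A.sum(1)) - A
    L /= L.max()                                         # crude normalization
    lam, V = np.linalg.eigh(L)
    x = np.cos(theta)                                    # smooth signal on the manifold
    return V @ (np.exp(-t * lam) * (V.T @ x))            # filtered signal

# As n grows, summary statistics of the filtered output stabilize.
for n in (100, 200, 400):
    print(n, float(epsilon_graph_filter(n).mean()))
```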
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Higher-order Sparse Convolutions in Graph Neural Networks [17.647346486710514]
We introduce a new higher-order sparse convolution based on the Sobolev norm of graph signals.
The resulting architecture, Sparse Sobolev GNN (S-SobGNN), shows competitive performance across all evaluated applications compared to several state-of-the-art methods.
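A rough sketch of the idea, assuming filters built from element-wise (Hadamard) powers of L + εI, which preserve the sparsity pattern of the Laplacian; the operator, the per-order weights, and the ReLU are illustrative assumptions rather than the paper's exact architecture.

```python
import numpy as np

def sobolev_sparse_conv(A, X, W_list, eps=1.0):
    """Hypothetical higher-order sparse convolution: filter with Hadamard
    powers of (L + eps*I), which keep the sparsity pattern of L, then sum
    the per-order outputs through separate weight matrices."""
    L = np.diag(A.sum(1)) - A
    S = L + eps * np.eye(A.shape[0])   # Sobolev-norm-inspired base operator
    out, P = 0.0, np.ones_like(S)
    for W in W_list:                   # one weight matrix per order k
        P = P * S                      # element-wise power S**k, still sparse
        out = out + P @ X @ W
    return np.maximum(out, 0.0)

rng = np.random.default_rng(1)
A = (rng.random((50, 50)) < 0.1).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 0)
X = rng.normal(size=(50, 8))
print(sobolev_sparse_conv(A, X, [rng.normal(size=(8, 8)) for _ in range(3)]).shape)
```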
arXiv Detail & Related papers (2023-02-21T08:08:18Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In summary, we treat the manifold model as the limit of large graphs and construct MNNs on it, while graph neural networks can be recovered by discretizing the MNNs.
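A minimal sketch of such a discretized MNN layer, assuming a bank of polynomial filters in the graph Laplacian followed by a pointwise tanh; the filter parameterization and the nonlinearity are illustrative assumptions.

```python
import numpy as np

def mnn_layer_discretized(L, X, H):
    """One MNN-style layer discretized on a graph: a bank of polynomial
    convolutional filters in the Laplacian L followed by a pointwise
    nonlinearity. H[k] holds the order-k filter coefficients (f_in x f_out)."""
    out, Z = 0.0, X
    for k in range(len(H)):
        out = out + Z @ H[k]   # contribution of the L^k term
        Z = L @ Z              # next power of L applied to the signal
    return np.tanh(out)        # pointwise nonlinearity

rng = np.random.default_rng(2)
A = (rng.random((30, 30)) < 0.2).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 0)
L = np.diag(A.sum(1)) - A
X = rng.normal(size=(30, 5))
H = rng.normal(size=(4, 5, 5)) * 0.1    # 4 filter taps, 5 -> 5 channels
print(mnn_layer_discretized(L, X, H).shape)  # (30, 5)
```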
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by simply filtering out "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
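A heavily hedged sketch of the trimming idea: enumerate connectivity patterns as sets of skip connections and discard those scoring poorly under a proxy. The proxy below is a placeholder invented for illustration, not the paper's actual metric.

```python
from itertools import combinations

def candidate_patterns(num_layers):
    """Enumerate connectivity patterns as sets of skip connections (i -> j, i < j);
    the sequential layer-to-layer path is implicit."""
    edges = [(i, j) for i, j in combinations(range(num_layers), 2)]
    for mask in range(2 ** len(edges)):
        yield [e for b, e in enumerate(edges) if (mask >> b) & 1]

def proxy_score(pattern, num_layers):
    """Placeholder convergence proxy (NOT the paper's metric): how much the
    pattern shortens the input-to-output path."""
    longest_skip = max((j - i for i, j in pattern), default=1)
    return longest_skip / (num_layers - 1)

patterns = list(candidate_patterns(4))
kept = [p for p in patterns if proxy_score(p, 4) >= 0.5]  # trim "unpromising" ones
print(len(patterns), "->", len(kept))
```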
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- MGDCF: Distance Learning via Markov Graph Diffusion for Neural Collaborative Filtering [96.65234340724237]
We show the equivalence between some state-of-the-art GNN-based collaborative filtering (CF) models and a traditional 1-layer network representation learning (NRL) model based on context encoding.
We present Markov Graph Diffusion Collaborative Filtering (MGDCF) to generalize some state-of-the-art GNN-based CF models.
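As a hedged illustration of Markov-diffusion-style propagation (not MGDCF's exact update rule), the sketch below repeatedly mixes each node's embedding with its neighbors' average, converging to a personalized-PageRank-like smoothing of the input features.

```python
import numpy as np

def markov_diffusion(A, X, alpha=0.1, K=10):
    """Illustrative Markov graph diffusion: each step blends the original
    features X with the neighborhood average of the current embeddings."""
    P = A / A.sum(1, keepdims=True)   # row-stochastic transition matrix
    Z = X
    for _ in range(K):
        Z = alpha * X + (1 - alpha) * P @ Z
    return Z

rng = np.random.default_rng(3)
A = (rng.random((20, 20)) < 0.3).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 1)  # self-loops avoid zero degrees
X = rng.normal(size=(20, 6))                    # user/item context encodings
print(markov_diffusion(A, X).shape)             # (20, 6)
```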
arXiv Detail & Related papers (2022-04-05T17:24:32Z)
- Understanding the Basis of Graph Convolutional Neural Networks via an Intuitive Matched Filtering Approach [7.826806223782053]
Graph Convolutional Neural Networks (GCNNs) are becoming a preferred model for data processing on irregular domains.
We show that their convolution layers effectively perform matched filtering of input data with the chosen patterns.
A numerical example guides the reader through the various steps of GCNN operation and learning both visually and numerically.
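In that spirit, here is a tiny numerical sketch of the matched-filtering reading of one graph convolution layer: aggregate each node's neighborhood, then take inner products with learned templates. The aggregation scheme and the templates are illustrative assumptions.

```python
import numpy as np

# Matched-filtering reading of one graph convolution layer (illustrative):
# aggregate each node's neighborhood, then correlate with learned templates.
rng = np.random.default_rng(4)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                    # add self-loops
A_hat /= A_hat.sum(1, keepdims=True)     # mean-aggregate neighborhoods
X = rng.normal(size=(4, 3))              # node features
W = rng.normal(size=(3, 2))              # each column is one template/pattern
Z = A_hat @ X                            # neighborhood summaries
Y = np.maximum(Z @ W, 0.0)               # inner products with templates = matched filtering
print(Y)                                 # large entries where a summary matches a template
```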
arXiv Detail & Related papers (2021-08-23T12:41:06Z)
- Ranking Structured Objects with Graph Neural Networks [0.0]
RankGNNs are trained with a set of pair-wise preferences between graphs, each indicating that one graph is preferred over the other.
One practical application of this problem is drug screening, where an expert wants to find the most promising molecules in a large collection of drug candidates.
We empirically demonstrate that our proposed pair-wise RankGNN approach either significantly outperforms or at least matches the ranking performance of the naive point-wise baseline approach.
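A minimal sketch of pairwise preference training, assuming a RankNet-style loss on score differences; in a RankGNN the scores would come from a GNN readout, which is elided here.

```python
import numpy as np

def pairwise_rank_loss(s_pref, s_other):
    """RankNet-style loss on score differences: the probability that the
    preferred graph outranks the other should approach 1. The scores s_*
    would come from a GNN readout; here they are plain numbers."""
    p = 1.0 / (1.0 + np.exp(-(s_pref - s_other)))  # sigmoid of the score gap
    return -np.log(p + 1e-12)

# Toy preference: graph A is preferred over graph B, so its score should be higher.
print(pairwise_rank_loss(2.0, 0.5))  # small loss: ranking already respected
print(pairwise_rank_loss(0.5, 2.0))  # large loss: ranking violated
```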
arXiv Detail & Related papers (2021-04-18T14:40:59Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
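A small sketch of that generalization: a graph convolutional filter y = Σ_k h_k S^k x, which reduces to ordinary circular convolution when the shift operator S is the cyclic shift underlying a CNN. The tap values are arbitrary.

```python
import numpy as np

def graph_filter(S, x, h):
    """A graph convolutional filter y = sum_k h[k] S^k x. With S the cyclic
    shift matrix this is exactly a (circular) convolution, which is the sense
    in which GNN layers generalize CNN layers."""
    y = np.zeros_like(x)
    z = x.copy()
    for hk in h:
        y = y + hk * z
        z = S @ z
    return y

n = 8
S = np.roll(np.eye(n), 1, axis=0)  # cyclic shift: the "graph" behind a CNN
x = np.arange(n, dtype=float)
print(graph_filter(S, x, [0.5, 0.3, 0.2]))  # a length-3 circular convolution of x
```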
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network (BGN) is orders of magnitude more efficient in both time and space.
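A hedged sketch of the general binarization idea, assuming an XNOR-Net-style sign-plus-scale scheme applied to a graph convolution layer; BGN's exact scheme may differ.

```python
import numpy as np

def binarize(W):
    """Sign-binarize weights, keeping a per-matrix scale so magnitudes
    survive (an XNOR-Net-style trick, assumed here for illustration)."""
    alpha = np.abs(W).mean()
    return alpha * np.sign(W)

def binarized_gcn_layer(A_hat, X, W):
    """Graph conv layer with binary (+/- alpha) weights: the matrix product
    can then use cheap sign operations, the source of time/space savings."""
    return np.maximum(A_hat @ X @ binarize(W), 0.0)

rng = np.random.default_rng(5)
A = (rng.random((10, 10)) < 0.3).astype(float)
A_hat = np.maximum(A, A.T) + np.eye(10)
A_hat /= A_hat.sum(1, keepdims=True)
X = rng.normal(size=(10, 4))
W = rng.normal(size=(4, 4))
print(binarized_gcn_layer(A_hat, X, W).shape)  # (10, 4)
```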
arXiv Detail & Related papers (2020-04-19T09:43:14Z)