Convolutional Learning on Simplicial Complexes
- URL: http://arxiv.org/abs/2301.11163v1
- Date: Thu, 26 Jan 2023 15:08:11 GMT
- Title: Convolutional Learning on Simplicial Complexes
- Authors: Maosheng Yang and Elvin Isufi
- Abstract summary: We propose a simplicial complex convolutional neural network (SCCNN) to learn data representations on simplicial complexes.
It performs convolutions based on the multi-hop simplicial adjacencies via common faces and cofaces independently.
- Score: 13.604803091781926
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a simplicial complex convolutional neural network (SCCNN) to learn
data representations on simplicial complexes. It performs convolutions based on
the multi-hop simplicial adjacencies via common faces and cofaces independently
and captures the inter-simplicial couplings, generalizing state-of-the-art architectures.
By studying symmetries of the simplicial domain and the data space, we show
that the SCCNN is permutation and orientation equivariant, thus incorporating
these symmetries as inductive biases. Based on Hodge theory, we perform a spectral analysis to
understand how SCCNNs regulate data in different frequencies, showing that the
convolutions via faces and cofaces operate in two orthogonal data spaces.
Lastly, we study the stability of SCCNNs to domain deformations and examine the
effects of various factors. Empirical results show the benefits of higher-order
convolutions and inter-simplicial couplings in simplex prediction and
trajectory prediction.
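The face/coface convolutions and inter-simplicial couplings described above can be illustrated on a toy filled triangle. This is a minimal NumPy sketch under assumed boundary matrices, filter orders, and weights, not the authors' implementation:

```python
import numpy as np

# Toy complex: 3 nodes, 3 oriented edges (1->2, 1->3, 2->3), 1 filled triangle.
B1 = np.array([[-1, -1,  0],   # node-to-edge incidence (boundaries of edges)
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)
B2 = np.array([[ 1],           # edge-to-triangle incidence (boundary of triangle)
               [-1],
               [ 1]], dtype=float)

L_low = B1.T @ B1   # lower Laplacian: edge adjacency via common nodes (faces)
L_up  = B2 @ B2.T   # upper Laplacian: edge adjacency via common triangles (cofaces)

def sccnn_edge_layer(x_node, x_edge, x_tri, w_low, w_up, T=2):
    """One SCCNN-style layer on edge signals: multi-hop convolutions via
    faces and cofaces run independently, plus projections of node and
    triangle data (the inter-simplicial coupling). The odd nonlinearity
    tanh keeps the layer orientation equivariant."""
    y = B1.T @ x_node + B2 @ x_tri          # couplings from faces and cofaces
    h_low, h_up = x_edge.copy(), x_edge.copy()
    for t in range(T + 1):                  # hops 0..T on each Laplacian
        y += w_low[t] * h_low + w_up[t] * h_up
        h_low, h_up = L_low @ h_low, L_up @ h_up
    return np.tanh(y)
```

The boundary identity `B1 @ B2 = 0` is what places the face-based and coface-based convolutions in orthogonal data subspaces, matching the Hodge-theoretic spectral picture in the abstract.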
Related papers
- COSMOS: Continuous Simplicial Neural Networks [8.970917790622263]
We introduce COntinuous SiMplicial neural netwOrkS (COSMOS), a novel SNN architecture derived from partial differential equations (PDEs) on simplicial complexes.
We show that COSMOS achieves competitive performance compared to state-of-the-art SNNs in complex and noisy environments.
arXiv Detail & Related papers (2025-03-17T08:31:25Z)
- Higher-Order Topological Directionality and Directed Simplicial Neural Networks [12.617840099457066]
We introduce a novel notion of higher-order directionality and design Directed Simplicial Neural Networks (Dir-SNNs) based on it.
Dir-SNNs are message-passing networks operating on directed simplicial complexes.
Experiments on a synthetic source localization task demonstrate that Dir-SNNs outperform undirected SNNs when the underlying complex is directed.
arXiv Detail & Related papers (2024-09-12T20:37:14Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish the theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- SC-MAD: Mixtures of Higher-order Networks for Data Augmentation [36.33265644447091]
Simplicial complexes have inspired generalizations of graph neural networks (GNNs) to simplicial complex-based models.
We propose data augmentation of simplicial complexes through both linear and nonlinear mixup mechanisms.
We theoretically demonstrate that the resultant synthetic simplicial complexes interpolate among existing data with respect to homomorphism densities.
arXiv Detail & Related papers (2023-09-14T06:25:39Z)
- Generalized Simplicial Attention Neural Networks [22.171364354867723]
We introduce Generalized Simplicial Attention Neural Networks (GSANs).
GSANs process data living on simplicial complexes using masked self-attentional layers.
These schemes learn how to combine data associated with neighbor simplices of consecutive order in a task-oriented fashion.
arXiv Detail & Related papers (2023-09-05T11:29:25Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Tensor-based Multi-view Spectral Clustering via Shared Latent Space [14.470859959783995]
Multi-view Spectral Clustering (MvSC) attracts increasing attention due to diverse data sources.
A new method for MvSC is proposed via a shared latent space derived from the Restricted Kernel Machine framework.
arXiv Detail & Related papers (2022-07-23T17:30:54Z)
- coVariance Neural Networks [119.45320143101381]
Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning.
We propose a GNN architecture, called coVariance neural network (VNN), that operates on sample covariance matrices as graphs.
We show that VNN performance is indeed more stable than PCA-based statistical approaches.
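The covariance-as-graph idea behind VNNs can be sketched as a polynomial graph filter whose shift operator is the sample covariance matrix. The function name and filter taps below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def vnn_filter(X, taps):
    """Apply a polynomial graph filter h(C) = sum_t taps[t] * C^t to each
    sample, where the shift operator C is the sample covariance of X
    (illustrative sketch, not the authors' implementation).
    X: (n_samples, n_features). Returns filtered samples, same shape."""
    C = np.cov(X, rowvar=False)   # (n_features, n_features) sample covariance
    Z = np.zeros_like(X)
    P = X.copy()                  # P holds X with C applied t times to each row
    for h in taps:
        Z += h * P
        P = P @ C                 # next covariance power (C is symmetric)
    return Z
```

Because the eigenvectors of C are the PCA directions, such a filter smoothly reweights principal components without an explicit eigendecomposition, which is one intuition for the stability advantage over hard PCA truncation.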
arXiv Detail & Related papers (2022-05-31T15:04:43Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Deformation Robust Roto-Scale-Translation Equivariant CNNs [10.44236628142169]
Group-equivariant convolutional neural networks (G-CNNs) achieve significantly improved generalization performance with intrinsic symmetry.
General theory and practical implementation of G-CNNs have been studied for planar images under either rotation or scaling transformation.
arXiv Detail & Related papers (2021-11-22T03:58:24Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.