SC-MAD: Mixtures of Higher-order Networks for Data Augmentation
- URL: http://arxiv.org/abs/2309.07453v1
- Date: Thu, 14 Sep 2023 06:25:39 GMT
- Title: SC-MAD: Mixtures of Higher-order Networks for Data Augmentation
- Authors: Madeline Navarro, Santiago Segarra
- Abstract summary: The simplicial complex has inspired generalizations of graph neural networks (GNNs) to simplicial complex-based models.
We propose data augmentation of simplicial complexes through both linear and nonlinear mixup mechanisms.
We theoretically demonstrate that the resultant synthetic simplicial complexes interpolate among existing data with respect to homomorphism densities.
- Score: 36.33265644447091
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The myriad complex systems with multiway interactions motivate the extension
of graph-based pairwise connections to higher-order relations. In particular,
the simplicial complex has inspired generalizations of graph neural networks
(GNNs) to simplicial complex-based models. Learning on such systems requires
large amounts of data, which can be expensive or impossible to obtain. We
propose data augmentation of simplicial complexes through both linear and
nonlinear mixup mechanisms that return mixtures of existing labeled samples. In
addition to traditional pairwise mixup, we present a convex clustering mixup
approach for a data-driven relationship among several simplicial complexes. We
theoretically demonstrate that the resultant synthetic simplicial complexes
interpolate among existing data with respect to homomorphism densities. Our
method is demonstrated on both synthetic and real-world datasets for simplicial
complex classification.
Related papers
- Decomposing heterogeneous dynamical systems with graph neural networks [0.16492989697868887]
We show that graph neural networks can be designed to jointly learn the interaction rules and the structure of the heterogeneous system.
The learned latent structure and dynamics can be used to virtually decompose the complex system.
arXiv Detail & Related papers (2024-07-27T04:03:12Z)
- Spectral Convergence of Complexon Shift Operators [38.89310649097387]
We study the transferability of Topological Signal Processing via a generalized higher-order version of graphon, known as complexon.
Inspired by the graphon shift operator and message-passing neural network, we construct a marginal complexon and complexon shift operator.
We prove that when a simplicial complex signal sequence converges to a complexon signal, the eigenvalues, eigenspaces, and Fourier transform of the corresponding CSOs converge to that of the limit complexon signal.
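As a concrete finite example of the spectral objects involved (an illustration only, not the complexon limit itself): the edge-space Hodge Laplacian L1 = B1^T B1 + B2 B2^T is a standard shift operator for signals on the edges of a simplicial complex, and its eigenvalues and eigenspaces are the quantities whose convergence such results concern.

```python
import numpy as np

# Filled triangle on vertices {0, 1, 2}: edges (0,1), (0,2), (1,2), one 2-simplex.
B1 = np.array([[-1, -1,  0],   # node-to-edge incidence (boundary of edges)
               [ 1,  0, -1],
               [ 0,  1,  1]])
B2 = np.array([[ 1],           # edge-to-triangle incidence: the boundary of
               [-1],           # triangle (0,1,2) is e12 - e02 + e01
               [ 1]])

assert np.all(B1 @ B2 == 0)    # boundary of a boundary vanishes

L1 = B1.T @ B1 + B2 @ B2.T     # edge-space Hodge Laplacian (shift operator)
eigvals = np.linalg.eigvalsh(L1)
print(eigvals)                 # for the filled triangle, L1 = 3*I, so all 3s
```

For the filled triangle the Laplacian is exactly 3 times the identity, a small sanity check that the incidence matrices are consistently oriented.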
arXiv Detail & Related papers (2023-09-12T08:40:20Z)
- Generalized Simplicial Attention Neural Networks [22.171364354867723]
We introduce Generalized Simplicial Attention Neural Networks (GSANs)
GSANs process data living on simplicial complexes using masked self-attentional layers.
These schemes learn how to combine data associated with neighbor simplices of consecutive order in a task-oriented fashion.
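A generic masked self-attention layer of the kind described can be sketched as follows. This is a single-head illustration, not the GSAN architecture itself; the mask A is assumed to encode which simplices count as neighbors (e.g. upper or lower simplicial adjacency, with self-loops so every row has at least one neighbor).

```python
import numpy as np

def masked_self_attention(X, A, Wq, Wk, Wv):
    """Single-head self-attention restricted to masked neighbors.

    X: (n, d) signals on n simplices of one order.
    A: (n, n) binary neighborhood mask (include self-loops).
    Wq, Wk, Wv: (d, d) projection weights.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])       # scaled dot-product scores
    scores = np.where(A > 0, scores, -np.inf)    # mask out non-neighbors
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)            # row-wise masked softmax
    return w @ V                                 # attention-weighted combination

# Toy usage: a path of 4 simplices, with self-loops in the mask.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
A = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
Wq, Wk, Wv = (rng.normal(size=(3, 3)) for _ in range(3))
out = masked_self_attention(X, A, Wq, Wk, Wv)
```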
arXiv Detail & Related papers (2023-09-05T11:29:25Z)
- Interaction Measures, Partition Lattices and Kernel Tests for High-Order Interactions [1.9457612782595313]
Non-trivial dependencies between groups of more than two variables can play a significant role in the analysis and modelling of such systems.
We introduce a hierarchy of $d$-order ($d \geq 2$) interaction measures, increasingly inclusive of possible factorisations of the joint probability distribution.
We also establish mathematical links with lattice theory, which elucidate the derivation of the interaction measures and their composite permutation tests.
arXiv Detail & Related papers (2023-06-01T16:59:37Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
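One all-pair diffusion update of the kind described can be sketched as follows. This is illustrative only: DIFFormer derives its pairwise diffusion strengths in closed form from an energy constraint, whereas here the weights are a plain similarity softmax.

```python
import numpy as np

def diffusion_step(Z, tau=0.5):
    """One explicit step of all-pair diffusion over instance states.

    Z: (n, d) instance states. Each state moves toward a similarity-weighted
    average over all instances with step size tau in (0, 1].
    """
    sim = Z @ Z.T                             # pairwise similarity scores
    sim -= sim.max(axis=1, keepdims=True)     # stabilize the softmax
    S = np.exp(sim)
    S /= S.sum(axis=1, keepdims=True)         # row-stochastic diffusion weights
    return (1 - tau) * Z + tau * (S @ Z)      # blend each state with its average

# Toy usage: one step keeps every state inside the convex hull of the inputs.
rng = np.random.default_rng(1)
Z = rng.normal(size=(5, 2))
Z1 = diffusion_step(Z)
```

Because each updated row is a convex combination of the original rows, repeated steps contract the states toward a consensus, which is the diffusion behavior the encoder exploits.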
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Topological Deep Learning: Going Beyond Graph Data [26.325857542512047]
We present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of topological domain.
We develop a class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs.
arXiv Detail & Related papers (2022-06-01T16:21:28Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Cooperative Policy Learning with Pre-trained Heterogeneous Observation Representations [51.8796674904734]
We propose a new cooperative learning framework with pre-trained heterogeneous observation representations.
We employ an encoder-decoder based graph attention to learn the intricate interactions and heterogeneous representations.
arXiv Detail & Related papers (2020-12-24T04:52:29Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose "SLIM", an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
- Revealing the Invisible with Model and Data Shrinking for Composite-database Micro-expression Recognition [49.463864096615254]
We analyze the influence of learning complexity, including the input complexity and model complexity.
We propose a recurrent convolutional network (RCN) to explore the shallower-architecture and lower-resolution input data.
We develop three parameter-free modules to integrate with RCN without increasing any learnable parameters.
arXiv Detail & Related papers (2020-06-17T06:19:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.