Spectral Convergence of Complexon Shift Operators
- URL: http://arxiv.org/abs/2309.07169v4
- Date: Sun, 5 May 2024 10:52:26 GMT
- Title: Spectral Convergence of Complexon Shift Operators
- Authors: Purui Zhang, Xingchao Jian, Feng Ji, Wee Peng Tay, Bihan Wen
- Abstract summary: We study the transferability of Topological Signal Processing via a generalized higher-order analog of the graphon, known as the complexon.
Inspired by the graphon shift operator and message-passing neural networks, we construct a marginal complexon and a complexon shift operator (CSO).
We prove that when a simplicial complex signal sequence converges to a complexon signal, the eigenvalues, eigenspaces, and Fourier transform of the corresponding CSOs converge to those of the limit complexon signal.
- Score: 38.89310649097387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Topological Signal Processing (TSP) utilizes simplicial complexes to model structures of higher order than vertices and edges. In this paper, we study the transferability of TSP via a generalized higher-order analog of the graphon, known as the complexon. We recall the notion of a complexon as the limit of a simplicial complex sequence [1]. Inspired by the graphon shift operator and message-passing neural networks, we construct a marginal complexon and a complexon shift operator (CSO) from the complexon's components of all possible dimensions. We investigate the CSO's eigenvalues and eigenvectors and relate them to a new family of weighted adjacency matrices. We prove that when a simplicial complex signal sequence converges to a complexon signal, the eigenvalues, eigenspaces, and Fourier transform of the corresponding CSOs converge to those of the limit complexon signal. This conclusion is further verified by two numerical experiments. These results hint at learning transferability on large simplicial complexes or simplicial complex sequences, generalizing the graphon signal processing framework.
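To make the spectral-convergence statement concrete, the following is a minimal numerical sketch, not the authors' code, restricted to the simplest graphon-like case rather than a full complexon. It discretizes the separable kernel W(x, y) = xy, whose induced integral operator has a single nonzero eigenvalue 1/3, and checks that the top eigenvalue of the sampled shift operator approaches that value as the number of sample points grows. The kernel choice, grid, and function names are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch (not the paper's implementation) of the kind of spectral
# convergence proved in the paper, restricted to the 1-dimensional,
# graphon-like component of a complexon.
# Assumption: we use the separable kernel W(x, y) = x * y, whose integral
# operator has a single nonzero eigenvalue 1/3; the shift operator is
# approximated by the n x n matrix W(x_i, x_j) / n on a uniform grid.

import numpy as np

def top_eigenvalue_of_sampled_kernel(n: int) -> float:
    """Largest eigenvalue of the discretized kernel operator on n grid points."""
    x = (np.arange(n) + 0.5) / n   # midpoints of a uniform partition of [0, 1]
    W = np.outer(x, x)             # W(x_i, x_j) = x_i * x_j
    shift = W / n                  # discrete surrogate of the integral operator
    return np.max(np.linalg.eigvalsh(shift))

if __name__ == "__main__":
    for n in (10, 100, 1000):
        lam = top_eigenvalue_of_sampled_kernel(n)
        print(f"n = {n:4d}: top eigenvalue = {lam:.6f} (limit value: {1/3:.6f})")
```

The paper's results concern the analogous statement for CSOs built from complexon components of every dimension, together with convergence of eigenspaces and the induced Fourier transform; this sketch only tracks the leading eigenvalue of the lowest-order component.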
Related papers
- Generalization of Modular Spread Complexity for Non-Hermitian Density Matrices [0.0]
In this work we generalize the concept of modular spread complexity to cases where the reduced density matrix is non-Hermitian.
We define a quantity, pseudo-capacity, which generalizes the capacity of entanglement and corresponds to the early modular-time measure of pseudo-modular complexity.
We present analytical calculations for 2-level systems and 4-qubit models, and then carry out numerical investigations of the quantum phase transition of the transverse-field Ising model.
arXiv Detail & Related papers (2024-10-07T17:59:16Z) - Inducing Systematicity in Transformers by Attending to Structurally Quantized Embeddings [60.698130703909804]
Transformers generalize to novel compositions of structures and entities after being trained on a complex dataset.
We propose SQ-Transformer that explicitly encourages systematicity in the embeddings and attention layers.
We show that SQ-Transformer achieves stronger compositional generalization than the vanilla Transformer on multiple low-complexity semantic parsing and machine translation datasets.
arXiv Detail & Related papers (2024-02-09T15:53:15Z) - How Do Transformers Learn In-Context Beyond Simple Functions? A Case Study on Learning with Representations [98.7450564309923]
This paper takes initial steps on understanding in-context learning (ICL) in more complex scenarios, by studying learning with representations.
We construct synthetic in-context learning problems with a compositional structure, where the label depends on the input through a possibly complex but fixed representation function.
We show theoretically the existence of transformers that approximately implement such algorithms with mild depth and size.
arXiv Detail & Related papers (2023-10-16T17:40:49Z) - SC-MAD: Mixtures of Higher-order Networks for Data Augmentation [36.33265644447091]
Simplicial complexes have inspired generalizations of graph neural networks (GNNs) to simplicial complex-based models.
We propose data augmentation of simplicial complexes through both linear and nonlinear mixup mechanisms.
We theoretically demonstrate that the resultant synthetic simplicial complexes interpolate among existing data with respect to homomorphism densities.
arXiv Detail & Related papers (2023-09-14T06:25:39Z) - Spectral Complexity-scaled Generalization Bound of Complex-valued Neural
Networks [78.64167379726163]
This paper is the first work to prove a generalization bound for complex-valued neural networks.
We conduct experiments by training complex-valued convolutional neural networks on different datasets.
arXiv Detail & Related papers (2021-12-07T03:25:25Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Signal Processing on Cell Complexes [7.0471949371778795]
We give an introduction to signal processing on (abstract) regular cell complexes.
We discuss how appropriate Hodge Laplacians for these cell complexes can be derived.
arXiv Detail & Related papers (2021-10-11T21:11:59Z) - Signal processing on simplicial complexes [19.035399031968502]
We focus on a closely related, but distinct, third perspective: how higher-order relationships can be used to process signals and data supported on higher-order network structures.
In particular, we survey how ideas from signal processing of data supported on regular domains, such as time series or images, can be extended to graphs and simplicial complexes.
arXiv Detail & Related papers (2021-06-14T14:56:51Z) - Bilinear Classes: A Structural Framework for Provable Generalization in
RL [119.42509700822484]
Bilinear Classes is a new structural framework which permits generalization in reinforcement learning.
The framework incorporates nearly all existing models in which a polynomial sample complexity is achievable.
Our main result provides an RL algorithm with polynomial sample complexity for Bilinear Classes.
arXiv Detail & Related papers (2021-03-19T16:34:20Z) - Signal Processing on Higher-Order Networks: Livin' on the Edge ... and
Beyond [20.422050836383725]
This tutorial paper presents a didactic treatment of the emerging topic of signal processing on higher-order networks.
We introduce the building blocks for processing data on simplicial complexes and hypergraphs.
arXiv Detail & Related papers (2021-01-14T09:08:26Z)