Cell Complex Neural Networks
- URL: http://arxiv.org/abs/2010.00743v4
- Date: Tue, 2 Mar 2021 03:50:54 GMT
- Title: Cell Complex Neural Networks
- Authors: Mustafa Hajij, Kyle Istvan, Ghada Zamzmi
- Abstract summary: We propose a general, unifying construction for performing neural network-type computations on cell complexes.
We show how our cell complex autoencoder construction can give, in the special case cell2vec, a generalization of node2vec.
- Score: 0.8121462458089141
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cell complexes are topological spaces constructed from simple blocks called
cells. They generalize graphs, simplicial complexes, and polyhedral complexes, which
form important domains for practical applications. They also provide a combinatorial
formalism that allows the inclusion of complicated relationships beyond those
expressible in restrictive structures such as graphs and meshes. In this paper, we
propose Cell Complex Neural Networks (CXNs), a general, combinatorial and unifying
construction for performing neural network-type computations on cell complexes. We
introduce an inter-cellular message passing scheme on cell complexes that takes the
topology of the underlying space into account and generalizes the message passing
scheme used on graphs. Finally, we introduce a unified cell complex encoder-decoder
framework that enables learning representations of the cells of a given complex in
Euclidean space. In particular, we show how our cell complex autoencoder construction
gives, in the special case cell2vec, a generalization of node2vec.
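To make the message passing scheme concrete, here is a minimal sketch, assuming a toy cell complex given by its boundary relation; the dictionary encoding, the mean aggregation, and the single ReLU update are illustrative assumptions, not the CXN construction itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cell complex: three vertices (0, 1, 2), three edges (3, 4, 5), one 2-cell (6).
# boundary[c] lists the cells of one dimension lower that bound cell c.
boundary = {3: [0, 1], 4: [1, 2], 5: [2, 0], 6: [3, 4, 5]}
n_cells, dim_feat = 7, 8

# Two cells are neighbors when one lies on the boundary of the other.
nbrs = {c: set() for c in range(n_cells)}
for c, bd in boundary.items():
    for b in bd:
        nbrs[c].add(b)
        nbrs[b].add(c)

H = rng.normal(size=(n_cells, dim_feat))              # initial cell features
W_self = 0.1 * rng.normal(size=(dim_feat, dim_feat))  # illustrative weights
W_nbr = 0.1 * rng.normal(size=(dim_feat, dim_feat))

def message_passing_step(H):
    """One inter-cellular step: every cell (of any dimension) mixes its own
    state with the mean of its boundary/coboundary neighbors."""
    H_new = np.empty_like(H)
    for c in range(n_cells):
        agg = np.mean([H[b] for b in nbrs[c]], axis=0)
        H_new[c] = np.maximum(0.0, H[c] @ W_self + agg @ W_nbr)  # ReLU update
    return H_new

H = message_passing_step(H)
print(H.shape)  # (7, 8): updated features for vertices, edges, and the 2-cell
```

Note that, unlike message passing on a graph, messages here flow between cells of different dimensions, which is how the topology of the underlying space enters the computation.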
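In the same spirit, the cell2vec special case can be read as node2vec lifted from vertices to cells of all dimensions: random walks over the cell adjacency above, fed to a skip-gram model. The uniform walk and the gensim call below are assumptions standing in for the paper's exact training procedure; the snippet reuses `nbrs` and `n_cells` from the sketch above.

```python
import random
from gensim.models import Word2Vec  # any skip-gram implementation would do

def random_walk(start, length, rng):
    """Uniform random walk over the cell adjacency: cells of all dimensions
    are walkable tokens, which is what generalizes node2vec to cells."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(sorted(nbrs[walk[-1]])))
    return [str(c) for c in walk]

rng = random.Random(0)
walks = [random_walk(c, 10, rng) for c in range(n_cells) for _ in range(20)]
model = Word2Vec(walks, vector_size=16, window=3, min_count=1, sg=1, seed=0)
print(model.wv[str(6)].shape)  # (16,): learned embedding of the 2-cell
```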
Related papers
- Attending to Topological Spaces: The Cellular Transformer [37.84207797241944]
Topological Deep Learning seeks to enhance the predictive performance of neural network models by harnessing topological structures in input data.
We introduce the Cellular Transformer (CT), a novel architecture that generalizes graph-based transformers to cell complexes.
CT achieves state-of-the-art performance, and it does so without the need for more complex enhancements.
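One plausible reading of "generalizing graph-based transformers to cell complexes" is self-attention whose attention pattern is masked by cellular adjacency; the sketch below illustrates that idea only and is not the CT architecture itself (a random symmetric mask stands in for a real complex).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 7, 8
# adj_mask[i, j] is True when cells i and j share a (co)boundary relation;
# self-loops keep every softmax row well defined.
adj = rng.random((n, n)) < 0.4
adj_mask = adj | adj.T | np.eye(n, dtype=bool)

H = rng.normal(size=(n, d))
Wq, Wk, Wv = (0.1 * rng.normal(size=(d, d)) for _ in range(3))

def cellular_attention(H, adj_mask):
    """Single-head self-attention in which a cell may only attend to cells
    it is topologically adjacent to (including itself)."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(d)
    scores = np.where(adj_mask, scores, -np.inf)  # attend only along the complex
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V

print(cellular_attention(H, adj_mask).shape)  # (7, 8)
```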
arXiv Detail & Related papers (2024-05-23T01:48:32Z)
- Defining Neural Network Architecture through Polytope Structures of Dataset [53.512432492636236]
This paper defines upper and lower bounds for neural network widths, which are informed by the polytope structure of the dataset in question.
We develop an algorithm to investigate a converse situation where the polytope structure of a dataset can be inferred from its corresponding trained neural networks.
It is established that popular datasets such as MNIST, Fashion-MNIST, and CIFAR10 can be efficiently encapsulated using no more than two polytopes with a small number of faces.
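The "polytopes with a small number of faces" claim rests on the half-space representation of a polytope; a minimal membership test, with a hypothetical unit-square example, looks like this.

```python
import numpy as np

def inside_polytope(x, A, b):
    """A convex polytope is the set {x : A x <= b}; a class is 'encapsulated'
    by the polytope when all of its samples satisfy every face inequality."""
    return bool(np.all(A @ x <= b))

# Toy example: the unit square in R^2 written as four half-space constraints.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 1.0, 0.0])
print(inside_polytope(np.array([0.5, 0.5]), A, b))  # True
print(inside_polytope(np.array([1.5, 0.5]), A, b))  # False
```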
arXiv Detail & Related papers (2024-02-04T08:57:42Z)
- Discovering modular solutions that generalize compositionally [55.46688816816882]
We show that identification up to linear transformation purely from demonstrations is possible without having to learn an exponential number of module combinations.
We further demonstrate empirically that meta-learning from finite data can discover modular policies that generalize compositionally in a number of complex environments.
arXiv Detail & Related papers (2023-12-22T16:33:50Z)
- Combinatorial Complexes: Bridging the Gap Between Cell Complexes and Hypergraphs [18.793940779717627]
We argue that hypergraphs and cell complexes emphasize different types of relations, which may have different utility depending on the application context.
We discuss the relative advantages of these two choices and elaborate on the previously introduced concept of a combinatorial complex, which enables co-existing set-type and hierarchical relations.
arXiv Detail & Related papers (2023-12-15T03:04:28Z)
- Polyhedral Complex Extraction from ReLU Networks using Edge Subdivision [0.0]
A neural network built from piecewise affine building blocks, such as fully-connected layers and ReLU activations, is itself a piecewise affine function supported on a polyhedral complex.
This complex has been previously studied to characterize theoretical properties of neural networks.
We propose to subdivide the regions via intersections with hyperplanes induced by each neuron.
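The subdivision step builds on the fact that each first-layer ReLU neuron induces a hyperplane in input space, and the joint sign pattern of the neurons identifies a linear region. The sampling-based sketch below only illustrates that region structure; the paper subdivides edges exactly rather than sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)  # 4 hidden neurons in R^2

def activation_pattern(x):
    """Each hidden neuron's locus w.x + b = 0 is a hyperplane; the sign
    pattern says which side of every hyperplane x lies on, i.e. which
    linear region of the ReLU network contains x."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Count the distinct linear regions hit by a grid of sample points.
grid = [np.array([a, b]) for a in np.linspace(-3, 3, 80)
                         for b in np.linspace(-3, 3, 80)]
print(len({activation_pattern(x) for x in grid}))  # at most 11 for 4 lines in R^2
```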
arXiv Detail & Related papers (2023-06-12T16:17:04Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- Geometry Interaction Knowledge Graph Embeddings [153.69745042757066]
We propose Geometry Interaction knowledge graph Embeddings (GIE), which learns spatial structures interactively between the Euclidean, hyperbolic and hyperspherical spaces.
Our proposed GIE can capture a richer set of relational information, model key inference patterns, and enable expressive semantic matching across entities.
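The three spaces GIE interacts between come with different geodesic distances; a minimal sketch of the three distance functions, under the standard Poincaré-ball and unit-sphere models, is below. The interaction mechanism that mixes the geometries is the paper's contribution and is not reproduced here.

```python
import numpy as np

def euclidean_dist(u, v):
    return np.linalg.norm(u - v)

def poincare_dist(u, v):
    """Geodesic distance in the Poincaré ball model (requires |u|, |v| < 1)."""
    num = np.linalg.norm(u - v) ** 2
    den = (1 - np.dot(u, u)) * (1 - np.dot(v, v))
    return np.arccosh(1 + 2 * num / den)

def spherical_dist(u, v):
    """Geodesic distance between unit vectors on the hypersphere."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

u, v = np.array([0.1, 0.2]), np.array([0.3, -0.1])
print(euclidean_dist(u, v), poincare_dist(u, v),
      spherical_dist(u / np.linalg.norm(u), v / np.linalg.norm(v)))
```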
arXiv Detail & Related papers (2022-06-24T08:33:43Z)
- Topological Deep Learning: Going Beyond Graph Data [26.325857542512047]
We present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of topological domain.
We develop a class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs.
arXiv Detail & Related papers (2022-06-01T16:21:28Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Signal Processing on Cell Complexes [7.0471949371778795]
We give an introduction to signal processing on (abstract) regular cell complexes.
We discuss how appropriate Hodge Laplacians for these cell complexes can be derived.
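For the simplest regular cell complex, a filled triangle, the Hodge Laplacians can be assembled directly from signed incidence (boundary) matrices via L_k = B_k^T B_k + B_{k+1} B_{k+1}^T; the orientation conventions below are one arbitrary choice, and the derivation for general cell complexes is what the paper develops.

```python
import numpy as np

# Signed incidence matrices for a filled triangle (3 vertices, 3 edges, 1 face).
# B1[v, e] = ±1 when vertex v is the head/tail of edge e; B2[e, f] = ±1 when
# edge e lies on the boundary of face f with matching/opposite orientation.
B1 = np.array([[-1,  0,  1],
               [ 1, -1,  0],
               [ 0,  1, -1]])
B2 = np.array([[1],
               [1],
               [1]])
assert not np.any(B1 @ B2)     # boundary of a boundary is zero

# Hodge Laplacian on k-cells: L_k = B_k^T B_k + B_{k+1} B_{k+1}^T.
L0 = B1 @ B1.T                 # graph Laplacian on vertices
L1 = B1.T @ B1 + B2 @ B2.T     # Laplacian on edges (down part + up part)
print(np.linalg.eigvalsh(L1))  # zero eigenvalues count 1-dim holes (none here)
```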
arXiv Detail & Related papers (2021-10-11T21:11:59Z)
- The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU-network with standard Gaussian weights and uniformly distributed biases can, with high probability, make two well-separated classes of data linearly separable.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
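A quick empirical illustration of the separation claim, under assumptions of my own choosing (concentric-ring classes, width 500, biases uniform on [-2, 2]): random ReLU features followed by a plain least-squares linear classifier separate classes that are not linearly separable in the input space. This is a demonstration in the spirit of the result, not the paper's proof or exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes that are not linearly separable in the input space:
# points near the origin vs. points on a surrounding ring.
n = 200
inner = rng.normal(scale=0.3, size=(n, 2))
angles = rng.uniform(0, 2 * np.pi, n)
outer = np.c_[np.cos(angles), np.sin(angles)] * 2 + rng.normal(scale=0.1, size=(n, 2))
X, y = np.vstack([inner, outer]), np.r_[np.zeros(n), np.ones(n)]

# Random two-layer ReLU features: Gaussian weights, uniform biases (untrained).
width = 500
W = rng.normal(size=(2, width))
b = rng.uniform(-2, 2, width)
features = np.maximum(0.0, X @ W + b)

# A linear classifier (least squares + sign) on the random features
# achieves perfect training accuracy, i.e. the classes became separable.
Phi = np.c_[features, np.ones(2 * n)]
w, *_ = np.linalg.lstsq(Phi, 2 * y - 1, rcond=None)
preds = (Phi @ w) > 0
print("training accuracy:", (preds == y.astype(bool)).mean())
```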
arXiv Detail & Related papers (2021-07-31T10:25:26Z)