CW-CNN & CW-AN: Convolutional Networks and Attention Networks for CW-Complexes
- URL: http://arxiv.org/abs/2408.16686v2
- Date: Thu, 5 Sep 2024 15:11:40 GMT
- Title: CW-CNN & CW-AN: Convolutional Networks and Attention Networks for CW-Complexes
- Authors: Rahul Khorana
- Abstract summary: We present a novel framework for learning on CW-complex structured data points.
We develop notions of convolution and attention that are well defined for CW-complexes.
We create the first Hodge informed neural network that can receive a CW-complex as input.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a novel framework for learning on CW-complex structured data points. Recent advances have discussed CW-complexes as ideal learning representations for problems in cheminformatics. However, there is a lack of available machine learning methods suitable for learning on CW-complexes. In this paper we develop notions of convolution and attention that are well defined for CW-complexes. These notions enable us to create the first Hodge informed neural network that can receive a CW-complex as input. We illustrate and interpret this framework in the context of supervised prediction.
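The paper's own layers are not reproduced in this listing, but the following minimal NumPy sketch (my construction, under the assumption that the CW-complex is encoded by its boundary matrices) illustrates the kind of Hodge-informed convolution the abstract describes: build the Hodge Laplacian on the k-cells, then filter features attached to those cells with a polynomial in that Laplacian.

```python
# Hedged sketch (not the authors' code): a Hodge-informed convolution on the
# k-cells of a small regular CW-complex encoded by boundary matrices B_k.
import numpy as np

rng = np.random.default_rng(0)

def hodge_laplacian(B_k, B_kp1):
    """L_k = B_k^T B_k + B_{k+1} B_{k+1}^T (down- plus up-Laplacian)."""
    down = B_k.T @ B_k if B_k is not None else 0.0
    up = B_kp1 @ B_kp1.T if B_kp1 is not None else 0.0
    return down + up

def cw_conv(X, L, weights):
    """Polynomial-in-the-Laplacian filter: sum_t L^t X W_t, followed by ReLU."""
    out, P = 0.0, np.eye(L.shape[0])
    for W in weights:
        out = out + P @ X @ W
        P = P @ L
    return np.maximum(out, 0.0)

# Toy complex: a filled triangle (three 0-cells, three 1-cells, one 2-cell).
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)    # 0-cell / 1-cell incidence
B2 = np.array([[1], [-1], [1]], dtype=float)  # 1-cell / 2-cell incidence
L1 = hodge_laplacian(B1, B2)                  # Hodge Laplacian on 1-cells

X = rng.normal(size=(3, 4))                   # a 4-dim feature on each 1-cell
Ws = [0.1 * rng.normal(size=(4, 8)) for _ in range(3)]
print(cw_conv(X, L1, Ws).shape)               # (3, 8): new 8-dim feature per 1-cell
```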
Related papers
- Families of Optimal Transport Kernels for Cell Complexes [0.0]
We derive an explicit expression for the Wasserstein distance between cell complex signal distributions in terms of a Hodge-Laplacian matrix.
This leads to a structurally meaningful measure to compare CW complexes and define the optimal transportation map.
In order to simultaneously include both feature and structure information, we extend the Fused Gromov-Wasserstein distance to CW complexes.
arXiv Detail & Related papers (2025-07-22T13:21:28Z)
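As a rough illustration of the kind of closed form described above (my own reconstruction, not the paper's formula): if signals on a complex are modelled as zero-mean Gaussians whose covariance is the pseudoinverse of a (Hodge) Laplacian, the 2-Wasserstein distance between two complexes reduces to a Bures-type trace expression.

```python
# Hedged sketch: 2-Wasserstein distance between N(0, L1^+) and N(0, L2^+),
# with L1, L2 Laplacians of two small complexes.  The Gaussian-signal model
# is an assumption on my part; the paper's exact construction may differ.
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(L_a, L_b):
    Sa, Sb = np.linalg.pinv(L_a), np.linalg.pinv(L_b)
    root = sqrtm(sqrtm(Sa) @ Sb @ sqrtm(Sa)).real
    return float(np.sqrt(max(np.trace(Sa + Sb - 2.0 * root), 0.0)))

def graph_laplacian(edges, n):
    B = np.zeros((n, len(edges)))
    for j, (u, v) in enumerate(edges):
        B[u, j], B[v, j] = -1.0, 1.0
    return B @ B.T

# Two small 1-skeletons: a path and a cycle on four 0-cells.
L_path  = graph_laplacian([(0, 1), (1, 2), (2, 3)], 4)
L_cycle = graph_laplacian([(0, 1), (1, 2), (2, 3), (3, 0)], 4)
print(gaussian_w2(L_path, L_cycle))
```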
- CVKAN: Complex-Valued Kolmogorov-Arnold Networks [6.095572174539791]
We show how to transfer a Kolmogorov-Arnold Network (KAN) into the complex domain.
Our proposed CVKAN is more stable and performs on par or better than real-valued KANs.
arXiv Detail & Related papers (2025-02-04T15:38:14Z)
- Convolutional Filtering with RKHS Algebras [110.06688302593349]
We develop a theory of convolutional signal processing and neural networks for Reproducing Kernel Hilbert Spaces (RKHS).
We show that any RKHS allows the formal definition of multiple algebraic convolutional models.
We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles.
arXiv Detail & Related papers (2024-11-02T18:53:44Z)
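A finite-dimensional caricature of the "algebraic convolutional model" mentioned above (my sketch; the paper works directly in the RKHS, which is not done here): a filter is a polynomial in a single shift operator, here a discretised kernel integral operator built from an RBF kernel on sampled points.

```python
# Hedged sketch of an algebraic convolutional filter: y = sum_k h_k S^k x,
# where S stands in for the RKHS shift operator.
import numpy as np

rng = np.random.default_rng(0)

def rbf_gram(pts, ls=0.5):
    d2 = (pts[:, None] - pts[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls ** 2)

def algebraic_filter(x, S, taps):
    """Apply the polynomial filter y = sum_k taps[k] * S^k x."""
    y, p = np.zeros_like(x), x.copy()
    for h in taps:
        y += h * p
        p = S @ p
    return y

pts = np.sort(rng.uniform(0, 1, size=64))
S = rbf_gram(pts) / len(pts)                 # empirical kernel integral operator
x = np.sin(8 * np.pi * pts) + 0.3 * rng.normal(size=64)
y = algebraic_filter(x, S, taps=[0.1, 1.0, -0.5])
print(y[:5])                                 # smoothed version of the noisy signal
```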
- Comprehensive Survey of Complex-Valued Neural Networks: Insights into Backpropagation and Activation Functions [0.0]
Despite the prevailing use of real-number implementations in current ANN frameworks, there is a growing interest in developing ANNs that utilize complex numbers.
This paper presents a survey of recent advancements in complex-valued neural networks (CVNNs).
We delve into the extension of the backpropagation algorithm to the complex domain, which enables the training of neural networks with complex-valued inputs, weights, activation functions (AFs), and outputs.
arXiv Detail & Related papers (2024-07-27T13:47:16Z)
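The complex-domain backpropagation the survey covers can be made concrete with a toy example (mine, not taken from the survey): for a single complex linear unit y = x^T w with loss |y - t|^2, the Wirtinger cogradient with respect to conj(w) is conj(x) * (y - t), and descending along it recovers the weights.

```python
# Hedged numerical sketch of complex-valued training via Wirtinger calculus.
import numpy as np

rng = np.random.default_rng(1)
n = 4
w_true = rng.normal(size=n) + 1j * rng.normal(size=n)
w = np.zeros(n, dtype=complex)
lr = 0.05

for step in range(500):
    x = rng.normal(size=n) + 1j * rng.normal(size=n)
    t = x @ w_true                        # noiseless target from the "true" weights
    y = x @ w
    grad_wbar = np.conj(x) * (y - t)      # Wirtinger cogradient of |y - t|^2
    w -= lr * grad_wbar

print(np.max(np.abs(w - w_true)))          # near zero after training
```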
- The Cooperative Network Architecture: Learning Structured Networks as Representation of Sensory Patterns [3.9848584845601014]
We introduce the Cooperative Network Architecture (CNA), a model that represents sensory signals using structured, recurrently connected networks of neurons, termed "nets".
We demonstrate that net fragments can be learned without supervision and flexibly recombined to encode novel patterns, enabling figure completion and resilience to noise.
arXiv Detail & Related papers (2024-07-08T06:22:10Z)
- Bayesian Neural Networks Avoid Encoding Complex and Perturbation-Sensitive Concepts [22.873523599349326]
In this paper, we focus on mean-field variational Bayesian Neural Networks (BNNs) and explore the representation capacity of such BNNs.
It has been observed and studied that a relatively small set of interactive concepts usually emerge in the knowledge representation of a sufficiently-trained neural network.
Our study proves that compared to standard deep neural networks (DNNs), it is less likely for BNNs to encode complex concepts.
arXiv Detail & Related papers (2023-02-25T14:56:35Z)
- On Rademacher Complexity-based Generalization Bounds for Deep Learning [18.601449856300984]
We show that the Rademacher complexity-based approach can generate non-vacuous generalisation bounds on Convolutional Neural Networks (CNNs).
Our results show that the Rademacher complexity does not depend on the network length for CNNs with some special types of activation functions such as ReLU, Leaky ReLU, Parametric Rectified Linear Unit, Sigmoid, and Tanh.
arXiv Detail & Related papers (2022-08-08T17:24:04Z)
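To make the central quantity above concrete (this toy estimate is mine, not from the paper): the empirical Rademacher complexity of a class F on a sample can be approximated by Monte Carlo over Rademacher signs, with the supremum over F crudely approximated by sampling finitely many members of a norm-bounded one-hidden-layer ReLU class.

```python
# Hedged Monte Carlo estimate of R_S(F) = E_sigma sup_f (1/m) sum_i sigma_i f(x_i).
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(d, width, bound):
    """A random one-hidden-layer ReLU net with weight norms scaled to `bound`."""
    W = rng.normal(size=(width, d)); W *= bound / np.linalg.norm(W)
    v = rng.normal(size=width);      v *= bound / np.linalg.norm(v)
    return lambda X: np.maximum(X @ W.T, 0.0) @ v

def empirical_rademacher(X, n_funcs=200, n_sigma=200):
    # The sup over F is replaced by a max over finitely many sampled nets,
    # so this is only a lower estimate of the true complexity.
    m = X.shape[0]
    vals = np.stack([random_relu_net(X.shape[1], 16, 1.0)(X) for _ in range(n_funcs)])
    sigmas = rng.choice([-1.0, 1.0], size=(n_sigma, m))
    return float(np.mean(np.max(vals @ sigmas.T, axis=0)) / m)

X = rng.normal(size=(50, 5))
print(empirical_rademacher(X))
```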
- Learning with Capsules: A Survey [73.31150426300198]
Capsule networks were proposed as an alternative approach to Convolutional Neural Networks (CNNs) for learning object-centric representations.
Unlike CNNs, capsule networks are designed to explicitly model part-whole hierarchical relationships.
arXiv Detail & Related papers (2022-06-06T15:05:36Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
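For context (my sketch, not the IGN method itself): the snippet below shows the role inducing points play in a standard sparse Gaussian process predictor (subset-of-regressors mean). IGN additionally learns the inducing points and the feature space jointly, which is not shown here.

```python
# Hedged sketch: sparse GP regression with a small set of inducing points Z.
import numpy as np

def rbf(A, B, ls=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def sor_predict(X, y, Z, Xstar, noise=0.1):
    """Subset-of-regressors predictive mean using inducing points Z."""
    Kuu = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
    Kuf = rbf(Z, X)
    Ksu = rbf(Xstar, Z)
    Sigma = np.linalg.inv(Kuu + Kuf @ Kuf.T / noise ** 2)
    return Ksu @ Sigma @ Kuf @ y / noise ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
Z = np.linspace(-3, 3, 10)[:, None]           # 10 inducing points
Xs = np.linspace(-3, 3, 5)[:, None]
print(sor_predict(X, y, Z, Xs))               # roughly sin(Xs)
```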
- Optimizing Tensor Network Contraction Using Reinforcement Learning [86.05566365115729]
We propose a Reinforcement Learning (RL) approach combined with Graph Neural Networks (GNN) to address the contraction ordering problem.
The problem is extremely challenging due to the huge search space, the heavy-tailed reward distribution, and the challenging credit assignment.
We show how a carefully implemented RL-agent that uses a GNN as the basic policy construct can address these challenges.
arXiv Detail & Related papers (2022-04-18T21:45:13Z)
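A minimal sketch of the cost model behind the contraction-ordering problem (mine; the paper's GNN-based RL policy is not reproduced): tensors are index sets with bond dimensions, a pairwise contraction costs the product of all dimensions involved, and a greedy baseline picks the cheapest pair at each step.

```python
# Hedged sketch of contraction-order cost accounting with a greedy baseline.
from functools import reduce
from itertools import combinations

# Toy network: each index label has a bond dimension and appears in exactly
# two tensors, so contracting a pair sums that shared index out.
dims = {'a': 8, 'b': 8, 'c': 4, 'd': 4, 'e': 16}
tensors = [{'a', 'b'}, {'b', 'c', 'e'}, {'c', 'd'}, {'d', 'e', 'a'}]

def cost(t1, t2):
    """FLOP-count proxy: product of every dimension touched by the contraction."""
    return reduce(lambda acc, i: acc * dims[i], t1 | t2, 1)

def greedy_order(ts):
    ts, total, order = list(ts), 0, []
    while len(ts) > 1:
        i, j = min(combinations(range(len(ts)), 2),
                   key=lambda p: cost(ts[p[0]], ts[p[1]]))
        total += cost(ts[i], ts[j])
        new = ts[i] ^ ts[j]                   # shared indices are summed out
        order.append((ts[i], ts[j]))
        ts = [t for k, t in enumerate(ts) if k not in (i, j)] + [new]
    return order, total

print(greedy_order(tensors))                  # contraction sequence and its total cost
```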
- Local and global topological complexity measures of ReLU neural network functions [0.0]
We apply a piecewise-linear (PL) version of Morse theory due to Grunert-Kühnel-Rote to define and study new local and global notions of topological complexity.
We show how to construct, for each such ReLU network function F, a canonical polytopal complex K(F) and a deformation retract of the domain onto K(F), yielding a convenient compact model for performing calculations.
arXiv Detail & Related papers (2022-04-12T19:49:13Z)
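A crude illustration of the object these measures are built on (my sketch; the paper's PL Morse-theoretic invariants are far more refined): the polyhedral decomposition of input space induced by a ReLU network, here probed by counting the distinct activation patterns a small random network realises on a grid.

```python
# Hedged sketch: count the linear regions of a tiny ReLU net hit by a 2-D grid.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)    # small random ReLU net on R^2
W2, b2 = rng.normal(size=(4, 8)), rng.normal(size=4)

xs = np.linspace(-3.0, 3.0, 300)
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)

h1 = grid @ W1.T + b1
h2 = np.maximum(h1, 0.0) @ W2.T + b2
patterns = np.concatenate([h1 > 0, h2 > 0], axis=1)     # joint activation pattern per point
print(len(np.unique(patterns, axis=0)))                 # number of linear regions sampled
```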
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as the newest direction in graph learning.
We present the Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Parallel Machine Learning for Forecasting the Dynamics of Complex Networks [0.0]
We present a machine learning scheme for forecasting the dynamics of large complex networks.
We use a parallel architecture that mimics the topology of the network of interest.
arXiv Detail & Related papers (2021-08-27T06:06:41Z)
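A toy stand-in for the parallel scheme above (mine; the paper uses parallel reservoir computers rather than ridge regression): one small predictor per node, trained only on that node's and its neighbours' previous states, so the layout of predictors mirrors the network topology.

```python
# Hedged sketch: per-node ridge predictors whose inputs follow the graph topology.
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 600
A = rng.random((N, N)) < 0.15
A = (np.triu(A, 1) | np.triu(A, 1).T).astype(float)     # random undirected graph
deg = np.maximum(A.sum(1), 1.0)

X = np.zeros((T, N)); X[0] = rng.normal(size=N)          # simulated coupled dynamics
for t in range(T - 1):
    X[t + 1] = np.tanh(0.5 * X[t] + 0.8 * (A @ X[t]) / deg) + 0.05 * rng.normal(size=N)

models = []
for i in range(N):                                       # one local model per node
    feats = np.r_[i, np.flatnonzero(A[i])]
    Phi, y = X[:-2][:, feats], X[1:-1, i]                # train on all but the last step
    w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(len(feats)), Phi.T @ y)
    models.append((feats, w))

pred = np.array([X[-2][f] @ w for f, w in models])       # forecast the final step
print(np.abs(pred - X[-1]).mean())                       # one-step error near the injected noise level
```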
- Towards Understanding Theoretical Advantages of Complex-Reaction Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.