Product Interaction: An Algebraic Formalism for Deep Learning Architectures
- URL: http://arxiv.org/abs/2602.02573v1
- Date: Sat, 31 Jan 2026 07:14:01 GMT
- Title: Product Interaction: An Algebraic Formalism for Deep Learning Architectures
- Authors: Haonan Dong, Chun-Wun Cheng, Angelica I. Aviles-Rivero,
- Abstract summary: Product interactions are a formalism in which neural network layers are constructed from compositions of a multiplication operator defined over suitable algebras. Our central observation is that algebraic expressions in modern neural networks admit a unified construction in terms of linear, quadratic, and higher-order product interactions.
- Score: 1.1885785138453553
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we introduce product interactions, an algebraic formalism in which neural network layers are constructed from compositions of a multiplication operator defined over suitable algebras. Product interactions provide a principled way to generate and organize algebraic expressions by increasing interaction order. Our central observation is that algebraic expressions in modern neural networks admit a unified construction in terms of linear, quadratic, and higher-order product interactions. Convolutional and equivariant networks arise as symmetry-constrained linear product interactions, while attention and Mamba correspond to higher-order product interactions.
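To make the hierarchy concrete, here is a minimal numpy sketch (not code from the paper; the elementwise product is one illustrative choice of algebra, and all names are hypothetical) showing how a linear map, a quadratic gate, and a third-order attention-like expression arise from compositions of a single multiplication operator:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)

# Fix one multiplication operator: the Hadamard (elementwise) product
# on R^d, a simple commutative algebra. Illustrative choice only; the
# formalism allows general algebras.
def mul(a, b):
    return a * b

W1, W2, W3 = (rng.standard_normal((d, d)) for _ in range(3))

# Order 1: a linear product interaction -- just a linear map of x.
linear = W1 @ x

# Order 2: a quadratic product interaction -- one application of mul
# between two linear images of x (a gated / bilinear expression).
quadratic = mul(W1 @ x, W2 @ x)

# Order 3: a higher-order product interaction -- nested products, the
# algebraic shape the paper attributes to attention- and Mamba-style layers.
higher_order = mul(mul(W1 @ x, W2 @ x), W3 @ x)

print(linear.shape, quadratic.shape, higher_order.shape)
```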
Related papers
- Grokking Finite-Dimensional Algebra [5.471648649900293]
Grokking refers to the sudden transition from prolonged memorization to generalization observed during neural network training. We show that grokking emerges naturally as models must learn discrete representations of algebraic elements.
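For context, a minimal sketch of the modular-arithmetic setup in which grokking is commonly studied (the modulus, split ratio, and task here are standard assumptions, not details taken from this paper):

```python
import numpy as np

# Modular-addition task widely used in grokking studies:
# learn (a, b) -> (a + b) mod p from a subset of all pairs.
p = 97
pairs = np.array([(a, b) for a in range(p) for b in range(p)])
labels = (pairs[:, 0] + pairs[:, 1]) % p

rng = np.random.default_rng(0)
idx = rng.permutation(len(pairs))
split = int(0.5 * len(pairs))          # 50% train / 50% held out
train, test = idx[:split], idx[split:]

# A small network fit on (pairs[train], labels[train]) typically
# memorizes first; accuracy on (pairs[test], labels[test]) jumps much
# later -- the grokking transition the abstract refers to.
print(len(train), len(test))
```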
arXiv Detail & Related papers (2026-02-23T05:55:52Z)
- The metaplectic semigroup and its applications to time-frequency analysis and evolution operators [0.0]
We develop a systematic analysis of the metaplectic semigroup $\mathrm{Mp}_+(d,\mathbb{C})$ associated with positive complex symplectic matrices. We exploit these structural results to characterize, from a metaplectic perspective, classes of time-frequency representations satisfying prescribed structural properties.
arXiv Detail & Related papers (2026-01-29T19:21:40Z)
- Why Neural Network Can Discover Symbolic Structures with Gradient-based Training: An Algebraic and Geometric Foundation for Neurosymbolic Reasoning [73.18052192964349]
We develop a theoretical framework that explains how discrete symbolic structures can emerge naturally from continuous neural network training dynamics. By lifting neural parameters to a measure space and modeling training as Wasserstein gradient flow, we show that under geometric constraints, the parameter measure $\mu_t$ undergoes two concurrent phenomena.
arXiv Detail & Related papers (2025-06-26T22:40:30Z)
- Hadamard product in deep learning: Introduction, Advances and Challenges [68.26011575333268]
This survey examines a fundamental yet understudied primitive: the Hadamard product. Despite its widespread use across applications, it has not been systematically analyzed as a core architectural primitive. We present the first comprehensive taxonomy of its applications in deep learning, identifying four principal domains: higher-order correlation, multimodal data fusion, dynamic representation modulation, and efficient pairwise operations.
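Two of those four domains admit a one-line illustration; the following numpy sketch (hypothetical names, a generic gating/fusion pattern rather than any specific model from the survey) shows dynamic representation modulation and multimodal fusion via the Hadamard product:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16
h = rng.standard_normal(d)             # hidden state
v = rng.standard_normal(d)             # second-modality feature
Wg, Wf = rng.standard_normal((2, d, d))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Dynamic representation modulation: a learned gate rescales each
# coordinate of h via the Hadamard product (GLU/LSTM-style gating).
gated = sigmoid(Wg @ h) * h

# Multimodal data fusion: the elementwise product of two modality
# embeddings captures pairwise feature correlations at O(d) cost.
fused = (Wf @ h) * v

print(gated.shape, fused.shape)
```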
arXiv Detail & Related papers (2025-04-17T17:26:29Z)
- Algebras of Interaction and Cooperation [0.0]
Systems of cooperation and interaction are represented in vector spaces equipped with multiplicative structures, i.e., algebras.
Basic interpretations of the natural numbers yield natural algebras and offer a unifying view of cooperation and interaction.
arXiv Detail & Related papers (2024-04-18T08:01:43Z)
- Learning Hierarchical Relational Representations through Relational Convolutions [2.5322020135765464]
We introduce "relational convolutional networks", a neural architecture equipped with computational mechanisms that capture progressively more complex relational features.
A key component of this framework is a novel operation that captures relational patterns in groups of objects by convolving relational tensors with graphlet filters.
We present the motivation and details of the architecture, together with a set of experiments to demonstrate how relational convolutional networks can provide an effective framework for modeling relational tasks that have hierarchical structure.
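As a rough illustration of the idea (a simplified sketch, not the paper's exact operation; the inner-product relation and random filter are assumptions), matching a graphlet filter against the relational pattern of every group of objects looks like this:

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)
n, d, k = 6, 4, 3                      # objects, feature dim, group size

X = rng.standard_normal((n, d))        # object embeddings
# Pairwise relation tensor: one relational feature per object pair
# (a single inner product here; the paper uses richer relations).
R = X @ X.T                            # shape (n, n)

# A "graphlet filter": a template of relations over a group of k
# objects. Convolving it over all k-subsets scores how well each
# group's relational pattern matches the template.
F = rng.standard_normal((k, k))
scores = {g: float(np.sum(R[np.ix_(g, g)] * F))
          for g in combinations(range(n), k)}

best = max(scores, key=scores.get)
print(best, scores[best])
```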
arXiv Detail & Related papers (2023-10-05T01:22:50Z)
- On Neural Architecture Inductive Biases for Relational Tasks [76.18938462270503]
We introduce a simple architecture based on similarity-distribution scores, which we name Compositional Relational Network (CoRelNet).
We find that simple architectural choices can outperform existing models in out-of-distribution generalization.
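A minimal sketch of the similarity-score idea (assuming inner-product similarity and a row-wise softmax; not the authors' exact pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.standard_normal((n, d))        # encoded objects

# Core idea (sketched): keep only the distribution of pairwise
# similarities between objects and discard the raw features.
S = X @ X.T                            # pairwise similarity matrix

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

A = softmax(S, axis=-1)                # row-wise similarity distributions
# A is the n x n relational summary a downstream head consumes,
# e.g. flattened into an MLP for the relational decision.
print(A.round(2))
```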
arXiv Detail & Related papers (2022-06-09T16:24:01Z)
- String-net construction of RCFT correlators [3.803664831016232]
We use string-net models to give a direct, purely two-dimensional approach to correlators of rational conformal field theories.
We derive idempotents for objects describing bulk and boundary fields in terms of idempotents in the cylinder category of the underlying modular fusion category.
We also derive an Eckmann-Hilton relation internal to a braided category, thereby demonstrating the utility of string nets for understanding algebra in braided tensor categories.
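For reference, the Eckmann-Hilton relation is the standard statement that two unital binary operations obeying the interchange law coincide and are commutative (textbook formulation, not quoted from the paper):

```latex
% Eckmann-Hilton: if \cdot and \circ share a unit e and satisfy the
% interchange law, they coincide and are commutative.
\[
  (a \cdot b) \circ (c \cdot d) = (a \circ c) \cdot (b \circ d)
  \;\Longrightarrow\;
  a \cdot b = a \circ b = b \cdot a .
\]
```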
arXiv Detail & Related papers (2021-12-23T16:57:26Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
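The homomorphism property itself is easy to state in code; the toy sketch below (not LeAR, all names hypothetical) interprets syntactic composition in a set-valued semantic algebra so that interpretation commutes with composition:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Phrase:
    text: str

# Syntactic algebra: compose two phrases into a larger phrase.
def syn_compose(x: Phrase, y: Phrase) -> Phrase:
    return Phrase(f"({x.text} and {y.text})")

# Semantic algebra: denotations are sets; composition is union.
LEXICON = {"cats": {"tom"}, "dogs": {"rex", "fido"}}

def denote(p: Phrase):
    if p.text in LEXICON:
        return LEXICON[p.text]
    left, right = p.text[1:-1].split(" and ", 1)
    return denote(Phrase(left)) | denote(Phrase(right))

x, y = Phrase("cats"), Phrase("dogs")
# Homomorphism: interpreting a composition equals composing the
# interpretations -- the structure LeAR learns end to end.
assert denote(syn_compose(x, y)) == denote(x) | denote(y)
print(denote(syn_compose(x, y)))
```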
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each one associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
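A minimal sketch of such a convolution (assuming a graph adjacency matrix as the shift operator; the filter taps are arbitrary), i.e. a polynomial in the shift operator applied to a signal:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
# Shift operator S from the algebraic signal model: here a symmetric
# graph adjacency matrix, one possible instantiation among many.
S = rng.integers(0, 2, size=(n, n))
S = np.triu(S, 1)
S = S + S.T
x = rng.standard_normal(n)             # signal on the graph

# An algebraic convolution is a polynomial in S applied to x:
# y = sum_k h_k S^k x. The stability results bound how y changes
# under small perturbations of S, whatever the choice of shift.
h = np.array([1.0, 0.5, 0.25])         # filter taps

def alg_conv(h, S, x):
    y = np.zeros_like(x)
    Sk = np.eye(len(x))
    for hk in h:
        y += hk * (Sk @ x)
        Sk = Sk @ S
    return y

print(alg_conv(h, S, x))
```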
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)
- DensE: An Enhanced Non-commutative Representation for Knowledge Graph Embedding with Adaptive Semantic Hierarchy [4.607120217372668]
We develop a novel knowledge graph embedding method, named DensE, to provide an improved modeling scheme for the complex composition patterns of relations.
Our method decomposes each relation into an SO(3) group-based rotation operator and a scaling operator in three-dimensional (3-D) Euclidean space.
Experimental results on multiple benchmark knowledge graphs show that DensE outperforms the current state-of-the-art models for missing link prediction.
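A sketch of the rotation-then-scale relation operator (Rodrigues' formula is standard; the specific axis, angle, and scale are made-up values for illustration):

```python
import numpy as np

def so3_rotation(axis, angle):
    # Rodrigues' formula: rotation by `angle` about the unit vector `axis`.
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

# DensE-style relation (sketched): a head-entity embedding in R^3 is
# rotated, then scaled, to predict the tail embedding.
head = np.array([1.0, 0.0, 0.0])
R = so3_rotation(axis=[0.0, 0.0, 1.0], angle=np.pi / 2)
scale = 2.0
tail_pred = scale * (R @ head)

# Because SO(3) rotations do not commute, composing two relations in
# different orders gives different operators -- the non-commutative
# composition patterns the paper targets.
print(np.round(tail_pred, 3))          # ~ [0. 2. 0.]
```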
arXiv Detail & Related papers (2020-08-11T06:45:50Z)