A Theoretical Analysis of Compositional Generalization in Neural Networks: A Necessary and Sufficient Condition
- URL: http://arxiv.org/abs/2505.02627v1
- Date: Mon, 05 May 2025 13:13:46 GMT
- Title: A Theoretical Analysis of Compositional Generalization in Neural Networks: A Necessary and Sufficient Condition
- Authors: Yuanpeng Li
- Abstract summary: This paper derives a necessary and sufficient condition for compositional generalization in neural networks. Conceptually, it requires that (i) the computational graph matches the true compositional structure, and (ii) components encode just enough information in training.
- Score: 3.09765163299025
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Compositional generalization is a crucial property in artificial intelligence, enabling models to handle novel combinations of known components. While most deep learning models lack this capability, certain models succeed in specific tasks, suggesting the existence of governing conditions. This paper derives a necessary and sufficient condition for compositional generalization in neural networks. Conceptually, it requires that (i) the computational graph matches the true compositional structure, and (ii) components encode just enough information in training. The condition is supported by mathematical proofs. This criterion combines aspects of architecture design, regularization, and training data properties. A carefully designed minimal example provides an intuitive illustration of the condition. We also discuss the condition's potential for assessing compositional generalization before training. This work is a fundamental theoretical study of compositional generalization in neural networks.
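To make the abstract's two-part condition easier to parse, here is a minimal symbolic sketch. It paraphrases the conceptual statement only; the symbols $g$, $h_i$, and the mutual-information reading of "just enough information" are assumptions introduced here, not the paper's formal theorem.

```latex
% Illustrative sketch (assumed notation, not the paper's formal statement).
% Suppose the true target is compositional:
%   y = g^*(h_1^*(x_1), \dots, h_k^*(x_k)).
%
% (i) Structural match: the learned model's computational graph factors the
%     same way, with each component seeing only its own input:
\[
  \hat{y} = g\bigl(h_1(x_1), \dots, h_k(x_k)\bigr).
\]
% (ii) "Just enough" information: in training, each learned code h_i(x_i)
%     captures what g needs about x_i and nothing more; one hedged reading is
\[
  I\bigl(h_i(x_i);\, x_i\bigr) = I\bigl(h_i^*(x_i);\, x_i\bigr)
  \quad \text{for all } i,
\]
% so that components neither discard task-relevant information nor absorb
% spurious detail that fails on novel combinations of known components.
```

Under this reading, part (i) is an architectural constraint and part (ii) couples regularization with training-data coverage, matching the abstract's claim that the criterion combines architecture design, regularization, and data properties.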
Related papers
- Propositional Logic for Probing Generalization in Neural Networks [3.6037930269014633]
We investigate the generalization behavior of three key neural architectures (Transformers, Graph Convolution Networks, and LSTMs) in a controlled task rooted in propositional logic. We find that Transformers fail to apply negation compositionally unless structural biases are introduced. Our findings highlight persistent limitations in the ability of standard architectures to learn systematic representations of logical operators.
arXiv Detail & Related papers (2025-06-10T16:46:05Z) - When does compositional structure yield compositional generalization? A kernel theory [0.0]
We present a theory of compositional generalization in kernel models with fixed, compositionally structured representations. We identify novel failure modes in compositional generalization that arise from biases in the training data. This work examines how statistical structure in the training data can affect compositional generalization.
arXiv Detail & Related papers (2024-05-26T00:50:11Z) - What makes Models Compositional? A Theoretical View: With Supplement [60.284698521569936]
We propose a general neuro-symbolic definition of compositional functions and their compositional complexity.
We show how various existing general and special purpose sequence processing models fit this definition and use it to analyze their compositional complexity.
arXiv Detail & Related papers (2024-05-02T20:10:27Z) - A Recursive Bateson-Inspired Model for the Generation of Semantic Formal Concepts from Spatial Sensory Data [77.34726150561087]
This paper presents a new symbolic-only method for the generation of hierarchical concept structures from complex sensory data.
The approach is based on Bateson's notion of difference as the key to the genesis of an idea or a concept.
The model is able to produce fairly rich yet human-readable conceptual representations without training.
arXiv Detail & Related papers (2023-07-16T15:59:13Z) - Compositional Generalization Requires Compositional Parsers [69.77216620997305]
We compare sequence-to-sequence models and models guided by compositional principles on the recent COGS corpus.
We show structural generalization is a key measure of compositional generalization and requires models that are aware of complex structure.
arXiv Detail & Related papers (2022-02-24T07:36:35Z) - Compositional Processing Emerges in Neural Networks Solving Math Problems [100.80518350845668]
Recent progress in artificial neural networks has shown that when large models are trained on enough linguistic data, grammatical structure emerges in their representations.
We extend this work to the domain of mathematical reasoning, where it is possible to formulate precise hypotheses about how meanings should be composed.
Our work shows that neural networks are not only able to infer something about the structured relationships implicit in their training data, but can also deploy this knowledge to guide the composition of individual meanings into composite wholes.
arXiv Detail & Related papers (2021-05-19T07:24:42Z) - Concepts, Properties and an Approach for Compositional Generalization [2.0559497209595823]
This report connects a series of our works on compositional generalization and summarizes an approach.
The approach uses architecture design and regularization to regulate the information carried by representations (a minimal sketch of this recipe appears after this list).
We hope this work helps clarify the fundamentals of compositional generalization and contributes to advancing artificial intelligence.
arXiv Detail & Related papers (2021-02-08T14:22:30Z) - Compositional Generalization by Learning Analytical Expressions [87.15737632096378]
A memory-augmented neural model is connected with analytical expressions to achieve compositional generalization.
Experiments on the well-known SCAN benchmark demonstrate that our model achieves strong compositional generalization.
arXiv Detail & Related papers (2020-06-18T15:50:57Z) - A Study of Compositional Generalization in Neural Models [22.66002315559978]
We introduce ConceptWorld, which enables the generation of images from compositional and relational concepts.
We perform experiments to test the ability of standard neural networks to generalize on relations with compositional arguments.
For simple problems, all models generalize well to close concepts but struggle with longer compositional chains.
arXiv Detail & Related papers (2020-06-16T18:29:58Z) - CSNE: Conditional Signed Network Embedding [77.54225346953069]
Signed networks encode positive and negative relations between entities such as friend/foe or trust/distrust.
Existing embedding methods for sign prediction generally enforce different notions of status or balance theories in their optimization function.
We introduce conditional signed network embedding (CSNE).
Our probabilistic approach models structural information about the signs in the network separately from fine-grained detail.
arXiv Detail & Related papers (2020-05-19T19:14:52Z)
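Several of the entries above, in particular the "Concepts, Properties and an Approach" report, describe the same recipe as the main paper: an architecture whose computational graph mirrors the task's compositional structure, plus regularization that limits what each component representation encodes. The sketch below is a hedged PyTorch illustration of that recipe; the class names, the two-component structure, and the L1 penalty as a proxy for "just enough information" are assumptions introduced here, not any cited paper's actual method.

```python
# Minimal sketch (PyTorch) of "architecture design + regularization to regulate
# representation information". All names and the specific penalty are
# illustrative assumptions, not the cited papers' actual method.
import torch
import torch.nn as nn


class ComponentEncoder(nn.Module):
    """Encodes one input component independently (structural match, part (i))."""

    def __init__(self, in_dim: int, rep_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, rep_dim)
        )

    def forward(self, x):
        return self.net(x)


class CompositionalModel(nn.Module):
    """Computational graph mirrors an assumed true structure y = g(h1(x1), h2(x2))."""

    def __init__(self, in_dim: int, rep_dim: int, out_dim: int):
        super().__init__()
        self.h1 = ComponentEncoder(in_dim, rep_dim)
        self.h2 = ComponentEncoder(in_dim, rep_dim)
        # The composer g sees only the two component codes, never raw inputs.
        self.g = nn.Linear(2 * rep_dim, out_dim)

    def forward(self, x1, x2):
        z1, z2 = self.h1(x1), self.h2(x2)
        return self.g(torch.cat([z1, z2], dim=-1)), (z1, z2)


def loss_fn(model, x1, x2, y, beta: float = 1e-3):
    # Task loss plus an L1 bottleneck on the codes: a crude proxy for
    # "encode just enough information" (part (ii)).
    pred, (z1, z2) = model(x1, x2)
    task = nn.functional.mse_loss(pred, y)
    info_penalty = z1.abs().mean() + z2.abs().mean()
    return task + beta * info_penalty
```

The design choice to split the inputs between independent encoders enforces the structural-match condition by construction; whether the L1 penalty actually yields "just enough" information is exactly the kind of question the main paper's criterion is meant to settle before training.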
This list is automatically generated from the titles and abstracts of the papers on this site.