Structure Transfer: an Inference-Based Calculus for the Transformation of Representations
- URL: http://arxiv.org/abs/2509.03249v2
- Date: Thu, 04 Sep 2025 08:55:32 GMT
- Title: Structure Transfer: an Inference-Based Calculus for the Transformation of Representations
- Authors: Daniel Raggi, Gem Stapleton, Mateja Jamnik, Aaron Stockdill, Grecia Garcia Garcia, Peter C-H. Cheng
- Abstract summary: A major unsolved problem is how to devise representational-system techniques that drive representation transformation and choice. We present a novel calculus, called structure transfer, that enables representation transformation across diverse RSs.
- Score: 20.889971883203113
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Representation choice is of fundamental importance to our ability to communicate and reason effectively. A major unsolved problem, addressed in this paper, is how to devise representational-system (RS) agnostic techniques that drive representation transformation and choice. We present a novel calculus, called structure transfer, that enables representation transformation across diverse RSs. Specifically, given a source representation drawn from a source RS, the rules of structure transfer allow us to generate a target representation for a target RS. The generality of structure transfer comes in part from its ability to ensure that the source representation and the generated target representation satisfy any specified relation (such as semantic equivalence). This is done by exploiting schemas, which encode knowledge about RSs. Specifically, schemas can express preservation of information across relations between any pair of RSs, and this knowledge is used by structure transfer to derive a structure for the target representation which ensures that the desired relation holds. We formalise this using Representational Systems Theory, building on the key concept of a construction space. The abstract nature of construction spaces grants them the generality to model RSs of diverse kinds, including formal languages, geometric figures and diagrams, as well as informal notations. Consequently, structure transfer is a system-agnostic calculus that can be used to identify alternative representations in a wide range of practical settings.
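The abstract describes structure transfer operating on constructions: given a source representation and schemas encoding how information is preserved between two RSs, the calculus derives a target structure satisfying a desired relation. As a minimal illustrative sketch (not the paper's formalism; the `Term` type, the `SCHEMA` table, and the example RSs are all hypothetical), one can picture a schema as a constructor-to-constructor map applied recursively over a source construction:

```python
from dataclasses import dataclass

# A term in a representational system (RS): a constructor applied to sub-terms.
@dataclass(frozen=True)
class Term:
    constructor: str
    children: tuple = ()

# A hypothetical schema: maps each source-RS constructor to the target-RS
# constructor that preserves the desired relation (here, semantic equivalence
# between symbolic arithmetic and a dot-diagram notation).
SCHEMA = {
    "plus": "juxtapose",   # a + b  ~  two dot groups placed side by side
    "times": "grid",       # a * b  ~  dot groups arranged in a grid
    "num": "dots",         # a numeral  ~  a group of that many dots
}

def transfer(term: Term, schema: dict) -> Term:
    """Recursively rewrite a source term into a target term, constructor by
    constructor, as directed by the schema."""
    target_children = tuple(
        transfer(c, schema) if isinstance(c, Term) else c
        for c in term.children
    )
    return Term(schema[term.constructor], target_children)

# "2 + 3 * 4" in the source (symbolic-arithmetic) RS:
source = Term("plus", (
    Term("num", ("2",)),
    Term("times", (Term("num", ("3",)), Term("num", ("4",)))),
))
target = transfer(source, SCHEMA)
print(target.constructor)  # juxtapose
```

The actual calculus works over construction spaces rather than plain term trees, and schemas carry preservation conditions rather than bare constructor maps; this sketch only conveys the direction of the transformation.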
Related papers
- The Representational Geometry of Number [1.5994376682356057]
We show that number representations preserve a stable relational structure across tasks. We find that task-specific representations are embedded in distinct subspaces, with low-level features like magnitude encoded along separable linear directions. This suggests that understanding arises when task-specific transformations are applied to a shared underlying relational structure of conceptual representations.
arXiv Detail & Related papers (2026-02-06T16:35:22Z) - Why Neural Network Can Discover Symbolic Structures with Gradient-based Training: An Algebraic and Geometric Foundation for Neurosymbolic Reasoning [73.18052192964349]
We develop a theoretical framework that explains how discrete symbolic structures can emerge naturally from continuous neural network training dynamics. By lifting neural parameters to a measure space and modeling training as Wasserstein gradient flow, we show that under geometric constraints, the parameter measure $\mu_t$ undergoes two concurrent phenomena.
arXiv Detail & Related papers (2025-06-26T22:40:30Z) - Sparsification and Reconstruction from the Perspective of Representation Geometry [10.834177456685538]
Sparse Autoencoders (SAEs) have emerged as a predominant tool in mechanistic interpretability. This study explains the principles of sparsity from the perspective of representational geometry. It specifically emphasizes the necessity of understanding representations and incorporating representational constraints.
arXiv Detail & Related papers (2025-05-28T15:54:33Z) - Directional Non-Commutative Monoidal Structures for Compositional Embeddings in Machine Learning [0.0]
We introduce a new structure for compositional embeddings built on directional non-commutative monoidal operators. Our construction defines a distinct composition operator $\circ_i$ for each axis $i$, ensuring associative combination along each axis without imposing global commutativity. All axis-specific operators commute with one another, enforcing a global interchange law that enables consistent cross-axis compositions.
arXiv Detail & Related papers (2025-05-21T13:27:14Z) - Structured Representation [2.4214136080186233]
We argue that invariant structures must be where knowledge resides, specifically, as partitions defined by the closure of relational paths within an abstract knowledge space. These partitions serve as the core invariant representations, forming the structural substrate where knowledge is stored and learning occurs. We formalize the computational foundations for structured representation of the invariant partitions based on the closed semiring, a relational algebraic structure.
arXiv Detail & Related papers (2025-05-17T21:26:05Z) - Symbolic Disentangled Representations for Images [83.88591755871734]
We propose ArSyD (Architecture for Symbolic Disentanglement), which represents each generative factor as a vector of the same dimension as the resulting representation. We study ArSyD on the dSprites and CLEVR datasets and provide a comprehensive analysis of the learned symbolic disentangled representations.
arXiv Detail & Related papers (2024-12-25T09:20:13Z) - Identifiable Exchangeable Mechanisms for Causal Structure and Representation Learning [54.69189620971405]
We provide a unified framework, termed Identifiable Exchangeable Mechanisms (IEM), for representation and structure learning. IEM provides new insights that let us relax the necessary conditions for causal structure identification in exchangeable non-i.i.d. data. We also demonstrate the existence of a duality condition in identifiable representation learning, leading to new identifiability results.
arXiv Detail & Related papers (2024-06-20T13:30:25Z) - Representational Systems Theory: A Unified Approach to Encoding, Analysing and Transforming Representations [3.1252164619375473]
Representational Systems Theory is designed to encode a wide variety of representations from three core perspectives.
It becomes possible to structurally transform representations in one system into representations in another.
arXiv Detail & Related papers (2022-06-07T10:43:27Z) - CoordGAN: Self-Supervised Dense Correspondences Emerge from GANs [129.51129173514502]
We introduce Coordinate GAN (CoordGAN), a structure-texture disentangled GAN that learns a dense correspondence map for each generated image.
We show that the proposed generator achieves better structure and texture disentanglement compared to existing approaches.
arXiv Detail & Related papers (2022-03-30T17:55:09Z) - Frame Averaging for Equivariant Shape Space Learning [85.42901997467754]
A natural way to incorporate symmetries in shape space learning is to ask that the mapping to the shape space (encoder) and mapping from the shape space (decoder) are equivariant to the relevant symmetries.
We present a framework for incorporating equivariance in encoders and decoders by introducing two contributions.
arXiv Detail & Related papers (2021-12-03T06:41:19Z) - Quiver Signal Processing (QSP) [145.6921439353007]
We state the basics for a signal processing framework on quiver representations.
We propose a signal processing framework that allows us to handle heterogeneous multidimensional information in networks.
arXiv Detail & Related papers (2020-10-22T08:40:15Z) - Unsupervised Distillation of Syntactic Information from Contextualized Word Representations [62.230491683411536]
We tackle the task of unsupervised disentanglement between semantics and structure in neural language representations.
To this end, we automatically generate groups of sentences which are structurally similar but semantically different.
We demonstrate that our transformation clusters vectors in space by structural properties, rather than by lexical semantics.
arXiv Detail & Related papers (2020-10-11T15:13:18Z) - From Spatial Relations to Spatial Configurations [64.21025426604274]
The spatial relation language is able to represent a large, comprehensive set of spatial concepts crucial for reasoning.
We show how we extend the capabilities of existing spatial representation languages with the fine-grained decomposition of semantics.
arXiv Detail & Related papers (2020-07-19T02:11:53Z) - Structured (De)composable Representations Trained with Neural Networks [21.198279941828112]
A template representation refers to the generic representation that captures the characteristics of an entire class.
The proposed technique uses end-to-end deep learning to learn structured and composable representations from input images and discrete labels.
We prove that the representations have a clear structure, allowing us to decompose a representation into factors that represent classes and environments.
arXiv Detail & Related papers (2020-07-07T10:20:31Z) - Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
arXiv Detail & Related papers (2020-06-26T17:15:16Z) - Vector symbolic architectures for context-free grammars [0.5862282909017474]
Vector symbolic architectures (VSA) are a viable approach for the hyperdimensional representation of symbolic data.
We present a rigorous framework for the representation of phrase structure trees and parse trees of context-free grammars (CFG) in Fock space.
Our approach could leverage the development of VSA for explainable artificial intelligence (XAI) by means of hyperdimensional deep neural computation.
arXiv Detail & Related papers (2020-03-11T09:07:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.