On Delta-Homology Analogy: Memory as Structured Trajectories
- URL: http://arxiv.org/abs/2303.04203v2
- Date: Fri, 15 Aug 2025 22:10:07 GMT
- Title: On Delta-Homology Analogy: Memory as Structured Trajectories
- Authors: Xin Li
- Abstract summary: We introduce the delta-homology analogy, which formalizes memory as a set of sparse, topologically irreducible attractors. Based on the analogy, we propose a topological framework for memory and inference grounded in the structure of spike-timing dynamics and persistent homology.
- Score: 5.234742752529437
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: We introduce the \emph{delta-homology analogy}, which formalizes memory as a set of sparse, topologically irreducible attractors. A \emph{Dirac delta-like memory trace} \( \delta_\gamma \) is identified with a nontrivial homology generator \( [\gamma] \in H_1(\mathcal{Z}) \) on a latent manifold of cognitive states. Such traces are sharply localized along reproducible topological cycles and are only activated when inference trajectories complete a full cycle. They encode minimal, path-dependent memory units that cannot be synthesized from local features alone. Based on the analogy, we propose a topological framework for memory and inference grounded in the structure of spike-timing dynamics and persistent homology. Starting from the observation that polychronous neural groups (PNGs) encode reproducible, time-locked spike sequences shaped by axonal delays and synaptic plasticity, we construct \emph{spatiotemporal complexes} whose temporally consistent transitions define chain complexes over which robust activation cycles emerge. These activation loops are abstracted into \emph{cell posets}, enabling a compact and causally ordered representation of neural activity with overlapping and compositional memory traces.
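As an illustration of the cycle-centric view (a toy sketch, not the paper's actual construction), the birth of independent 1-cycles in a weighted graph filtration can be computed with a union-find structure: edges enter in order of increasing weight, and any edge whose endpoints are already connected closes a new independent loop, the graph analogue of an \( H_1 \) generator appearing during a persistent-homology filtration. The function name and example graph below are illustrative.

```python
# Toy 1-dimensional persistence sketch: add edges in order of increasing
# weight; an edge whose endpoints are already connected closes an
# independent 1-cycle, i.e. a new H_1 class is born at that weight.

def find(parent, v):
    while parent[v] != v:
        parent[v] = parent[parent[v]]   # path compression
        v = parent[v]
    return v

def cycle_births(num_vertices, weighted_edges):
    """Return the filtration values at which independent 1-cycles appear."""
    parent = list(range(num_vertices))
    births = []
    for w, u, v in sorted(weighted_edges):
        ru, rv = find(parent, u), find(parent, v)
        if ru == rv:
            births.append(w)            # edge closes a cycle: H_1 birth
        else:
            parent[ru] = rv             # edge merges two components
    return births

# A triangle (0,1,2) plus a pendant vertex 3: exactly one independent cycle,
# born when the heaviest triangle edge (weight 0.9) enters the filtration.
edges = [(0.2, 0, 1), (0.5, 1, 2), (0.9, 0, 2), (0.3, 2, 3)]
print(cycle_births(4, edges))  # -> [0.9]
```

This captures only the "birth" half of a persistence computation; in a full persistent-homology pipeline the cycles would also be paired with deaths via 2-cells.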
Related papers
- Smooth embeddings in contracting recurrent networks driven by regular dynamics: A synthesis for neural representation [45.88028371034407]
Recent empirical work has documented topology-preserving latent organization in trained recurrent models. Recent theoretical results in reservoir computing establish conditions under which the synchronization map is an embedding. Our contribution is an integrated framework that assembles generalized synchronization and embedding guarantees for contracting reservoirs.
arXiv Detail & Related papers (2026-01-26T23:10:39Z)
- Circular Reasoning: Understanding Self-Reinforcing Loops in Large Reasoning Models [66.11277323593475]
Circular Reasoning is a self-reinforcing trap where generated content acts as a logical premise for its own recurrence. Mechanistically, we characterize circular reasoning as a state collapse exhibiting distinct boundaries. We reveal that reasoning impasses trigger the loop onset, which subsequently persists as an inescapable cycle driven by a self-reinforcing V-shaped attention mechanism.
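A crude proxy for such self-reinforcing loops (a toy sketch, not the paper's method) is to check whether the tail of a generated token sequence is a verbatim repetition of the span immediately before it. The function name and parameters below are illustrative.

```python
# Toy loop detector: flag a sequence whose tail repeats the immediately
# preceding span, a crude proxy for self-reinforcing generation cycles.

def trailing_loop_period(tokens, min_repeats=2):
    """Return the shortest period p such that the last p*min_repeats tokens
    are the same length-p block repeated, or None if there is no loop."""
    n = len(tokens)
    for p in range(1, n // min_repeats + 1):
        block = tokens[n - p:]
        if all(tokens[n - (k + 1) * p : n - k * p] == block
               for k in range(1, min_repeats)):
            return p
    return None

print(trailing_loop_period(list("abcxyxyxy")))  # -> 2 (trailing "xy" repeats)
print(trailing_loop_period(list("abcdef")))     # -> None
```

Real reasoning loops are rarely verbatim, so practical detectors would work on semantic states or attention patterns rather than surface tokens.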
arXiv Detail & Related papers (2026-01-09T10:23:55Z)
- Memory-Amortized Inference: A Topological Unification of Search, Closure, and Structure [6.0044467881527614]
We propose Memory-Amortized Inference (MAI), a formal framework that unifies learning and memory as phase transitions of a single geometric substrate. We show that cognition operates by converting high-complexity search into low-complexity lookup. This framework offers a rigorous explanation for the emergence of fast-thinking (intuition) from slow-thinking (reasoning).
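The search-to-lookup conversion has a familiar computational analogue (a loose illustration only, not MAI itself): memoization, where an expensive recursive search is performed once and thereafter replaced by constant-time cache lookup. The coin-change problem below is an illustrative stand-in.

```python
# Loose analogy for amortizing search into lookup: a recursive search
# (exponential without caching) becomes a cache lookup once computed.

from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def min_coins(amount, coins=(1, 3, 4)):
    """Fewest coins summing to `amount` -- a small search problem."""
    global calls
    calls += 1
    if amount == 0:
        return 0
    return 1 + min(min_coins(amount - c)
                   for c in coins if c <= amount)

first = min_coins(30)     # slow path: recursive search fills the cache
n_search = calls
again = min_coins(30)     # fast path: pure cache lookup, no new calls
assert again == first and calls == n_search
print(first, n_search)    # -> 8 31
```

The first call touches every subproblem once (the "slow thinking" pass); the second call never re-enters the search at all.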
arXiv Detail & Related papers (2025-11-28T16:28:24Z)
- Cycle is All You Need: More Is Different [6.0044467881527614]
We propose an information-topological framework in which cycle closure is the fundamental mechanism of memory and consciousness. We show that memory is not a static store but the ability to re-enter latent cycles in neural state space. We conclude that cycle is all you need: persistent invariants enable generalization in non-ergodic environments.
arXiv Detail & Related papers (2025-09-15T21:48:30Z)
- Memory as Structured Trajectories: Persistent Homology and Contextual Sheaves [5.234742752529437]
We introduce the delta-homology analogy, which formalizes memory as a set of sparse, topologically irreducible attractors. A Dirac delta-like memory trace is identified with a nontrivial homology generator on a latent manifold of cognitive states. We interpret these delta-homology generators as the low-entropy content variable, while the high-entropy context variable is represented dually as a filtration, cohomology class, or sheaf.
arXiv Detail & Related papers (2025-08-01T23:03:13Z)
- Cycle-Consistent Helmholtz Machine: Goal-Seeded Simulation via Inverted Inference [5.234742752529437]
We introduce the Cycle-Consistent Helmholtz Machine (C$^2$HM). C$^2$HM reframes inference as a goal-seeded, asymmetric process grounded in structured internal priors. By offering a biologically inspired alternative to classical amortized inference, C$^2$HM reconceives generative modeling as intentional simulation.
arXiv Detail & Related papers (2025-07-03T17:24:27Z)
- Why Neural Network Can Discover Symbolic Structures with Gradient-based Training: An Algebraic and Geometric Foundation for Neurosymbolic Reasoning [73.18052192964349]
We develop a theoretical framework that explains how discrete symbolic structures can emerge naturally from continuous neural network training dynamics. By lifting neural parameters to a measure space and modeling training as Wasserstein gradient flow, we show that under geometric constraints, the parameter measure $\mu_t$ undergoes two concurrent phenomena.
arXiv Detail & Related papers (2025-06-26T22:40:30Z)
- Generative System Dynamics in Recurrent Neural Networks [56.958984970518564]
We investigate the continuous-time dynamics of Recurrent Neural Networks (RNNs). We show that skew-symmetric weight matrices are fundamental to enable stable limit cycles in both linear and nonlinear configurations. Numerical simulations showcase how nonlinear activation functions not only maintain limit cycles, but also enhance the numerical stability of the system integration process.
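The linear case is easy to verify numerically (a minimal sketch under the assumption of a 2x2 skew-symmetric weight matrix, not the paper's experiments): for dx/dt = Wx with W skew-symmetric, the norm of the state is conserved, so the trajectory is a closed orbit.

```python
import math

# Minimal sketch: a linear continuous-time "RNN" dx/dt = W x with a 2x2
# skew-symmetric W (a rotation generator) conserves ||x||, so the
# trajectory is a closed orbit -- a stable cycle rather than decay or blowup.

OMEGA = 2.0  # angular frequency; W = [[0, -OMEGA], [OMEGA, 0]]

def f(x):
    return [-OMEGA * x[1], OMEGA * x[0]]   # W @ x for this skew-symmetric W

def rk4_step(x, dt):
    """One classical Runge-Kutta (RK4) step of dx/dt = f(x)."""
    k1 = f(x)
    k2 = f([x[i] + 0.5 * dt * k1[i] for i in range(2)])
    k3 = f([x[i] + 0.5 * dt * k2[i] for i in range(2)])
    k4 = f([x[i] + dt * k3[i] for i in range(2)])
    return [x[i] + dt / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(2)]

x, dt = [1.0, 0.0], 0.001
for _ in range(round(2 * math.pi / OMEGA / dt)):   # integrate one full period
    x = rk4_step(x, dt)

print(x)  # back near the starting point [1, 0]: the orbit closed
assert abs(math.hypot(*x) - 1.0) < 1e-6            # norm conserved
assert abs(x[0] - 1.0) < 5e-3 and abs(x[1]) < 5e-3
```

A symmetric component in W would instead make the norm grow or shrink exponentially, which is why skew-symmetry is the natural structural condition for sustained oscillation.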
arXiv Detail & Related papers (2025-04-16T10:39:43Z)
- Nonlinearity-driven Topology via Spontaneous Symmetry Breaking [79.16635054977068]
We consider a chain of parametrically-driven quantum resonators coupled only via weak nearest-neighbour cross-Kerr interaction. Topology is dictated by the structure of the Kerr nonlinearity, yielding a non-trivial bulk-boundary correspondence.
arXiv Detail & Related papers (2025-03-15T00:20:45Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds. We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Information Topology [6.0044467881527614]
We introduce Information Topology, a framework that unifies information theory and algebraic topology. The starting point is the dot-cycle dichotomy, which separates pointwise, order-sensitive fluctuations (dots) from order-invariant, predictive structure (cycles). We then define homological capacity, the topological dual of Shannon capacity, as the number of independent informational cycles supported by a system.
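For graphs, the count of independent cycles has a closed form worth making concrete (a toy calculation, not the paper's general definition): the rank of \( H_1 \) of a graph is \( b_1 = |E| - |V| + c \), where \( c \) is the number of connected components. The function name and example graph are illustrative.

```python
# Toy "homological capacity" of a graph: the number of independent 1-cycles
# is b1 = |E| - |V| + #components, i.e. the rank of H_1 of the graph.

def independent_cycles(num_vertices, edges):
    parent = list(range(num_vertices))

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path compression
            v = parent[v]
        return v

    components = num_vertices
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv                 # merging kills one component
            components -= 1
    return len(edges) - num_vertices + components

# Two triangles sharing vertex 2: 5 vertices, 6 edges, 1 component -> b1 = 2.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 2)]
print(independent_cycles(5, edges))  # -> 2
```

Higher-dimensional complexes need actual boundary-matrix rank computations, but the graph case already shows the "capacity" is a global count, not a local feature.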
arXiv Detail & Related papers (2022-10-07T23:54:30Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- Unification of Random Dynamical Decoupling and the Quantum Zeno Effect [68.8204255655161]
We show that the system dynamics under random dynamical decoupling converges to a unitary with a decoupling error that characteristically depends on the convergence speed of the Zeno limit.
This reveals a unification of the random dynamical decoupling and the quantum Zeno effect.
arXiv Detail & Related papers (2021-12-08T11:41:38Z)
- Turing approximations, toric isometric embeddings & manifold convolutions [0.0]
We define a convolution operator for a manifold of arbitrary topology and dimension.
A result of Alan Turing from 1938 underscores the need for such a toric isometric embedding approach to achieve a global definition of convolution.
arXiv Detail & Related papers (2021-10-05T18:36:16Z)
- Statistical Mechanics of Neural Processing of Object Manifolds [3.4809730725241605]
This thesis lays the groundwork for a computational theory of neuronal processing of objects.
We identify that the capacity of a manifold is determined by its effective radius, R_M, and effective dimension, D_M.
arXiv Detail & Related papers (2021-06-01T20:49:14Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Quadric hypersurface intersection for manifold learning in feature space [52.83976795260532]
We propose a manifold learning technique suitable for moderately high dimension and large datasets.
The technique is learned from the training data in the form of an intersection of quadric hypersurfaces.
At test time, this manifold can be used to introduce an outlier score for arbitrary new points.
arXiv Detail & Related papers (2021-02-11T18:52:08Z)
- Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method built on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to that of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
- Sample complexity and effective dimension for regression on manifolds [13.774258153124205]
We consider the theory of regression on a manifold using reproducing kernel Hilbert space methods.
We show that certain spaces of smooth functions on a manifold are effectively finite-dimensional, with a complexity that scales according to the manifold dimension.
arXiv Detail & Related papers (2020-06-13T14:09:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.