Temporal Model On Quantum Logic
- URL: http://arxiv.org/abs/2502.07817v1
- Date: Sun, 09 Feb 2025 17:16:53 GMT
- Title: Temporal Model On Quantum Logic
- Authors: Francesco D'Agostino
- Abstract summary: The framework formalizes the evolution of propositions over time using linear and branching temporal models.
The hierarchical organization of memory is represented using directed acyclic graphs.
- Abstract: This paper introduces a unified theoretical framework for modeling temporal memory dynamics, combining concepts from temporal logic, memory decay models, and hierarchical contexts. The framework formalizes the evolution of propositions over time using linear and branching temporal models, incorporating exponential decay (Ebbinghaus forgetting curve) and reactivation mechanisms via Bayesian updating. The hierarchical organization of memory is represented using directed acyclic graphs to model recall dependencies and interference. Novel insights include feedback dynamics, recursive influences in memory chains, and the integration of entropy-based recall efficiency. This approach provides a foundation for understanding memory processes across cognitive and computational domains.
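The decay-and-reactivation loop described above is concrete enough to sketch. Below is a minimal Python illustration, assuming a simple exponential retention function and a two-hypothesis Bayes update; the paper's exact functional forms, parameters, and DAG machinery are not given in this summary, so every name and constant here is a placeholder:

```python
import numpy as np

def retention(t, s):
    """Ebbinghaus-style forgetting curve: retrievability of a proposition
    after t time units, with stability s (larger s = slower decay)."""
    return np.exp(-t / s)

def reactivate(prior, p_cue_given_active=0.9, p_cue_given_inactive=0.05):
    """Two-hypothesis Bayesian update: posterior probability that the
    memory trace is active after observing a recall cue."""
    evidence = (p_cue_given_active * prior
                + p_cue_given_inactive * (1.0 - prior))
    return p_cue_given_active * prior / evidence

# A proposition decays for 10 time units, then a cue reactivates it.
p = retention(t=10.0, s=4.0)   # ~0.082 after decay
p = reactivate(p)              # ~0.62 after the Bayesian update
print(round(float(p), 3))
```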
Related papers
- Exploring Synaptic Resonance in Large Language Models: A Novel Approach to Contextual Memory Integration [0.0]
A novel mechanism, Synaptic Resonance, is introduced to dynamically reinforce relevant memory pathways during training and inference.
Evaluations conducted on an open-source language model demonstrate reductions in perplexity, enhancements in contextual coherence, and increased robustness against input noise.
arXiv Detail & Related papers (2025-02-15T07:06:10Z)
- Test-time regression: a unifying framework for designing sequence models with associative memory [24.915262407519876]
We show that effective sequence models must be able to perform associative recall.
Our key insight is that memorizing input tokens through an associative memory is equivalent to performing regression at test-time.
We show that numerous recent architectures, including linear attention models, their gated variants, state-space models, online learners, and softmax attention, emerge naturally as specific approaches to test-time regression.
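The stated equivalence can be made concrete with a small sketch: store key-value pairs by fitting a linear map at query time, then read out with the query key. Ridge regression is assumed here as the test-time objective for illustration; the paper's framework covers a family of regressors:

```python
import numpy as np

def recall_via_regression(keys, values, query, ridge=1e-3):
    """Associative recall as test-time ridge regression: fit a linear map W
    minimizing ||keys @ W - values||^2 + ridge * ||W||^2, then read out."""
    d = keys.shape[1]
    W = np.linalg.solve(keys.T @ keys + ridge * np.eye(d), keys.T @ values)
    return query @ W

rng = np.random.default_rng(0)
K = rng.normal(size=(8, 16))   # 8 stored keys in 16 dimensions
V = rng.normal(size=(8, 4))    # their associated values
print(np.allclose(recall_via_regression(K, V, K[3]), V[3], atol=1e-2))  # True
```

Skipping the solve and taking W = keys.T @ values gives the Hebbian outer-product memory underlying unnormalized linear attention, one instance of the correspondence described above.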
arXiv Detail & Related papers (2025-01-21T18:32:31Z)
- Firing Rate Models as Associative Memory: Excitatory-Inhibitory Balance for Robust Retrieval [3.961279440272764]
Firing rate models are dynamical systems widely used in applied and theoretical neuroscience to describe local cortical dynamics in neuronal populations.
We propose a general framework that ensures the emergence of re-scaled memory patterns as stable equilibria in the firing rate dynamics.
We analyze the conditions under which the memories are locally and globally stable, providing insights into constructing biologically plausible and robust systems for associative memory retrieval.
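As a toy version of rescaled patterns emerging as stable equilibria, the sketch below stores one pattern with Hebbian weights and integrates the rate dynamics $\tau \dot{r} = -r + \tanh(Wr)$; the state settles on $c\,\xi$ with $c$ solving $c = \tanh(\beta c)$. The paper's excitatory-inhibitory balance conditions are not reproduced here; this is a standard single-pattern illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N, beta = 200, 2.0
xi = rng.choice([-1.0, 1.0], size=N)      # one stored pattern
W = beta / N * np.outer(xi, xi)           # Hebbian outer-product weights

r = 0.1 * xi + 0.05 * rng.normal(size=N)  # weak, noisy initialization
for _ in range(300):                      # Euler steps of tau dr/dt = -r + tanh(W r)
    r += 0.1 * (-r + np.tanh(W @ r))

print(round(float(r @ xi / N), 3))        # overlap -> root of c = tanh(beta c), ~0.957
```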
arXiv Detail & Related papers (2024-11-11T21:40:57Z)
- Input-Driven Dynamics for Robust Memory Retrieval in Hopfield Networks [3.961279440272764]
The Hopfield model provides a mathematically idealized yet insightful framework for understanding the mechanisms of memory storage and retrieval in the human brain.
We propose a novel system framework in which the external input directly influences the neural synapses and shapes the energy landscape of the Hopfield model.
This plasticity-based mechanism provides a clear energetic interpretation of the memory retrieval process and proves effective at correctly classifying highly mixed inputs.
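One way to picture input shaping the energy landscape through the synapses, in a plain discrete Hopfield network: add a Hebbian term built from the input cue, which deepens the cued basin before retrieval. The coupling strength and the exact input-to-synapse mapping below are illustrative assumptions, not the paper's mechanism:

```python
import numpy as np

def retrieve(W, s, steps=20):
    """Synchronous sign updates descending the Hopfield energy
    E(s) = -1/2 s^T W s."""
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

rng = np.random.default_rng(2)
N = 100
patterns = rng.choice([-1.0, 1.0], size=(3, N))
W = patterns.T @ patterns / N              # Hebbian storage of 3 patterns

cue = patterns[0]                          # external input
W_in = W + 0.5 * np.outer(cue, cue) / N    # input term deepens the cued basin

flip = rng.random(N) < 0.3                 # corrupt 30% of the cue's bits
probe = np.where(flip, -patterns[0], patterns[0])
print((retrieve(W_in, probe) == patterns[0]).mean())  # ~1.0: pattern recovered
```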
arXiv Detail & Related papers (2024-11-06T17:24:25Z)
- Discrete, compositional, and symbolic representations through attractor dynamics [51.20712945239422]
We introduce a novel neural systems model that integrates attractor dynamics with symbolic representations to model cognitive processes akin to the probabilistic language of thought (PLoT).
Our model segments the continuous representational space into discrete basins, with attractor states corresponding to symbolic sequences that reflect the semanticity and compositionality characteristic of symbolic systems, learned through unsupervised learning rather than pre-defined primitives.
This approach establishes a unified framework that integrates symbolic and sub-symbolic processing through neural dynamics, a neuroplausible substrate with proven expressivity in AI, offering a more comprehensive model that mirrors the complex duality of cognitive operations.
arXiv Detail & Related papers (2023-10-03T05:40:56Z)
- Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
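The summary does not spell out the update equations, so the sketch below takes one plausible reading: a standard GRU cell plus a weighted copy of the hidden state from $\tau$ steps back. Parameter names, the additive placement of the delay term, and alpha are all assumptions for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tau_gru_step(x, h_prev, h_delayed, p, alpha=0.3):
    """Standard GRU update plus a weighted copy of the state from tau
    steps back (alpha * h_delayed). Names and placement are illustrative."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev + p["bz"])      # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])      # reset gate
    h_cand = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    h = (1.0 - z) * h_prev + z * h_cand
    return h + alpha * h_delayed                               # delayed feedback

rng = np.random.default_rng(3)
d_in, d_h, tau = 4, 8, 5
p = {k: 0.1 * rng.normal(size=(d_h, d_in if k[0] == "W" else d_h))
     for k in ["Wz", "Uz", "Wr", "Ur", "Wh", "Uh"]}
p.update({b: np.zeros(d_h) for b in ["bz", "br", "bh"]})

hist = [np.zeros(d_h)] * (tau + 1)   # buffer holding the last tau+1 states
for _ in range(20):
    x = rng.normal(size=d_in)
    hist.append(tau_gru_step(x, hist[-1], hist[-(tau + 1)], p))
print(np.round(hist[-1][:4], 3))
```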
arXiv Detail & Related papers (2022-12-01T02:26:34Z)
- On the Relationship Between Variational Inference and Auto-Associative Memory [68.8204255655161]
We study how different neural network approaches to variational inference can be applied in this framework.
We evaluate the obtained algorithms on the CIFAR10 and CLEVR image datasets and compare them with other associative memory models.
arXiv Detail & Related papers (2022-10-14T14:18:47Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
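The coupling itself is easy to picture: in phase-amplitude CFC, the amplitude of the fast gamma rhythm is locked to the phase of the slow theta rhythm. The toy trace below illustrates that structure only; it does not reproduce the paper's plastic-synapse capacity result, and the frequencies are merely typical values:

```python
import numpy as np

fs = 1000.0                                    # sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
f_theta, f_gamma = 6.0, 40.0                   # typical theta and gamma bands
theta_phase = 2.0 * np.pi * f_theta * t
gamma_env = 0.5 * (1.0 + np.cos(theta_phase))  # gamma amplitude locked to theta phase
trace = np.cos(theta_phase) + gamma_env * np.cos(2.0 * np.pi * f_gamma * t)
print(trace.shape)                             # (2000,) theta-gamma coupled signal
```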
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Towards a Predictive Processing Implementation of the Common Model of Cognition [79.63867412771461]
We describe an implementation of the common model of cognition grounded in neural generative coding and holographic associative memory.
The proposed system creates the groundwork for developing agents that learn continually from diverse tasks as well as model human performance at larger scales.
arXiv Detail & Related papers (2021-05-15T22:55:23Z)
- On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis [30.75240284934018]
We consider the simple but representative setting of using continuous-time linear RNNs to learn from data generated by linear relationships.
We prove a universal approximation theorem of such linear functionals, and characterize the approximation rate and its relation with memory.
A unifying theme uncovered is the non-trivial effect of memory, a notion that can be made precise in our framework.
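The setting admits a compact illustration: a continuous-time linear RNN $\dot{h} = Ah + Bx(t)$, $y = c^\top h$ realizes the linear functional $y(t) = \int_0^\infty \rho(s)\, x(t-s)\, ds$ with memory kernel $\rho(s) = c^\top e^{As} B$, a sum of decaying exponentials when $A$ is stable. The matrices below are arbitrary illustrative choices; targets whose kernels decay slower than any exponential are where the "curse of memory" appears:

```python
import numpy as np

# Linear RNN: dh/dt = A h + B x,  y = c . h  ==>  memory kernel
# rho(s) = c^T exp(A s) B. With a diagonal, stable A this is a sum of
# decaying exponentials, which limits representable long-range memory.
a = np.array([-1.0, -0.5, -0.2])       # diagonal of A: stable decay rates
B = np.ones(3)
c = np.array([0.4, 0.3, 0.3])

s = np.linspace(0.0, 10.0, 6)
kernel = np.array([c @ (np.exp(a * si) * B) for si in s])
print(np.round(kernel, 4))             # monotonically decaying kernel values
```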
arXiv Detail & Related papers (2020-09-16T16:48:28Z)
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is as competitive as, or better than, most state-of-the-art strategies.
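The explicit, classical counterpart of what the network learns implicitly is Takens-style delay embedding: reconstruct a phase-space trajectory from a scalar series by stacking lagged copies. The signal, dimension, and lag below are arbitrary illustrative choices:

```python
import numpy as np

def delay_embed(x, dim, lag):
    """Takens delay embedding: rows are [x_t, x_{t+lag}, ..., x_{t+(dim-1)lag}],
    reconstructing a phase-space trajectory from a scalar series."""
    n = len(x) - (dim - 1) * lag
    return np.stack([x[i * lag: i * lag + n] for i in range(dim)], axis=1)

t = np.linspace(0.0, 40.0, 2000)
x = np.sin(t) + 0.5 * np.sin(2.3 * t)   # toy two-frequency signal
X = delay_embed(x, dim=3, lag=20)       # 3-D reconstructed trajectory
print(X.shape)                          # (1960, 3)
```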
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.