Compiling Turing Machines into Storage Modification Machines
- URL: http://arxiv.org/abs/2110.01415v1
- Date: Tue, 28 Sep 2021 10:38:05 GMT
- Title: Compiling Turing Machines into Storage Modification Machines
- Authors: J.-M. Chauvet
- Abstract summary: It is well known that Schönhage's Storage Modification Machines (SMM) can simulate Turing Machines (TM).
We propose a simple transformation of TM into SMM, setting the base for a straightforward TM-to-SMM compiler.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: It is well known that Schönhage's Storage Modification Machines (SMM) can
simulate Turing Machines (TM) since Schönhage's original proof of the Turing
completeness of the eponymous machines. We propose a simple transformation of
TM into SMM, setting the base for a straightforward TM-to-SMM compiler.
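The core idea of an SMM is that all storage is a directed graph of nodes with labelled edges, modified by pointer operations. As an illustrative sketch only (not the paper's actual construction), a TM tape can be encoded in this pointer style: each cell is a node, with "L"/"R" edges linking neighbours, and the head moves by following edges. All names below (`Cell`, `neighbour`, `run_tm`) are hypothetical.

```python
# Illustrative sketch, NOT the paper's compiler: a TM tape stored as an
# SMM-style pointer structure (a graph of nodes with direction-labelled
# edges) instead of an array.

class Cell:
    """One tape cell; edges map a direction label ('L'/'R') to a neighbour."""
    def __init__(self, symbol="_"):
        self.symbol = symbol
        self.edges = {}

def neighbour(cell, direction):
    """Follow the edge, lazily creating a blank cell so the tape is unbounded."""
    if direction not in cell.edges:
        fresh = Cell()
        cell.edges[direction] = fresh
        fresh.edges["L" if direction == "R" else "R"] = cell
    return cell.edges[direction]

def run_tm(rules, state, head, halt_state, max_steps=10_000):
    """rules: (state, symbol) -> (next_state, write_symbol, 'L' or 'R')."""
    for _ in range(max_steps):
        if state == halt_state:
            return state
        next_state, write, move = rules[(state, head.symbol)]
        head.symbol = write          # storage modification: relabel the node
        head = neighbour(head, move) # head motion: follow a pointer
        state = next_state
    return state

# Example: a one-state TM that flips bits left-to-right until a blank.
leftmost = Cell("1")
cell = leftmost
for s in "01":
    cell = neighbour(cell, "R")
    cell.symbol = s

final = run_tm({("q", "1"): ("q", "0", "R"),
                ("q", "0"): ("q", "1", "R"),
                ("q", "_"): ("done", "_", "R")},
               "q", leftmost, "done")

# Read the tape back by walking the "R" pointers from the leftmost cell.
out = []
cell = leftmost
while cell is not None:
    out.append(cell.symbol)
    cell = cell.edges.get("R")
result = "".join(out)
```

The point of the sketch is that every TM primitive (read, write, move) becomes a constant number of pointer operations, which is the shape a TM-to-SMM compiler would exploit.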
Related papers
- Self-Replicating Mechanical Universal Turing Machine [0.0]
This paper presents the implementation of a self-replicating finite-state machine (FSM) and a self-replicating Turing Machine (TM) using bio-inspired mechanisms.
arXiv Detail & Related papers (2024-09-27T08:28:37Z)
- On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning [87.73401758641089]
Chain-of-thought (CoT) reasoning has improved the performance of modern language models (LMs).
We show that LMs can represent the same family of distributions over strings as probabilistic Turing machines.
arXiv Detail & Related papers (2024-06-20T10:59:02Z)
- MoEUT: Mixture-of-Experts Universal Transformers [75.96744719516813]
Universal Transformers (UTs) have advantages over standard Transformers in learning compositional generalizations.
Layer-sharing drastically reduces the parameter count compared to the non-shared model with the same dimensionality.
No previous work has succeeded in proposing a shared-layer Transformer design that is competitive in parameter count-dominated tasks such as language modeling.
arXiv Detail & Related papers (2024-05-25T03:24:32Z) - On the Representational Capacity of Recurrent Neural Language Models [56.19166912044362]
We show that a rationally weighted RLM with unbounded computation time can simulate any deterministic probabilistic Turing machine (PTM) with rationally weighted transitions.
We also provide a lower bound by showing that under the restriction to real-time computation, such models can simulate deterministic real-time rational PTMs.
arXiv Detail & Related papers (2023-10-19T17:39:47Z) - Sparse Universal Transformer [64.78045820484299]
The Universal Transformer (UT) is a variant of the Transformer that shares parameters across its layers.
This paper proposes the Sparse Universal Transformer (SUT), which leverages Sparse Mixture of Experts (SMoE) and a new stick-breaking-based dynamic halting mechanism.
arXiv Detail & Related papers (2023-10-11T00:38:57Z) - Towards a Self-Replicating Turing Machine [0.0]
We provide partial implementations of von Neumann's universal constructor and universal copier.
Using the same principles, we also construct Turing machines.
Our construction allows for mutations if desired, and we give a simple description language.
arXiv Detail & Related papers (2023-06-29T11:50:58Z) - Token Turing Machines [53.22971546637947]
The Token Turing Machine (TTM) is a sequential, autoregressive Transformer model with memory for real-world sequential visual understanding.
Our model is inspired by the seminal Neural Turing Machine, and has an external memory consisting of a set of tokens which summarise the previous history.
arXiv Detail & Related papers (2022-11-16T18:59:18Z) - Multiway Storage Modification Machines [0.0]
We present a parallel version of Schönhage's Storage Modification Machine, the Multiway Storage Modification Machine (MWSMM).
Like the alternative Association Storage Modification Machine of Tromp and van Emde Boas, MWSMMs recognize in time what Turing Machines recognize in space.
arXiv Detail & Related papers (2021-11-12T15:06:48Z) - Prof. Sch\"onhage's Mysterious Machines [0.0]
We give a simple Schönhage Storage Modification Machine that simulates the Rule 110 cellular automaton.
This provides an alternative construction to Schönhage's original proof of the Turing completeness of the eponymous machines.
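For context on the target computation: Rule 110 is an elementary cellular automaton whose Turing completeness is known, so simulating it suffices for the construction above. Its local update rule can be sketched in a few lines; the encoding below is the standard Wolfram rule numbering, not code from the paper.

```python
# Rule 110: each cell's next value depends on the 3-cell neighbourhood
# (left, centre, right). The rule number 110 = 0b01101110 lists the
# outcome bit for each of the eight neighbourhood patterns.
RULE = 110

def rule110_step(cells):
    """One synchronous update of a row of 0/1 cells with dead (0) boundaries."""
    padded = [0] + list(cells) + [0]
    return [(RULE >> (4 * padded[i - 1] + 2 * padded[i] + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

# A single live cell grows a pattern to the left, characteristic of Rule 110.
row = [0, 0, 0, 0, 1]
for _ in range(3):
    row = rule110_step(row)
```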
arXiv Detail & Related papers (2021-08-19T10:32:21Z) - Reservoir memory machines [79.79659145328856]
We propose reservoir memory machines, which are able to solve some of the benchmark tests for Neural Turing Machines.
Our model can also be seen as an extension of echo state networks with an external memory, enabling arbitrarily long storage without interference.
arXiv Detail & Related papers (2020-02-12T01:45:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.