Towards a Self-Replicating Turing Machine
- URL: http://arxiv.org/abs/2306.16872v1
- Date: Thu, 29 Jun 2023 11:50:58 GMT
- Title: Towards a Self-Replicating Turing Machine
- Authors: Ralph P. Lano
- Abstract summary: We provide partial implementations of von Neumann's universal constructor and universal copier.
Using the same principles, we also construct Turing machines.
Our construction allows for mutations if desired, and we give a simple description language.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We provide partial implementations of von Neumann's universal constructor and
universal copier, starting out with three types of simple building blocks using
minimal assumptions. Using the same principles, we also construct Turing
machines. Combining both, we arrive at a proposal for a self-replicating Turing
machine. Our construction allows for mutations if desired, and we give a simple
description language.
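Von Neumann's universal copier, a machine that reproduces its own description, has a classic programming analogue in the quine: a program whose output is its own source code. The following is a generic illustrative sketch of that principle, not code from the paper:

```python
# A minimal quine: the string s is both the "description" and the thing
# being copied -- printing s formatted with itself reproduces the source.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly the two lines of its own source, mirroring how a description tape can encode the machine that copies it.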
Related papers
- Ask, and it shall be given: Turing completeness of prompting [47.08833920586575]
Large language models (LLMs) have revolutionized machine learning and initiated the so-called LLM prompting paradigm.
Here we present, to the best of our knowledge, the first theoretical study of the LLM prompting paradigm.
We show that prompting is Turing-complete: there exists a finite-size Transformer such that for any computable function, there exists a corresponding prompt following which the Transformer computes the function.
arXiv Detail & Related papers (2024-11-04T11:26:38Z) - Universal Length Generalization with Turing Programs [24.077722969687898]
We propose Turing Programs, a strategy that decomposes an algorithmic task into steps mimicking a Turing Machine.
By using Turing Programs, we obtain robust length generalization on a range of algorithmic tasks.
We then demonstrate that transformers achieve length generalization on random Turing Programs, suggesting that length generalization is possible for any algorithmic task.
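Decomposing a task into explicit Turing-machine steps, as Turing Programs do, can be made concrete with a minimal single-tape simulator. The transition table below (a unary increment machine) is an illustrative assumption, not taken from the paper:

```python
def run_tm(tape, transitions, state="q0", accept="halt", max_steps=1000):
    """Simulate a one-tape Turing machine.
    transitions: {(state, symbol): (new_state, write_symbol, move)}
    with move in {-1, 0, +1}; blank symbol is "_".
    """
    cells = dict(enumerate(tape))  # sparse tape representation
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        sym = cells.get(head, "_")
        state, cells[head], move = transitions[(state, sym)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Hypothetical example machine: append a "1" to a unary string.
inc = {
    ("q0", "1"): ("q0", "1", +1),   # scan right over the 1s
    ("q0", "_"): ("halt", "1", 0),  # write 1 at the first blank, halt
}
print(run_tm("111", inc))  # → 1111
```

Each loop iteration corresponds to one "step" of the kind a Turing Program would spell out in text.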
arXiv Detail & Related papers (2024-07-03T17:53:44Z) - On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning [87.73401758641089]
Chain-of-thought (CoT) reasoning has improved the performance of modern language models (LMs).
We show that LMs can represent the same family of distributions over strings as probabilistic Turing machines.
arXiv Detail & Related papers (2024-06-20T10:59:02Z) - Weakening Assumptions for Publicly-Verifiable Deletion [79.61363884631021]
We develop a simple compiler that generically adds publicly-verifiable deletion to a variety of cryptosystems.
Our compiler only makes use of one-way functions.
arXiv Detail & Related papers (2023-04-19T17:51:28Z) - Looped Transformers as Programmable Computers [48.00010456819222]
We present a framework for using transformer networks as universal computers by programming them with specific weights and placing them in a loop.
Our input sequence acts as a punchcard, consisting of instructions and memory for data read/writes.
We show how this transformer, instructed by its input, can emulate a basic calculator, a basic linear algebra library, and in-context learning algorithms that employ backpropagation.
arXiv Detail & Related papers (2023-01-30T18:57:31Z) - On the Intersection of Context-Free and Regular Languages [71.61206349427509]
We generalize the Bar-Hillel construction to handle finite-state automata with $\varepsilon$-arcs.
We prove that our construction leads to a grammar that encodes the structure of both the input automaton and grammar while retaining the size of the original construction.
arXiv Detail & Related papers (2022-09-14T17:49:06Z) - Simple circuit simulations of classical and quantum Turing machines [0.0]
We construct reversible Boolean circuits efficiently simulating reversible Turing machines.
We give a fairly straightforward generalization of the circuits and the simulation proof to the quantum case.
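As background on reversible circuits, the standard universal reversible gate is the Toffoli (CCNOT) gate, which can be checked for reversibility directly. This is a generic illustration of reversible logic, not the paper's construction:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c iff a and b are both 1.
    Reversible and universal for classical reversible computation."""
    return a, b, c ^ (a & b)

# Reversibility: the gate is its own inverse, hence a bijection on {0,1}^3.
for bits in product([0, 1], repeat=3):
    assert toffoli(*toffoli(*bits)) == bits

# Universality hint: with the target preset to 1, the third output
# computes NOT(a AND b), i.e. NAND.
assert toffoli(1, 1, 1)[2] == 0 and toffoli(1, 0, 1)[2] == 1
```

A reversible circuit is a composition of such bijective gates, which is what makes step-by-step simulation of a reversible Turing machine possible without information loss.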
arXiv Detail & Related papers (2021-11-21T14:48:36Z) - What can we learn from universal Turing machines? [0.0]
We construct what we call a pedagogical universal Turing machine.
We try to understand which comparisons with biological phenomena can be deduced from its encoding and its operation.
arXiv Detail & Related papers (2021-10-16T08:43:29Z) - Prof. Schönhage's Mysterious Machines [0.0]
We give a simple Schönhage Storage Modification Machine that simulates the Rule 110 cellular automaton.
This provides an alternative construction to Schönhage's original proof of the Turing completeness of the eponymous machines.
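Rule 110's Turing completeness is what makes it a meaningful simulation target; the update rule itself fits in a few lines. The following is a standard textbook implementation of the cellular automaton, not Schönhage-machine code:

```python
def rule110_step(cells):
    """One synchronous update of Rule 110 on a periodic row of 0/1 cells.
    The new cell is bit (4*left + 2*center + right) of the number 110."""
    n = len(cells)
    return [
        (110 >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 15 + [1]  # single live cell on a periodic row of 16 cells
for _ in range(5):
    row = rule110_step(row)
```

Encoding the rule number's bits as a lookup over the three-cell neighborhood is the standard trick for all 256 elementary cellular automata; only the constant 110 is specific here.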
arXiv Detail & Related papers (2021-08-19T10:32:21Z) - Quantum circuit design for universal distribution using a superposition of classical automata [2.4192504570921622]
This circuit is able to accelerate the inference of algorithmic structures in data for discovering causal generative models.
A classical exhaustive enumeration of all possible programs on the automata is shown for a couple of example cases.
This is the first time a superposition of classical automata has been implemented in the circuit model of quantum computation.
arXiv Detail & Related papers (2020-06-01T14:47:28Z) - Reservoir memory machines [79.79659145328856]
We propose reservoir memory machines, which are able to solve some of the benchmark tests for Neural Turing Machines.
Our model can also be seen as an extension of echo state networks with an external memory, enabling arbitrarily long storage without interference.
arXiv Detail & Related papers (2020-02-12T01:45:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.