Thermodynamic AI and the fluctuation frontier
- URL: http://arxiv.org/abs/2302.06584v3
- Date: Tue, 13 Jun 2023 17:35:52 GMT
- Title: Thermodynamic AI and the fluctuation frontier
- Authors: Patrick J. Coles, Collin Szczepanski, Denis Melanson, Kaelan
Donatella, Antonio J. Martinez, Faris Sbahi
- Abstract summary: Many Artificial Intelligence (AI) algorithms are inspired by physics and employ fluctuations.
We propose a novel computing paradigm, where software and hardware become inseparable.
We identify stochastic bits (s-bits) and stochastic modes (s-modes) as the respective building blocks for discrete and continuous Thermodynamic AI hardware.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many Artificial Intelligence (AI) algorithms are inspired by physics and
employ stochastic fluctuations. We connect these physics-inspired AI algorithms
by unifying them under a single mathematical framework that we call
Thermodynamic AI. Seemingly disparate algorithmic classes can be described by
this framework, for example, (1) Generative diffusion models, (2) Bayesian
neural networks, (3) Monte Carlo sampling and (4) Simulated annealing. Such
Thermodynamic AI algorithms are currently run on digital hardware, ultimately
limiting their scalability and overall potential. Stochastic fluctuations
naturally occur in physical thermodynamic systems, and such fluctuations can be
viewed as a computational resource. Hence, we propose a novel computing
paradigm, where software and hardware become inseparable. Our algorithmic
unification allows us to identify a single full-stack paradigm, involving
Thermodynamic AI hardware, that could accelerate such algorithms. We contrast
Thermodynamic AI hardware with quantum computing where noise is a roadblock
rather than a resource. Thermodynamic AI hardware can be viewed as a novel form
of computing, since it uses a novel fundamental building block. We identify
stochastic bits (s-bits) and stochastic modes (s-modes) as the respective
building blocks for discrete and continuous Thermodynamic AI hardware. In
addition to these stochastic units, Thermodynamic AI hardware employs a
Maxwell's demon device that guides the system to produce non-trivial states. We
provide a few simple physical architectures for building these devices and we
develop a formalism for programming the hardware via gate sequences. We hope to
stimulate discussion around this new computing paradigm. Beyond acceleration,
we believe it will impact the design of both hardware and algorithms, while
also deepening our understanding of the connection between physics and
intelligence.
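The stochastic building blocks described above can be illustrated in simulation. The sketch below is an assumption-laden toy model, not the paper's hardware: it treats a single s-mode as an Ornstein-Uhlenbeck (overdamped Langevin) process integrated with Euler-Maruyama, and an s-bit as a biased binary fluctuator sampled i.i.d. The function names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def s_mode_trajectory(gamma=1.0, sigma=1.0, dt=1e-3, n_steps=100_000):
    """Toy s-mode: an Ornstein-Uhlenbeck process dx = -gamma*x dt + sigma dW,
    integrated with the Euler-Maruyama scheme. Its fluctuations equilibrate
    to a Gaussian with variance sigma**2 / (2 * gamma)."""
    x = np.empty(n_steps)
    x[0] = 0.0
    for t in range(1, n_steps):
        noise = rng.normal(0.0, np.sqrt(dt))
        x[t] = x[t - 1] - gamma * x[t - 1] * dt + sigma * noise
    return x

def s_bit_samples(p_one=0.3, n=10_000):
    """Toy s-bit: a binary unit whose long-run statistics are set by a
    bias p_one (sampled i.i.d. here for simplicity)."""
    return rng.random(n) < p_one

traj = s_mode_trajectory()
bits = s_bit_samples()
# traj.var() should approach sigma**2 / (2*gamma) = 0.5; bits.mean() ~ 0.3.
print(traj.var(), bits.mean())
```

In this picture the fluctuations are the computational resource: the hardware would realize such dynamics physically, while a Maxwell's demon device (not modeled here) would steer the system toward non-trivial states.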
Related papers
- Training Neural Networks with Internal State, Unconstrained
Connectivity, and Discrete Activations [66.53734987585244]
True intelligence may require the ability of a machine learning model to manage internal state.
We argue that the most effective algorithms for training such models have not yet been discovered.
We present one attempt to design such a training algorithm, applied to an architecture with binary activations and only a single matrix of weights.
arXiv Detail & Related papers (2023-12-22T01:19:08Z) - Thermodynamic Computing System for AI Applications [0.0]
Physics-based hardware, such as thermodynamic computing, has the potential to provide a fast, low-power means to accelerate AI primitives.
We present the first continuous-variable thermodynamic computer, which we call the stochastic processing unit (SPU).
arXiv Detail & Related papers (2023-12-08T05:22:04Z) - DiffuseBot: Breeding Soft Robots With Physics-Augmented Generative
Diffusion Models [102.13968267347553]
We present DiffuseBot, a physics-augmented diffusion model that generates soft robot morphologies capable of excelling in a wide spectrum of tasks.
We showcase a range of simulated and fabricated robots along with their capabilities.
arXiv Detail & Related papers (2023-11-28T18:58:48Z) - AI for Mathematics: A Cognitive Science Perspective [86.02346372284292]
Mathematics is one of the most powerful conceptual systems developed and used by the human species.
Rapid progress in AI, particularly propelled by advances in large language models (LLMs), has sparked renewed, widespread interest in building such systems.
arXiv Detail & Related papers (2023-10-19T02:00:31Z) - Thermodynamic Computing via Autonomous Quantum Thermal Machines [0.0]
We develop a physics-based model for classical computation based on autonomous quantum thermal machines.
We show that a network of thermodynamic neurons can perform any desired function.
arXiv Detail & Related papers (2023-08-30T09:15:41Z) - Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
arXiv Detail & Related papers (2023-08-15T16:37:16Z) - Thermodynamic Linear Algebra [0.7377893131680263]
We consider an alternative physics-based computing paradigm based on classical thermodynamics to accelerate linear algebra.
We present simple thermodynamic algorithms for (1) solving linear systems of equations, (2) computing matrix inverses, (3) computing matrix determinants, and (4) solving Lyapunov equations.
Our algorithms exploit thermodynamic principles like ergodicity, entropy, and equilibration, highlighting the deep connection between these two seemingly distinct fields.
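The linear-system idea can be sketched as a classical simulation (this is a hedged illustration of the equilibration principle, not the authors' hardware protocol): couple an overdamped system to the potential V(x) = ½xᵀAx − bᵀx, whose Gibbs equilibrium is Gaussian with mean A⁻¹b, and estimate the solution of Ax = b as a time average after a burn-in. A is assumed symmetric positive definite; all names and step counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def thermo_solve(A, b, dt=1e-3, n_steps=200_000, burn_in=50_000):
    """Simulate overdamped Langevin dynamics dx = (b - A x) dt + sqrt(2 dt) * noise
    (unit temperature). At equilibrium the mean of x is A^{-1} b, so the
    time average of x after burn-in estimates the solution of A x = b.
    Requires A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n)
    acc = np.zeros(n)
    for t in range(n_steps):
        noise = rng.normal(0.0, np.sqrt(2.0 * dt), size=n)
        x = x + (b - A @ x) * dt + noise
        if t >= burn_in:
            acc += x
    return acc / (n_steps - burn_in)

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_est = thermo_solve(A, b)
print(x_est, np.linalg.solve(A, b))
```

The estimate is noisy (its error shrinks with averaging time), which is exactly the trade-off thermodynamic hardware would make: equilibration and ergodicity replace explicit matrix factorization.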
arXiv Detail & Related papers (2023-08-10T16:01:07Z) - Reliable AI: Does the Next Generation Require Quantum Computing? [71.84486326350338]
We show that digital hardware is inherently limited in solving certain problems in optimization, deep learning, and differential equations.
In contrast, analog computing models, such as the Blum-Shub-Smale machine, exhibit the potential to surmount these limitations.
arXiv Detail & Related papers (2023-07-03T19:10:45Z) - A full-stack view of probabilistic computing with p-bits: devices,
architectures and algorithms [0.014319921806060482]
We provide a full-stack review of probabilistic computing with p-bits.
We argue that p-bits could be used to build energy-efficient probabilistic systems.
We outline the main applications of probabilistic computers ranging from machine learning to AI.
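A p-bit can be sketched in software as a binary unit whose probability of being +1 is a sigmoid of its input; a network of such units performing sequential updates carries out Gibbs sampling of a Boltzmann distribution. The toy model below is a hedged illustration of this standard construction (unit temperature, illustrative coupling matrix), not any particular device.

```python
import numpy as np

rng = np.random.default_rng(2)

def p_bit(local_field):
    """A p-bit outputs +/-1 with P(+1) = sigmoid(2 * local_field)."""
    return 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * local_field)) else -1

def gibbs_sample(J, h, n_sweeps=5_000):
    """Sequentially update each p-bit given its local field; the network
    then samples the Boltzmann distribution P(s) ∝ exp(s^T J s / 2 + h^T s)."""
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    samples = []
    for _ in range(n_sweeps):
        for i in range(n):
            field = J[i] @ s - J[i, i] * s[i] + h[i]
            s[i] = p_bit(field)
        samples.append(s.copy())
    return np.array(samples)

# Two ferromagnetically coupled p-bits: aligned states (+1,+1), (-1,-1) dominate.
J = np.array([[0.0, 1.0], [1.0, 0.0]])
h = np.zeros(2)
samples = gibbs_sample(J, h)
align = (samples[:, 0] == samples[:, 1]).mean()
print(align)
```

For this two-spin example the exact alignment probability is e/(e + e⁻¹) ≈ 0.88, so the empirical fraction of aligned samples should land near that value.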
arXiv Detail & Related papers (2023-02-13T15:36:07Z) - Neurocompositional computing: From the Central Paradox of Cognition to a
new generation of AI systems [120.297940190903]
Recent progress in AI has resulted from the use of limited forms of neurocompositional computing.
New, deeper forms of neurocompositional computing create AI systems that are more robust, accurate, and comprehensible.
arXiv Detail & Related papers (2022-05-02T18:00:10Z) - Preparing thermal states on noiseless and noisy programmable quantum
processors [0.0]
We provide two quantum algorithms with provable guarantees to prepare thermal states on near-term quantum computers.
The first algorithm is inspired by the natural thermalization process where the ancilla qubits act as the infinite thermal bath.
The second algorithm works for any system and in general runs in exponential time.
arXiv Detail & Related papers (2021-12-29T18:06:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.