Towards a theory of quantum gravity from neural networks
- URL: http://arxiv.org/abs/2111.00903v1
- Date: Thu, 28 Oct 2021 12:39:01 GMT
- Title: Towards a theory of quantum gravity from neural networks
- Authors: Vitaly Vanchurin
- Abstract summary: We show that the non-equilibrium dynamics of trainable variables can be described by the Madelung equations.
We argue that Lorentz symmetries and curved space-time can emerge from the interplay between entropy production and entropy destruction due to learning.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A neural network is a dynamical system described by two different types of
degrees of freedom: fast-changing non-trainable variables (e.g., states of
neurons) and slow-changing trainable variables (e.g., weights and biases). We
show that the non-equilibrium dynamics of trainable variables can be described
by the Madelung equations, if the number of neurons is fixed, and by the
Schrödinger equation, if the learning system is capable of adjusting its own
parameters, such as the number of neurons, step size, and mini-batch size. We
argue that Lorentz symmetries and curved space-time can emerge from the
interplay between stochastic entropy production and entropy destruction due to
learning. We show that the non-equilibrium dynamics of non-trainable variables
can be described by the geodesic equation (in the emergent space-time) for
localized states of neurons, and by the Einstein equations (with a cosmological
constant) for the entire network. We conclude that the quantum description of
trainable variables and the gravitational description of non-trainable
variables are dual in the sense that they provide alternative macroscopic
descriptions of the same learning system, defined microscopically as a neural
network.
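For reference, the textbook forms of the equations the abstract names are reproduced below. These are not quoted from the paper: the paper's contribution is to derive analogues of these for a learning system, and the identification of the density ρ, the phase S, and the metric with network quantities is the paper's own. Conventions here (c = 1, polar ansatz ψ = √ρ e^{iS/ℏ}) are assumptions for illustration.

```latex
% Madelung form of quantum mechanics: substituting the polar ansatz
% \psi = \sqrt{\rho}\, e^{iS/\hbar} into the Schrodinger equation
%   i\hbar\,\partial_t\psi = -(\hbar^2/2m)\nabla^2\psi + V\psi
% splits it into a continuity equation and a quantum Hamilton-Jacobi
% equation (the last term is the quantum potential):
\begin{align}
  \partial_t \rho + \nabla\cdot\Big(\rho\,\frac{\nabla S}{m}\Big) &= 0,\\
  \partial_t S + \frac{(\nabla S)^2}{2m} + V
    - \frac{\hbar^2}{2m}\,\frac{\nabla^2\sqrt{\rho}}{\sqrt{\rho}} &= 0.
\end{align}
% Gravitational side: geodesic motion (claimed for localized neuron
% states) and the Einstein equations with a cosmological constant
% (claimed for the network as a whole):
\begin{align}
  \frac{d^2 x^\mu}{d\tau^2}
    + \Gamma^\mu_{\alpha\beta}\,\frac{dx^\alpha}{d\tau}\,\frac{dx^\beta}{d\tau} &= 0,\\
  R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} + \Lambda\, g_{\mu\nu} &= 8\pi G\, T_{\mu\nu}.
\end{align}
```

In the paper's setting, ρ is read as a distribution over trainable variables and the effective Planck's constant is fixed by the learning system (cf. the "Emergent Quantumness in Neural Networks" entry below), so the forms above should be read as the targets of the claimed correspondence rather than as results of the paper.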
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$ quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Dataset-learning duality and emergent criticality [0.0]
We show a duality map between a subspace of non-trainable variables and a subspace of trainable variables.
We use the duality to study the emergence of criticality, or the power-law distributions of fluctuations of the trainable variables.
arXiv Detail & Related papers (2024-05-27T17:44:33Z)
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproduce nonlinear, large-Knudsen-number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves an average improvement of over 3% in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Parametrized constant-depth quantum neuron [56.51261027148046]
We propose a framework that builds quantum neurons based on kernel machines.
We present here a neuron that applies a tensor-product feature mapping to an exponentially larger space.
It turns out that parametrization allows the proposed neuron to optimally fit underlying patterns that the existing neuron cannot fit.
arXiv Detail & Related papers (2022-02-25T04:57:41Z)
- Self-organized criticality in neural networks [0.0]
We show that the learning dynamics of neural networks is generically attracted toward a self-organized critical state.
Our results support the claim that the universe might be a neural network.
arXiv Detail & Related papers (2021-07-07T18:00:03Z)
- Emergent Quantumness in Neural Networks [0.0]
We derive the Schrödinger equation with "Planck's constant" determined by the chemical potential of hidden variables.
We also discuss implications of the results for machine learning, fundamental physics and, in a more speculative way, evolutionary biology.
arXiv Detail & Related papers (2020-12-09T14:32:33Z)
- The world as a neural network [0.0]
We discuss the possibility that the universe, at its most fundamental level, is a neural network.
We identify two different types of dynamical degrees of freedom: "trainable" variables and "hidden" variables.
We argue that the entropy production in such a system is a local function of the symmetries of the Onsager-Hilbert term.
arXiv Detail & Related papers (2020-08-04T17:10:46Z)
- Neuromorphic quantum computing [0.0]
We propose that neuromorphic computing can perform quantum operations.
We show for a two qubit system that quantum gates can be learned as a change of parameters for neural network dynamics.
arXiv Detail & Related papers (2020-05-04T14:46:48Z)