Imaginary components of out-of-time correlators and information
scrambling for navigating the learning landscape of a quantum machine
learning model
- URL: http://arxiv.org/abs/2208.13384v2
- Date: Sun, 15 Jan 2023 00:09:31 GMT
- Authors: Manas Sajjan, Vinit Singh, Raja Selvarajan, Sabre Kais
- Abstract summary: We analytically illustrate that hitherto unexplored imaginary components of out-of-time correlators can provide unprecedented insight into the information scrambling capacity of a graph neural network.
Such an analysis demystifies the training of quantum machine learning models by unraveling how quantum information is scrambled through such a network.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce and analytically illustrate that hitherto unexplored imaginary
components of out-of-time correlators can provide unprecedented insight into
the information scrambling capacity of a graph neural network. Furthermore, we
demonstrate that it can be related to conventional measures of correlation like
quantum mutual information and rigorously establish the inherent mathematical
bounds (both upper and lower bound) jointly shared by such seemingly disparate
quantities. To consolidate the geometrical ramifications of such bounds during
the dynamical evolution of training, we thereafter construct an emergent convex
space. This newly designed space reveals surprising information, including:
saturation of the lower bound by the trained network even for physical systems
of large size; transference and quantitative mirroring of spin correlations
from the simulated physical system across phase boundaries into the latent
sub-units of the network (even though the latent units are directly oblivious
to the simulated physical system); and the ability of the network to
distinguish exotic spin connectivity (volume-law vs. area-law).
Such an analysis demystifies the training of quantum machine learning models by
unraveling how quantum information is scrambled through such a network,
surreptitiously introducing correlation among its constituent sub-systems, and
opens a window into the underlying physical mechanism behind the emulative
ability of the model.
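The out-of-time-ordered correlator (OTOC) central to the abstract can be made concrete on a toy system. Below is a minimal, self-contained sketch, not the authors' construction: it evaluates F(t) = <psi| W(t) V W(t) V |psi> for a small spin chain and extracts its imaginary component. The transverse-field Ising Hamiltonian, the operator choices (W = X on the first site, V = Z on the last), and the initial product state are all illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: imaginary part of an out-of-time-ordered correlator (OTOC)
# on a small spin chain. Hamiltonian, operators, and state are illustrative.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def site_op(op, site, n):
    """Embed a single-qubit operator acting on `site` into an n-qubit chain."""
    out = np.array([[1]], dtype=complex)
    for k in range(n):
        out = np.kron(out, op if k == site else I2)
    return out

n = 3
# Transverse-field Ising chain: H = -sum_i Z_i Z_{i+1} - g * sum_i X_i
g = 1.05
H = sum(-site_op(Z, i, n) @ site_op(Z, i + 1, n) for i in range(n - 1))
H = H + sum(-g * site_op(X, i, n) for i in range(n))

W = site_op(X, 0, n)      # local perturbation at one end of the chain
V = site_op(Z, n - 1, n)  # probe at the opposite end

psi = np.zeros(2**n, dtype=complex)
psi[0] = 1.0  # product state |00...0>

def otoc(state, t):
    """F(t) = <psi| W(t) V W(t) V |psi> with W(t) = U(t)^dag W U(t)."""
    U = expm(-1j * H * t)
    Wt = U.conj().T @ W @ U
    return state.conj() @ (Wt @ V @ Wt @ V @ state)

# At t = 0 the two operators act on disjoint sites and square to the identity,
# so F(0) = 1 and Im F(0) = 0; a growing |Im F(t)| then tracks how the local
# perturbation scrambles across the chain.
im_parts = [otoc(psi, t).imag for t in np.linspace(0.0, 3.0, 7)]
```

Note that in the infinite-temperature (maximally mixed) state this correlator would be strictly real for Hermitian W and V; a nontrivial imaginary component requires a generic state, which is why the sketch uses a pure-state expectation value.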
Related papers
- Demolition and Reinforcement of Memories in Spin-Glass-like Neural
Networks [0.0]
The aim of this thesis is to understand the effectiveness of Unlearning in both associative memory models and generative models.
The selection of structured data enables an associative memory model to retrieve concepts as attractors of a neural dynamics with considerable basins of attraction.
A novel regularization technique for Boltzmann Machines is presented, proving to outperform previously developed methods in learning hidden probability distributions from data-sets.
arXiv Detail & Related papers (2024-03-04T23:12:42Z) - Inferring Relational Potentials in Interacting Systems [56.498417950856904]
We propose Neural Interaction Inference with Potentials (NIIP) as an alternative approach to discover such interactions.
NIIP assigns low energy to the subset of trajectories which respect the relational constraints observed.
It allows trajectory manipulation, such as interchanging interaction types across separately trained models, as well as trajectory forecasting.
arXiv Detail & Related papers (2023-10-23T00:44:17Z) - Exponential Quantum Communication Advantage in Distributed Inference and Learning [19.827903766111987]
We present a framework for distributed computation over a quantum network.
We show that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication.
We also show that models in this class can encode highly nonlinear features of their inputs, and their expressivity increases exponentially with model depth.
arXiv Detail & Related papers (2023-10-11T02:19:50Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Deep learning of spatial densities in inhomogeneous correlated quantum
systems [0.0]
We show that we can learn to predict densities using convolutional neural networks trained on random potentials.
We show that our approach can handle well the interplay of interference and interactions and the behaviour of models with phase transitions in inhomogeneous situations.
arXiv Detail & Related papers (2022-11-16T17:10:07Z) - Collisional open quantum dynamics with a generally correlated
environment: Exact solvability in tensor networks [0.0]
We find a natural Markovian embedding for the system dynamics, where the role of an auxiliary system is played by virtual indices of the network.
The results advance tensor-network methods in the fields of quantum optics and quantum transport.
arXiv Detail & Related papers (2022-02-09T19:48:17Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Tracing Information Flow from Open Quantum Systems [52.77024349608834]
We use photons in a waveguide array to implement a quantum simulation of the coupling of a qubit with a low-dimensional discrete environment.
Using the trace distance between quantum states as a measure of information, we analyze different types of information transfer.
arXiv Detail & Related papers (2021-03-22T16:38:31Z) - Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - 'Place-cell' emergence and learning of invariant data with restricted
Boltzmann machines: breaking and dynamical restoration of continuous
symmetries in the weight space [0.0]
We study the learning dynamics of Restricted Boltzmann Machines (RBM), a neural network paradigm for representation learning.
As learning proceeds from a random configuration of the network weights, we show the existence of a symmetry-breaking phenomenon.
This symmetry-breaking phenomenon takes place only if the amount of data available for training exceeds some critical value.
arXiv Detail & Related papers (2019-12-30T14:37:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the accuracy of the listed information and is not responsible for any consequences arising from its use.