Spherical and Hyperbolic Toric Topology-Based Codes On Graph Embedding
for Ising MRF Models: Classical and Quantum Topology Machine Learning
- URL: http://arxiv.org/abs/2307.15778v2
- Date: Tue, 5 Sep 2023 19:35:25 GMT
- Title: Spherical and Hyperbolic Toric Topology-Based Codes On Graph Embedding
for Ising MRF Models: Classical and Quantum Topology Machine Learning
- Authors: Vasiliy Usatyuk, Sergey Egorov, Denis Sapozhnikov
- Abstract summary: The paper introduces the application of information geometry to describe the ground states of Ising models.
The approach establishes a connection between machine learning and error-correcting coding.
- Score: 0.11805137592431453
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The paper introduces the application of information geometry to describe the
ground states of Ising models by utilizing parity-check matrices of cyclic and
quasi-cyclic codes on toric and spherical topologies. The approach establishes
a connection between machine learning and error-correcting coding. This
proposed approach has implications for the development of new embedding methods
based on trapping sets. Statistical physics and number geometry are applied to
optimize error-correcting codes, leading to these embedding and sparse
factorization methods. The paper establishes a direct connection between DNN
architecture and error-correcting coding by demonstrating how state-of-the-art
architectures (ChordMixer, Mega, Mega-chunk, CDIL, ...) from the long-range
arena can be equivalent to block and convolutional LDPC codes (Cage-graph,
Repeat Accumulate). QC codes correspond to certain types of chemical elements,
with the carbon element being represented by the mixed automorphism
Shu-Lin-Fossorier QC-LDPC code. The connections between Belief Propagation and
the Permanent, Bethe-Permanent, Nishimori Temperature, and Bethe-Hessian Matrix
are elaborated upon in detail. The Quantum Approximate Optimization Algorithm
(QAOA) used in the Sherrington-Kirkpatrick Ising model can be seen as analogous
to the back-propagation loss function landscape in training DNNs. This
similarity creates a problem comparable to trapping-set (TS) pseudo-codewords,
resembling the belief propagation method. Additionally, the layer depth in QAOA
correlates with the number of belief propagation decoding iterations in the
Wiberg decoding tree. Overall, this work has the potential to advance multiple
fields, from Information Theory, DNN architecture design (sparse and structured
prior graph topology), and efficient hardware design for Quantum and Classical
DPU/TPU (graph, quantization, and shift-register architectures) to Materials
Science and beyond.
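As a minimal illustration of the codeword-to-ground-state correspondence the abstract relies on (a standard construction, not code from the paper), the sketch below maps a toy parity-check matrix onto a multi-spin Ising energy whose minima are exactly the codewords; the matrix H and its dimensions are assumptions chosen for brevity.

```python
import numpy as np

# Toy parity-check matrix (illustrative only, not from the paper).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def ising_energy(bits, H):
    # Map bits {0,1} -> spins {+1,-1}; each parity check becomes a
    # multi-spin interaction, E(s) = -sum_checks prod_{j in check} s_j.
    spins = 1 - 2 * np.asarray(bits)
    checks = [np.prod(spins[row == 1]) for row in H]
    return -float(np.sum(checks))  # minimal iff every check is satisfied

def is_codeword(bits, H):
    return not np.any(H @ np.asarray(bits) % 2)

# The two codewords reach the minimum energy -3; the non-codeword does not.
for bits in ([0, 0, 0, 0, 0, 0], [1, 1, 0, 0, 1, 1], [1, 0, 0, 0, 0, 0]):
    print(bits, ising_energy(bits, H), is_codeword(bits, H))
```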
Related papers
- Can Geometric Quantum Machine Learning Lead to Advantage in Barcode Classification? [16.34646723046073]
We develop a geometric quantum machine learning (GQML) approach with embedded symmetries.
We show that quantum networks largely outperform their classical counterparts.
While the ability to achieve advantage largely depends on how data are loaded, we discuss how similar problems can benefit from quantum machine learning.
arXiv Detail & Related papers (2024-09-02T23:34:52Z)
- Sparse Concept Bottleneck Models: Gumbel Tricks in Contrastive Learning [86.15009879251386]
We propose a novel architecture and method for explainable classification with Concept Bottleneck Models (CBMs).
CBMs require an additional set of concepts to leverage.
We show a significant increase in accuracy using sparse hidden layers in CLIP-based bottleneck models.
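A hedged sketch of the "Gumbel trick" idea named in the title: a Gumbel-Softmax gate that yields sparse (near-one-hot) concept activations on top of CLIP-style concept logits. The tensor sizes and the name sparse_concept_layer are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def sparse_concept_layer(concept_logits: torch.Tensor, tau: float = 0.5, hard: bool = True):
    # Sample a near-one-hot (hence sparse) concept activation per example,
    # remaining differentiable via the Gumbel-Softmax relaxation.
    return F.gumbel_softmax(concept_logits, tau=tau, hard=hard, dim=-1)

logits = torch.randn(4, 128)             # 4 images, 128 candidate concepts (toy sizes)
concepts = sparse_concept_layer(logits)  # mostly-zero concept activations
class_scores = concepts @ torch.randn(128, 10)  # linear head on the sparse concepts
```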
arXiv Detail & Related papers (2024-04-04T09:43:43Z)
- Linear Codes for Hyperdimensional Computing [9.7902367664742]
We show that random linear codes offer a rich subcode structure that can be used to form key-value stores.
We show that under the framework we develop, random linear codes admit simple recovery algorithms to factor (either bundled or bound) compositional representations.
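A minimal sketch of why linearity helps for key-value stores, under assumed toy dimensions (k=8, n=64) rather than the paper's construction: binding encoded keys and values by XOR keeps the result inside the code, so it can be unbound or factored with linear-algebraic tools.

```python
import numpy as np

rng = np.random.default_rng(0)

k, n = 8, 64                         # message and code length (assumed toy sizes)
G = rng.integers(0, 2, size=(k, n))  # random generator matrix over GF(2)

def encode(msg):
    return (np.asarray(msg) @ G) % 2

key = rng.integers(0, 2, size=k)
val = rng.integers(0, 2, size=k)

bound = encode(key) ^ encode(val)                 # binding = XOR of codewords
assert np.array_equal(bound, encode(key ^ val))   # linearity: still a codeword
recovered = bound ^ encode(key)                   # unbinding with a known key
assert np.array_equal(recovered, encode(val))
```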
arXiv Detail & Related papers (2024-03-05T19:18:44Z)
- Topology-Aware Exploration of Energy-Based Models Equilibrium: Toric QC-LDPC Codes and Hyperbolic MET QC-LDPC Codes [0.11805137592431453]
We present a method for achieving equilibrium in the Ising Hamiltonian when confronted with unevenly distributed charges on an irregular grid.
Our approach involves dimensionally expanding the system, substituting charges with circulants, and representing distances through circulant shifts.
This results in a systematic mapping of the charge system onto a space, transforming the irregular grid into a uniform configuration.
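A small sketch of the circulant-substitution step described above, in the style of QC-LDPC lifting; the base matrix of shift values and the circulant size Z are toy assumptions, not the authors' construction.

```python
import numpy as np

def circulant(shift, Z):
    # Z x Z circulant permutation matrix: the identity with columns rolled by `shift`.
    return np.roll(np.eye(Z, dtype=int), shift, axis=1)

Z = 5                              # circulant size (assumed)
base = np.array([[0, 2],           # toy base matrix of shift values
                 [3, 1]])
# Replace each scalar "charge" by a circulant shift, lifting the irregular
# base graph onto a regular (2*Z) x (2*Z) structure.
lifted = np.block([[circulant(s, Z) for s in row] for row in base])
print(lifted.shape)                # (10, 10)
```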
arXiv Detail & Related papers (2024-01-26T10:14:10Z)
- Disentanglement via Latent Quantization [60.37109712033694]
In this work, we construct an inductive bias towards encoding to and decoding from an organized latent space.
We demonstrate the broad applicability of this approach by adding it to both basic data-reconstructing (vanilla autoencoder) and latent-reconstructing (InfoGAN) generative models.
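A hedged sketch of per-dimension latent quantization with a straight-through estimator, assuming small toy sizes and a uniform initial codebook; it illustrates the inductive bias of an organized, discretized latent space rather than the paper's exact model.

```python
import torch

def quantize(z, codebooks):
    # z: (batch, dims); codebooks: (dims, codes_per_dim)
    dist = (z.unsqueeze(-1) - codebooks.unsqueeze(0)) ** 2
    idx = dist.argmin(dim=-1)                      # nearest code per latent dimension
    z_q = torch.gather(codebooks.expand(z.size(0), -1, -1), 2,
                       idx.unsqueeze(-1)).squeeze(-1)
    return z + (z_q - z).detach()                  # straight-through gradient

codebooks = torch.nn.Parameter(torch.linspace(-1, 1, 8).repeat(16, 1))  # 16 dims, 8 codes each
z = torch.randn(4, 16)
z_quantized = quantize(z, codebooks)
```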
arXiv Detail & Related papers (2023-05-28T06:30:29Z)
- QNEAT: Natural Evolution of Variational Quantum Circuit Architecture [95.29334926638462]
We focus on variational quantum circuits (VQC), which emerged as the most promising candidates for the quantum counterpart of neural networks.
Although showing promising results, VQCs can be hard to train because of different issues, e.g., barren plateaus, periodicity of the weights, or the choice of architecture.
We propose a gradient-free algorithm inspired by natural evolution to optimize both the weights and the architecture of the VQC.
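A rough sketch of a gradient-free, (1+lambda)-style evolutionary loop of the kind such methods use; circuit_cost here is a stand-in landscape, not a real variational-circuit evaluation or the QNEAT algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def circuit_cost(theta):
    # Placeholder cost landscape standing in for a VQC expectation value.
    return float(np.sum(np.sin(theta) ** 2))

theta = rng.uniform(-np.pi, np.pi, size=12)   # toy: 12 rotation angles
best = circuit_cost(theta)
for _ in range(200):
    children = theta + rng.normal(0, 0.1, size=(8, theta.size))  # 8 mutants
    costs = [circuit_cost(c) for c in children]
    if min(costs) < best:                    # keep the best mutant, no gradients used
        best, theta = min(costs), children[int(np.argmin(costs))]
print(best)
```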
arXiv Detail & Related papers (2023-04-14T08:03:20Z)
- Understanding the Mapping of Encode Data Through An Implementation of Quantum Topological Analysis [0.7106986689736827]
We show that the difference in encoding techniques can be visualized by investigating the topology of the data embedded in complex Hilbert space.
Our results suggest the encoding method needs to be considered carefully within different quantum machine learning models.
arXiv Detail & Related papers (2022-09-21T18:46:08Z)
- Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach [47.19265172105025]
We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN).
egoQGNN implements the GNN theoretical framework using the tensor product and unity matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
arXiv Detail & Related papers (2022-01-13T16:35:45Z)
- Neural Distributed Source Coding [59.630059301226474]
We present a framework for lossy DSC that is agnostic to the correlation structure and can scale to high dimensions.
We evaluate our method on multiple datasets and show that it can handle complex correlations and achieves state-of-the-art PSNR.
arXiv Detail & Related papers (2021-06-05T04:50:43Z)
- Belief Propagation Reloaded: Learning BP-Layers for Labeling Problems [83.98774574197613]
We take one of the simplest inference methods, a truncated max-product Belief Propagation, and add what is necessary to make it a proper component of a deep learning model.
This BP-Layer can be used as the final or an intermediate block in convolutional neural networks (CNNs).
The model is applicable to a range of dense prediction problems, is well-trainable and provides parameter-efficient and robust solutions in stereo, optical flow and semantic segmentation.
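A hedged sketch of one truncated max-product (min-sum) sweep along image rows, the kind of message passing such a BP-Layer wraps; the random unary costs and the absolute-difference pairwise term are toy assumptions, not the learned model.

```python
import torch

def horizontal_min_sum_sweep(unaries, lam=1.0):
    # unaries: (H, W, L) per-pixel label costs; messages passed left -> right.
    H, W, L = unaries.shape
    labels = torch.arange(L, dtype=unaries.dtype)
    pairwise = lam * (labels[:, None] - labels[None, :]).abs()  # (L, L) smoothness cost
    beliefs = unaries.clone()
    for x in range(1, W):
        # Message into column x: minimize over the previous column's labels.
        msg = (beliefs[:, x - 1, :, None] + pairwise[None]).min(dim=1).values
        beliefs[:, x] = unaries[:, x] + msg
    return beliefs.argmin(dim=-1)   # winner-take-all labels after one sweep

costs = torch.rand(4, 6, 3)         # toy 4x6 image with 3 labels
print(horizontal_min_sum_sweep(costs))
```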
arXiv Detail & Related papers (2020-03-13T13:11:35Z)