Supervised Learning with Projected Entangled Pair States
- URL: http://arxiv.org/abs/2009.09932v1
- Date: Sat, 12 Sep 2020 09:15:00 GMT
- Title: Supervised Learning with Projected Entangled Pair States
- Authors: Song Cheng, Lei Wang, Pan Zhang
- Abstract summary: We construct supervised learning models for images using the projected entangled pair states (PEPS).
PEPS is a two-dimensional tensor network whose structure provides a prior similar to that of natural images.
Our results shed light on potential applications of two-dimensional tensor network models in machine learning.
- Score: 7.783244908334539
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tensor networks, models that originated in quantum physics, have been
gradually generalized into efficient machine learning models in recent years.
However, in order to achieve exact contraction, only tree-like tensor networks
such as matrix product states and tree tensor networks have been
considered, even for modeling two-dimensional data such as images. In this
work, we construct supervised learning models for images using the projected
entangled pair states (PEPS), a two-dimensional tensor network whose structure
provides a prior similar to that of natural images. Our approach first applies a feature map,
which transforms the image data to a product state on a grid, then contracts
the product state to a PEPS with trainable parameters to predict image labels.
The tensor elements of PEPS are trained by minimizing differences between
training labels and predicted labels. The proposed model is evaluated on image
classifications using the MNIST and the Fashion-MNIST datasets. We show that
our model is significantly superior to existing models using tree-like tensor
networks. Moreover, using the same input features, our method performs as well
as the multilayer perceptron classifier, but with far fewer parameters and
greater stability. Our results shed light on potential applications of
two-dimensional tensor network models in machine learning.
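As a concrete illustration of the pipeline described in the abstract, the sketch below shows a feature map that lifts each pixel to a two-component vector (assuming the sine/cosine embedding commonly used in tensor-network classifiers) and an exact einsum contraction of a toy 2x2 PEPS whose top-left tensor carries the label index. The tensor names, the 2x2 grid size, and the brute-force contraction are illustrative assumptions only, not the authors' implementation; the paper's full-size model relies on approximate contraction of the PEPS, which this sketch does not attempt.

import numpy as np

def feature_map(image):
    """Map pixels in [0, 1] to 2-vectors, turning the image into a product state."""
    x = np.asarray(image, dtype=float)
    # Assumed sine/cosine embedding: phi(x) = [cos(pi*x/2), sin(pi*x/2)].
    return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=-1)

def tiny_peps_logits(phi, A, B, C, D):
    """Contract a 2x2 PEPS with a 2x2 grid of feature vectors.

    phi : (2, 2, 2) grid of feature vectors from feature_map
    A   : (2, n_classes, bond, bond) top-left tensor carrying the label index
    B, C, D : (2, bond, bond) top-right, bottom-left, bottom-right tensors
    """
    # Absorb the physical (pixel) index of every site tensor.
    a = np.einsum('p,plij->lij', phi[0, 0], A)  # label, bond A-B, bond A-C
    b = np.einsum('p,pij->ij', phi[0, 1], B)    # bond A-B, bond B-D
    c = np.einsum('p,pij->ij', phi[1, 0], C)    # bond A-C, bond C-D
    d = np.einsum('p,pij->ij', phi[1, 1], D)    # bond C-D, bond B-D
    # Close the single plaquette of virtual bonds; one score per class remains.
    return np.einsum('lac,ab,cd,db->l', a, b, c, d)

# Toy usage: a random 2x2 "image", bond dimension 3, ten classes.
rng = np.random.default_rng(0)
bond, n_classes = 3, 10
phi = feature_map(rng.random((2, 2)))
A = rng.normal(size=(2, n_classes, bond, bond))
B = rng.normal(size=(2, bond, bond))
C = rng.normal(size=(2, bond, bond))
Dt = rng.normal(size=(2, bond, bond))
print(tiny_peps_logits(phi, A, B, C, Dt).shape)  # -> (10,)

In a trainable version, the site tensors would be optimized by gradient descent on a loss between these scores and the training labels, as the abstract describes; scaling beyond a few sites requires an approximate contraction scheme such as boundary matrix product states.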
Related papers
- Neural Metamorphosis [72.88137795439407]
This paper introduces a new learning paradigm termed Neural Metamorphosis (NeuMeta), which aims to build self-morphable neural networks.
NeuMeta directly learns the continuous weight manifold of neural networks.
It sustains full-size performance even at a 75% compression rate.
arXiv Detail & Related papers (2024-10-10T14:49:58Z) - Vertical Layering of Quantized Neural Networks for Heterogeneous Inference [57.42762335081385]
We study a new vertical-layered representation of neural network weights for encapsulating all quantized models into a single one.
We can theoretically achieve any precision network for on-demand service while only needing to train and maintain one model.
arXiv Detail & Related papers (2022-12-10T15:57:38Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Meta Internal Learning [88.68276505511922]
Internal learning for single-image generation is a framework in which a generator is trained to produce novel images based on a single image.
We propose a meta-learning approach that enables training over a collection of images, in order to model the internal statistics of the sample image more effectively.
Our results show that the models obtained are as suitable as single-image GANs for many common image applications.
arXiv Detail & Related papers (2021-10-06T16:27:38Z) - Patch-based medical image segmentation using Quantum Tensor Networks [1.5899411215927988]
We formulate image segmentation in a supervised setting with tensor networks.
The key idea is to first lift the pixels in image patches to exponentially high dimensional feature spaces.
The performance of the proposed model is evaluated on three 2D- and one 3D- biomedical imaging datasets.
arXiv Detail & Related papers (2021-09-15T07:54:05Z) - Tensor networks for unsupervised machine learning [9.897828174118974]
We present the Autoregressive Matrix Product States (AMPS), a tensor-network-based model combining the matrix product states from quantum many-body physics and the autoregressive models from machine learning.
We show that the proposed model significantly outperforms the existing tensor-network-based models and the restricted Boltzmann machines.
arXiv Detail & Related papers (2021-06-24T12:51:00Z) - ResMLP: Feedforward networks for image classification with data-efficient training [73.26364887378597]
We present ResMLP, an architecture built entirely upon multi-layer perceptrons for image classification.
We will share our code based on the Timm library and pre-trained models.
arXiv Detail & Related papers (2021-05-07T17:31:44Z) - Segmenting two-dimensional structures with strided tensor networks [1.952097552284465]
We propose a novel formulation of tensor networks for supervised image segmentation.
The proposed model is end-to-end trainable using backpropagation.
The evaluation shows that the strided tensor network yields competitive performance compared to CNN-based models.
arXiv Detail & Related papers (2021-02-13T11:06:34Z) - Locally Masked Convolution for Autoregressive Models [107.4635841204146]
LMConv is a simple modification to the standard 2D convolution that allows arbitrary masks to be applied to the weights at each location in the image.
We learn an ensemble of distribution estimators that share parameters but differ in generation order, achieving improved performance on whole-image density estimation.
arXiv Detail & Related papers (2020-06-22T17:59:07Z) - Anomaly Detection with Tensor Networks [2.3895981099137535]
We exploit the memory and computational efficiency of tensor networks to learn a linear transformation over a space with a dimension exponential in the number of original features.
We produce competitive results on image datasets, despite not exploiting the locality of images.
arXiv Detail & Related papers (2020-06-03T20:41:30Z) - Tensor Networks for Medical Image Classification [0.456877715768796]
We focus on the class of Tensor Networks, which has been a workhorse for physicists over the last two decades in analysing quantum many-body systems.
We extend the Matrix Product State tensor networks to be useful in medical image analysis tasks.
We show that tensor networks are capable of attaining performance that is comparable to state-of-the-art deep learning methods.
arXiv Detail & Related papers (2020-04-21T15:02:58Z)