Deep Neural Networks as the Semi-classical Limit of Topological Quantum
Neural Networks: The problem of generalisation
- URL: http://arxiv.org/abs/2210.13741v1
- Date: Tue, 25 Oct 2022 03:14:59 GMT
- Title: Deep Neural Networks as the Semi-classical Limit of Topological Quantum
Neural Networks: The problem of generalisation
- Authors: Antonino Marciano, Deen Chen, Filippo Fabrocini, Chris Fields, Matteo
Lulli and Emanuele Zappala
- Abstract summary: We propose a framework for understanding the problem of generalization in Deep Neural Networks.
Deep Neural Networks are viewed as the semi-classical limit of Topological Quantum Neural Networks.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep Neural Networks lack a principled model of their operation. A novel
framework for supervised learning based on Topological Quantum Field Theory
that looks particularly well suited for implementation on quantum processors
has been recently explored. We propose the use of this framework for
understanding the problem of generalization in Deep Neural Networks. More
specifically, in this approach Deep Neural Networks are viewed as the
semi-classical limit of Topological Quantum Neural Networks. A framework of
this kind readily explains the overfitting behavior of Deep Neural Networks
during the training step and their corresponding generalization capabilities.
Related papers
- A General Approach to Dropout in Quantum Neural Networks [1.5771347525430772]
"Overfitting" is the phenomenon occurring when a given model learns the training data excessively well.
With the advent of Quantum Neural Networks as learning models, overfitting might soon become an issue.
arXiv Detail & Related papers (2023-10-06T09:39:30Z)
- On the Interpretability of Quantum Neural Networks [0.0]
Interpretability of artificial intelligence (AI) methods, particularly deep neural networks, is of great interest.
Here, we explore the interpretability of quantum neural networks using local model-agnostic interpretability measures commonly utilized for classical neural networks.
A feature of our explanations is the delineation of the region in which data samples have been given a random label, likely subjects of inherently random quantum measurements.
arXiv Detail & Related papers (2023-08-22T00:43:14Z)
- Generalization and Estimation Error Bounds for Model-based Neural Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery outperform those of regular ReLU networks.
We derive practical design rules that allow constructing model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z)
- Lecture Notes: Neural Network Architectures [0.0]
These lecture notes provide an overview of Neural Network architectures from a mathematical point of view.
Covered are an introduction to Neural Networks and the following architectures: Feedforward Neural Network, Convolutional Neural Network, ResNet, and Recurrent Neural Network.
arXiv Detail & Related papers (2023-04-11T10:54:36Z)
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
The study of NTK has been devoted to typical neural network architectures, but it is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
- Rank Diminishing in Deep Neural Networks [71.03777954670323]
The rank of neural networks measures information flowing across layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains unclear.
arXiv Detail & Related papers (2022-06-13T12:03:32Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- Mathematical Models of Overparameterized Neural Networks [25.329225766892126]
We will focus on the analysis of two-layer neural networks, and explain the key mathematical models.
We will then discuss challenges in understanding deep neural networks and some current research directions.
arXiv Detail & Related papers (2020-12-27T17:48:31Z)
- Developing Constrained Neural Units Over Time [81.19349325749037]
This paper focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by means of a special class of constraints that are extended also to the interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)
- Deep Neural Networks as the Semi-classical Limit of Quantum Neural Networks [0.0]
Quantum Neural Networks (QNN) can be mapped onto spin networks.
Deep Neural Networks (DNN) are a subcase of QNN.
A number of Machine Learning (ML) key concepts can be rephrased using the terminology of Topological Quantum Field Theories (TQFT).
arXiv Detail & Related papers (2020-06-30T22:47:26Z)
- Entanglement Classification via Neural Network Quantum States [58.720142291102135]
In this paper we combine machine-learning tools and the theory of quantum entanglement to perform entanglement classification for multipartite qubit systems in pure states.
We use a parameterisation of quantum systems via artificial neural networks in a restricted Boltzmann machine (RBM) architecture, known as Neural Network Quantum States (NNS).
arXiv Detail & Related papers (2019-12-31T07:40:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.