Tensor Networks Meet Neural Networks: A Survey and Future Perspectives
- URL: http://arxiv.org/abs/2302.09019v2
- Date: Mon, 8 May 2023 06:06:32 GMT
- Title: Tensor Networks Meet Neural Networks: A Survey and Future Perspectives
- Authors: Maolin Wang, Yu Pan, Zenglin Xu, Xiangli Yang, Guangxi Li, Andrzej
Cichocki
- Abstract summary: Tensor networks (TNs) and neural networks (NNs) are two fundamental data modeling approaches.
TNs solve the curse of dimensionality in large-scale tensors by converting an exponential number of dimensions to polynomial complexity.
NNs have displayed exceptional performance in various applications, e.g., computer vision, natural language processing, and robotics research.
- Score: 27.878669143107885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tensor networks (TNs) and neural networks (NNs) are two fundamental data
modeling approaches. TNs were introduced to solve the curse of dimensionality
in large-scale tensors by converting an exponential number of dimensions to
polynomial complexity. As a result, they have attracted significant attention
in the fields of quantum physics and machine learning. Meanwhile, NNs have
displayed exceptional performance in various applications, e.g., computer
vision, natural language processing, and robotics research. Interestingly,
although these two types of networks originate from different observations,
they are inherently linked through the common multilinearity structure
underlying both TNs and NNs, thereby motivating a significant number of
intellectual developments regarding combinations of TNs and NNs. In this paper,
we refer to these combinations as tensorial neural networks (TNNs), and present
an introduction to TNNs in three primary aspects: network compression,
information fusion, and quantum circuit simulation. Furthermore, this survey
also explores methods for improving TNNs, examines flexible toolboxes for
implementing TNNs, and documents TNN development while highlighting potential
future directions. To the best of our knowledge, this is the first
comprehensive survey that bridges the connections among NNs, TNs, and quantum
circuits. We provide a curated list of TNNs at
https://github.com/tnbar/awesome-tensorial-neural-networks.
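
The compression mechanism at the heart of TNNs is easiest to see on a single fully connected layer: a large weight matrix is replaced by a chain of small tensor cores. Below is a minimal sketch of this idea using a tensor-train (TT) factorization; it is a generic illustration, not code from the survey, and the layer sizes, mode factorizations, and TT-rank are assumed for demonstration.

```python
# Minimal tensor-train (TT) layer sketch; shapes and rank are illustrative.
import numpy as np

in_factors  = (4, 4, 8, 8)   # 4*4*8*8 = 1024 input features
out_factors = (4, 4, 8, 8)   # 4*4*8*8 = 1024 output features
rank = 8                     # TT-rank: controls the accuracy/size trade-off

# One small 4-way core per factor pair, threaded together by the TT-rank.
ranks = (1, rank, rank, rank, 1)
cores = [np.random.randn(ranks[k], in_factors[k], out_factors[k], ranks[k + 1])
         for k in range(4)]

def tt_dense(x):
    """Apply the TT-factorized 1024x1024 weight matrix to a batch x."""
    t = x.reshape(x.shape[0], *in_factors)            # (batch, 4, 4, 8, 8)
    # Contract the reshaped input with all four cores; 'a' and 'f' are the
    # size-1 boundary ranks, 'c', 'd', 'e' the internal TT-ranks.
    y = np.einsum('bijkl,aimc,cjnd,dkoe,elpf->bmnop',
                  t, cores[0], cores[1], cores[2], cores[3])
    return y.reshape(x.shape[0], -1)                  # (batch, 1024)

x = np.random.randn(32, 1024)
assert tt_dense(x).shape == (32, 1024)
dense_params = 1024 * 1024                  # 1,048,576 weights
tt_params = sum(c.size for c in cores)      # 5,760 weights (~182x fewer)
print(dense_params, tt_params)
```

The exponential-to-polynomial claim in the abstract shows up here as a parameter count: a dense map over d factored modes stores the product of all mode sizes squared, while the TT form stores only a sum of per-core sizes, which grows polynomially in d for a fixed rank.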
Related papers
- CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks [76.53016529061821]
The Liquid Quantum Neural Network (LQNet) and the Continuous Time Recurrent Quantum Neural Network (CTRQNet) are developed.
LQNet and CTRQNet achieve accuracy increases as high as 40% on CIFAR-10 through binary classification.
arXiv Detail & Related papers (2024-08-28T00:56:03Z)
- From Graphs to Qubits: A Critical Review of Quantum Graph Neural Networks [56.51893966016221]
Quantum Graph Neural Networks (QGNNs) represent a novel fusion of quantum computing and Graph Neural Networks (GNNs).
This paper critically reviews the state-of-the-art in QGNNs, exploring various architectures.
We discuss their applications across diverse fields such as high-energy physics, molecular chemistry, finance and earth sciences, highlighting the potential for quantum advantage.
arXiv Detail & Related papers (2024-08-12T22:53:14Z) - Deep Neural Networks via Complex Network Theory: a Perspective [3.1023851130450684]
Deep Neural Networks (DNNs) can be represented as graphs whose links and vertices iteratively process data and solve tasks sub-optimally. Complex Network Theory (CNT), merging statistical physics with graph theory, provides a method for interpreting neural networks by analysing their weights and neuron structures.
In this work, we extend the existing CNT metrics with measures that sample from the DNNs' training distribution, shifting from a purely topological analysis to one that connects with the interpretability of deep learning.
arXiv Detail & Related papers (2024-04-17T08:42:42Z) - Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a
Polynomial Net Study [55.12108376616355]
The study of the NTK has been devoted to typical neural network architectures, but it is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z) - Deep Neural Networks as Complex Networks [1.704936863091649]
We use Complex Network Theory to represent Deep Neural Networks (DNNs) as directed weighted graphs.
We introduce metrics to study DNNs as dynamical systems, with a granularity that spans from weights to layers, including neurons.
We show that our metrics discriminate low vs. high performing networks.
arXiv Detail & Related papers (2022-09-12T16:26:04Z) - Universal approximation property of invertible neural networks [76.95927093274392]
Invertible neural networks (INNs) are neural network architectures with invertibility by design.
Thanks to their invertibility and the tractability of their Jacobians, INNs have various machine learning applications such as probabilistic modeling, generative modeling, and representation learning.
arXiv Detail & Related papers (2022-04-15T10:45:26Z) - Deep Reinforcement Learning Guided Graph Neural Networks for Brain
Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z) - Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve quantum-inspired neurons by exploiting complex-valued weights, which have richer representational capacity and better non-linearity.
We present models of quantum-inspired convolutional neural networks (QICNNs) capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
arXiv Detail & Related papers (2021-10-31T03:10:48Z) - Explore the Knowledge contained in Network Weights to Obtain Sparse
Neural Networks [2.649890751459017]
This paper proposes a novel learning approach to obtain sparse fully connected layers in neural networks (NNs) automatically.
We design a switcher neural network (SNN) to optimize the structure of the task neural network (TNN).
arXiv Detail & Related papers (2021-03-26T11:29:40Z) - Spiking Neural Networks -- Part I: Detecting Spatial Patterns [38.518936229794214]
Spiking Neural Networks (SNNs) are biologically inspired machine learning models that build on dynamic neuronal models processing binary and sparse spiking signals in an event-driven, online fashion.
SNNs can be implemented on neuromorphic computing platforms that are emerging as energy-efficient co-processors for learning and inference.
arXiv Detail & Related papers (2020-10-27T11:37:22Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)