Automatic structural optimization of tree tensor networks
- URL: http://arxiv.org/abs/2209.03196v2
- Date: Wed, 21 Dec 2022 12:46:23 GMT
- Title: Automatic structural optimization of tree tensor networks
- Authors: Toshiya Hikihara, Hiroshi Ueda, Kouichi Okunishi, Kenji Harada,
Tomotoshi Nishino
- Abstract summary: We propose a TTN algorithm that enables us to automatically optimize the network structure by local reconnections of isometries.
We demonstrate that the entanglement structure embedded in the ground-state of the system can be efficiently visualized as a perfect binary tree in the optimized TTN.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The tree tensor network (TTN) provides an essential theoretical framework for the
practical simulation of quantum many-body systems, where the network structure
defined by the connectivity of the isometry tensors plays a crucial role in
the approximation accuracy. In this paper, we propose a TTN algorithm
that automatically optimizes the network structure by local
reconnections of isometries, chosen to suppress the bipartite entanglement entropy on
their legs. The algorithm can be seamlessly incorporated into conventional
TTN approaches such as the density-matrix renormalization group. We apply the algorithm to
the inhomogeneous antiferromagnetic Heisenberg spin chain with a hierarchical
spatial distribution of the interactions. We then demonstrate that the
entanglement structure embedded in the ground state of the system can be
efficiently visualized as a perfect binary tree in the optimized TTN. Possible
improvements and applications of the algorithm are also discussed.
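The quantity the local reconnections are chosen to suppress is the bipartite entanglement entropy across a leg. As an illustration only (not the authors' implementation), a minimal NumPy sketch of that entropy for a pure state, computed from the Schmidt coefficients of a bipartition, is:

```python
import numpy as np

def bipartite_entropy(psi, dim_left):
    """Von Neumann entanglement entropy of a pure state across a bipartition.

    psi: state vector of the composite system, length dim_left * dim_right.
    dim_left: Hilbert-space dimension of the left block.
    """
    dim_right = psi.size // dim_left
    # Reshape the state vector into the coefficient matrix of the bipartition;
    # its singular values are the Schmidt coefficients.
    m = psi.reshape(dim_left, dim_right)
    s = np.linalg.svd(m, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]  # drop numerical zeros before taking the log
    return float(-np.sum(p * np.log(p)))

# A two-qubit singlet is maximally entangled: its entropy is log 2.
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
print(bipartite_entropy(singlet, 2))  # ≈ 0.6931 (log 2)
```

In the paper's setting this entropy is evaluated on the legs of candidate isometry reconnections, and the reconnection with the smallest entropy is kept; the sketch above only shows the entropy itself.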
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Automatic Structural Search of Tensor Network States including Entanglement Renormalization [0.0]
Tensor network (TN) states, including entanglement renormalization (ER), can encompass a wider variety of entangled states.
A structural search of ER has yet to be demonstrated, due to its high computational cost and the lack of flexibility in its algorithm.
In this study, we conducted an optimal structural search of TN, including ER, based on the reconstruction of their local structures with respect to variational energy.
arXiv Detail & Related papers (2024-05-10T15:24:10Z) - Visualization of Entanglement Geometry by Structural Optimization of
Tree Tensor Network [0.0]
We propose a structural optimization algorithm for tree tensor networks.
We show that the algorithm can successfully visualize the spatial pattern of spin-singlet pairs in the ground state.
arXiv Detail & Related papers (2024-01-29T09:39:24Z) - Entanglement bipartitioning and tree tensor networks [0.0]
We propose an entanglement bipartitioning approach to design an optimal network structure of the tree-tensor-network (TTN) for quantum many-body systems.
We demonstrate that entanglement bipartitioning of up to 16 sites gives rise to nontrivial tree network structures for $S=1/2$ Heisenberg models in one and two dimensions.
arXiv Detail & Related papers (2022-10-21T05:36:03Z) - Block-Structured Optimization for Subgraph Detection in Interdependent
Networks [29.342611925278643]
We design an effective, efficient, parallelizable algorithm, namely Block-structured Graph Gradient Projection (GBGP), to optimize a general non-linear function subject to graph constraints.
We demonstrate how our framework can be applied to two very practical applications and conduct comprehensive experiments to show the effectiveness and efficiency of our proposed algorithm.
arXiv Detail & Related papers (2022-10-06T06:37:39Z) - Robust Training and Verification of Implicit Neural Networks: A
Non-Euclidean Contractive Approach [64.23331120621118]
This paper proposes a theoretical and computational framework for training and robustness verification of implicit neural networks.
We introduce a related embedded network and show that the embedded network can be used to provide an $\ell_\infty$-norm box over-approximation of the reachable sets of the original network.
We apply our algorithms to train implicit neural networks on the MNIST dataset and compare the robustness of our models with the models trained via existing approaches in the literature.
arXiv Detail & Related papers (2022-08-08T03:13:24Z) - Quantum-inspired event reconstruction with Tensor Networks: Matrix
Product States [0.0]
We show that tensor networks are ideal vehicles to connect quantum mechanical concepts to machine learning techniques.
We show that entanglement entropy can be used to interpret what a network learns.
arXiv Detail & Related papers (2021-06-15T18:00:02Z) - Learning Structures for Deep Neural Networks [99.8331363309895]
We propose to adopt the efficient coding principle, rooted in information theory and developed in computational neuroscience.
We show that sparse coding can effectively maximize the entropy of the output signals.
Our experiments on a public image classification dataset demonstrate that using the structure learned from scratch by our proposed algorithm, one can achieve a classification accuracy comparable to the best expert-designed structure.
arXiv Detail & Related papers (2021-05-27T12:27:24Z) - Data-Driven Random Access Optimization in Multi-Cell IoT Networks with
NOMA [78.60275748518589]
Non-orthogonal multiple access (NOMA) is a key technology to enable massive machine type communications (mMTC) in 5G networks and beyond.
In this paper, NOMA is applied to improve the random access efficiency in high-density spatially-distributed multi-cell wireless IoT networks.
A novel formulation of random channel access management is proposed, in which the transmission probability of each IoT device is tuned to maximize the geometric mean of users' expected capacity.
arXiv Detail & Related papers (2021-01-02T15:21:08Z) - Connecting Weighted Automata, Tensor Networks and Recurrent Neural
Networks through Spectral Learning [58.14930566993063]
We present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks.
We introduce the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors.
arXiv Detail & Related papers (2020-10-19T15:28:00Z) - Controllable Orthogonalization in Training DNNs [96.1365404059924]
Orthogonality is widely used for training deep neural networks (DNNs) due to its ability to maintain all singular values of the Jacobian close to 1.
This paper proposes a computationally efficient and numerically stable orthogonalization method using Newton's iteration (ONI).
We show that our method improves the performance of image classification networks by effectively controlling the orthogonality to provide an optimal tradeoff between optimization benefits and representational capacity reduction.
We also show that ONI stabilizes the training of generative adversarial networks (GANs) by maintaining the Lipschitz continuity of a network, similar to spectral normalization.
arXiv Detail & Related papers (2020-04-02T10:14:27Z)
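Orthogonalization by Newton's iteration, as in ONI, is closely related to the classical Newton-Schulz iteration. As a hedged sketch of that standard iteration (not necessarily the exact ONI update from the paper), a matrix can be driven toward the nearest orthogonal matrix as follows:

```python
import numpy as np

def newton_schulz_orthogonalize(a, iters=30):
    """Approximately orthogonalize a square matrix via the Newton-Schulz
    iteration Y <- Y (3I - Y^T Y) / 2, which converges when the singular
    values of the starting point lie in (0, sqrt(3))."""
    # Scale by the spectral norm so all singular values are <= 1.
    y = a / np.linalg.norm(a, 2)
    eye = np.eye(a.shape[1])
    for _ in range(iters):
        y = y @ (3.0 * eye - y.T @ y) / 2.0
    return y

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
q = newton_schulz_orthogonalize(a)
# After convergence, q^T q is close to the identity.
print(np.allclose(q.T @ q, np.eye(4), atol=1e-6))
```

The appeal of this family of methods, and presumably of ONI, is that the iteration uses only matrix multiplications, so it is cheap and differentiable, unlike an explicit SVD.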
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.