The Augmented Tree Tensor Network Cookbook
- URL: http://arxiv.org/abs/2507.21236v1
- Date: Mon, 28 Jul 2025 18:00:39 GMT
- Title: The Augmented Tree Tensor Network Cookbook
- Authors: Nora Reinić, Luka Pavešić, Daniel Jaschke, Simone Montangero
- Abstract summary: An augmented tree tensor network (aTTN) is a tensor network ansatz constructed by applying a layer of unitary disentanglers to a tree tensor network. These lecture notes serve as a detailed guide for implementing the aTTN algorithms.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: An augmented tree tensor network (aTTN) is a tensor network ansatz constructed by applying a layer of unitary disentanglers to a tree tensor network. The disentanglers absorb a part of the system's entanglement. This makes aTTNs suitable for simulating higher-dimensional lattices, where the entanglement increases with the lattice size even for states that obey the area law. These lecture notes serve as a detailed guide for implementing the aTTN algorithms. We present a variational algorithm for ground state search and discuss the measurement of observables, and offer an open-source implementation within the Quantum TEA library. We benchmark the performance of the ground state search for different parameters and hyperparameters in the square lattice quantum Ising model and the triangular lattice Heisenberg model for up to $32 \times 32$ spins. The benchmarks identify the regimes where the aTTNs offer advantages in accuracy relative to computational cost compared to matrix product states and tree tensor networks.
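To make the construction concrete, here is a minimal numpy sketch of the ansatz structure described in the abstract: a toy four-site tree tensor network contracted into a dense state, with one two-site unitary disentangler applied to the physical legs. The four-site system, the tensor shapes, and the dense contraction are illustrative assumptions chosen for readability; this is not the Quantum TEA interface, where the disentanglers are kept as a separate layer and optimized variationally together with the tree.

```python
import numpy as np

rng = np.random.default_rng(0)
d, chi = 2, 4                      # local (physical) dimension, tree bond dimension

def random_unitary(n):
    """Draw a random n x n unitary via QR decomposition."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(a)
    return q * (np.diag(r) / np.abs(np.diag(r)))

# A toy TTN for four spins: two leaf tensors joined through a top bond matrix.
leaf_l = rng.normal(size=(d, d, chi))    # legs: (site 1, site 2, up bond)
leaf_r = rng.normal(size=(d, d, chi))    # legs: (site 3, site 4, up bond)
top = rng.normal(size=(chi, chi))        # joins the two branches

# Contract the tree into a dense four-site state psi[s1, s2, s3, s4].
psi = np.einsum('abi,ij,cdj->abcd', leaf_l, top, leaf_r)
psi /= np.linalg.norm(psi)

# Augment the tree: apply a two-site unitary disentangler to sites 2 and 3.
u = random_unitary(d * d).reshape(d, d, d, d)    # legs: (s2', s3', s2, s3)
psi_attn = np.einsum('efbc,abcd->aefd', u, psi)

# Unitarity preserves the norm; in the actual algorithm the disentanglers
# are chosen to absorb entanglement the tree would otherwise have to carry.
assert abs(np.linalg.norm(psi_attn) - 1.0) < 1e-12
```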
Related papers
- Tensor Decomposition Networks for Fast Machine Learning Interatomic Potential Computations
Tensor decomposition networks (TDNs) achieve competitive performance with dramatic speedup in computations.
We evaluate TDNs on PubChemQCR, a newly curated molecular relaxation dataset containing 105 million DFT-calculated snapshots.
arXiv Detail & Related papers (2025-07-01T18:46:27Z)
- TTNOpt: Tree tensor network package for high-rank tensor compression
TTNOpt is a software package that utilizes tree tensor networks (TTNs) for quantum spin systems and high-dimensional data analysis.
For quantum spin systems, TTNOpt searches for the ground state of Hamiltonians with bilinear spin interactions and magnetic fields.
For high-dimensional data analysis, TTNOpt factorizes complex tensors into TTN states that maximize fidelity to the original tensors.
arXiv Detail & Related papers (2025-05-09T09:28:38Z)
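As a rough illustration of the high-rank tensor compression idea above (a generic SVD-based split of one tensor into a small tree, not TTNOpt's actual interface, which is not shown here), the following sketch truncates a dense four-leg tensor into two leaf tensors joined by a bond, and reports the fidelity loss.

```python
import numpy as np

rng = np.random.default_rng(1)
d, chi = 3, 6                       # leg dimension, maximal kept bond dimension

T = rng.normal(size=(d, d, d, d))   # dense four-leg tensor to compress

# Split legs (a, b) from (c, d) with an SVD, keeping at most chi singular values.
U, S, Vh = np.linalg.svd(T.reshape(d * d, d * d), full_matrices=False)
k = min(chi, int(np.sum(S > 1e-12)))
leaf_l = U[:, :k].reshape(d, d, k)   # legs: (a, b, bond)
core = np.diag(S[:k])                # bond weights
leaf_r = Vh[:k, :].reshape(k, d, d)  # legs: (bond, c, d)

# The truncation error is set by the discarded singular values; fidelity to
# the original tensor is what a TTN structure search would try to maximize.
T_approx = np.einsum('abi,ij,jcd->abcd', leaf_l, core, leaf_r)
err = np.linalg.norm(T - T_approx) / np.linalg.norm(T)
print(f"kept bond {k}, relative error {err:.2e}")
```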
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Tensor Ring Optimized Quantum-Enhanced Tensor Neural Networks
Quantum machine learning researchers often rely on incorporating Tensor Networks (TNs) into Deep Neural Networks (DNNs).
To address this issue, a multi-layer design of a Tensor Ring optimized variational Quantum learning classifier (Quan-TR) is proposed.
It is referred to as Tensor Ring optimized Quantum-enhanced tensor Neural Networks (TR-QNet).
On quantum simulations, the proposed TR-QNet achieves promising accuracy of $94.5\%$, $86.16\%$, and $83.54\%$ on the Iris, MNIST, and CIFAR-10 datasets, respectively.
arXiv Detail & Related papers (2023-10-02T18:07:10Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights.
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
- Entanglement bipartitioning and tree tensor networks
We propose an entanglement bipartitioning approach to design an optimal network structure of the tree-tensor-network (TTN) for quantum many-body systems.
We demonstrate that entanglement bipartitioning of up to 16 sites gives rise to nontrivial tree network structures for $S=1/2$ Heisenberg models in one and two dimensions.
arXiv Detail & Related papers (2022-10-21T05:36:03Z)
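As a toy illustration of the entanglement-bipartitioning idea (our own minimal sketch restricted to contiguous cuts, not the paper's structure-search procedure), the snippet below computes the entanglement entropy of every contiguous bipartition of a small random state; a network-structure search would favor placing tree bonds across low-entropy cuts.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6                                     # number of spin-1/2 sites
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

def bipartition_entropy(state, cut):
    """Von Neumann entropy between the first `cut` sites and the rest."""
    schmidt = np.linalg.svd(state.reshape(2**cut, -1), compute_uv=False)
    p = schmidt**2
    p = p[p > 1e-15]                      # drop numerically zero weights
    return float(-np.sum(p * np.log(p)))

# Cuts with low entanglement entropy are natural places to put tree bonds.
for cut in range(1, n):
    print(f"cut after site {cut}: S = {bipartition_entropy(psi, cut):.3f}")
```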
- Robust Training and Verification of Implicit Neural Networks: A Non-Euclidean Contractive Approach
This paper proposes a theoretical and computational framework for training and robustness verification of implicit neural networks.
We introduce a related embedded network and show that the embedded network can be used to provide an $\ell_\infty$-norm box over-approximation of the reachable sets of the original network.
We apply our algorithms to train implicit neural networks on the MNIST dataset and compare the robustness of our models with the models trained via existing approaches in the literature.
arXiv Detail & Related papers (2022-08-08T03:13:24Z)
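To make the notion of an $\ell_\infty$-norm box over-approximation concrete, here is a generic interval-arithmetic sketch (our illustration of the underlying concept, not the paper's embedded-network construction): propagating an input box through an affine layer plus ReLU yields an output box guaranteed to contain every reachable output.

```python
import numpy as np

rng = np.random.default_rng(3)

def box_affine_relu(lo, hi, W, b):
    """Propagate the box [lo, hi] through x -> relu(W @ x + b).

    Splitting W into positive and negative parts gives elementwise bounds
    for the affine map; relu is monotone, so applying it to the bounds
    keeps the enclosure valid.
    """
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    out_lo = Wp @ lo + Wn @ hi + b
    out_hi = Wp @ hi + Wn @ lo + b
    return np.maximum(out_lo, 0.0), np.maximum(out_hi, 0.0)

W, b = rng.normal(size=(4, 3)), rng.normal(size=4)
x, eps = rng.normal(size=3), 0.1            # nominal input, l_inf radius

lo, hi = box_affine_relu(x - eps, x + eps, W, b)

# Any perturbed input within the l_inf ball lands inside the output box.
y = np.maximum(W @ (x + rng.uniform(-eps, eps, size=3)) + b, 0.0)
assert np.all(lo <= y + 1e-12) and np.all(y <= hi + 1e-12)
print(np.stack([lo, hi], axis=1))
```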
- Tensor Network States with Low-Rank Tensors
We introduce the idea of imposing low-rank constraints on the tensors that compose the tensor network.
With this modification, the time and memory complexities of the network optimization can be substantially reduced.
We find that choosing the tensor rank $r$ to be on the order of the bond dimension $m$ is sufficient to obtain high-accuracy ground-state approximations.
arXiv Detail & Related papers (2022-05-30T17:58:16Z)
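A hedged numpy sketch of the low-rank idea (our illustration of the generic technique; the paper's exact parametrization may differ): replacing a dense bond-$m$ tensor by a rank-$r$ factorization of its matricization cuts the parameter count, and the contraction cost shrinks accordingly.

```python
import numpy as np

rng = np.random.default_rng(4)
m, d, r = 32, 2, 8          # bond dimension, physical dimension, imposed rank

# A dense MPS-style tensor with legs (left bond, physical, right bond) ...
A = rng.normal(size=(m, d, m))

# ... is replaced by a rank-r factorization of its (m*d, m) matricization.
U, S, Vh = np.linalg.svd(A.reshape(m * d, m), full_matrices=False)
B = (U[:, :r] * S[:r]).reshape(m, d, r)   # legs: (left, physical, rank)
C = Vh[:r, :]                             # legs: (rank, right)

print(f"parameters: {m * d * m} dense vs {B.size + C.size} low-rank")

# For a generic random tensor the truncation error is large; the point of
# the constraint is that optimized network tensors can stay close to low rank.
err = np.linalg.norm(A - np.einsum('adr,rb->adb', B, C)) / np.linalg.norm(A)
print(f"relative truncation error: {err:.2e}")
```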
- On the Neural Tangent Kernel Analysis of Randomly Pruned Neural Networks
We study how random pruning of the weights affects a neural network's neural tangent kernel (NTK).
In particular, this work establishes an equivalence of the NTKs between a fully-connected neural network and its randomly pruned version.
arXiv Detail & Related papers (2022-03-27T15:22:19Z)
- Efficient Simulation of Dynamics in Two-Dimensional Quantum Spin Systems with Isometric Tensor Networks
We investigate the computational power of the recently introduced class of isometric tensor network states (isoTNSs).
We discuss several technical details regarding the implementation of isoTNSs-based algorithms and compare different disentanglers.
We compute the dynamical spin structure factor of 2D quantum spin systems for two paradigmatic models.
arXiv Detail & Related papers (2021-12-15T19:00:05Z)
- Adaptive-weighted tree tensor networks for disordered quantum many-body systems
We introduce an adaptive-weighted tree tensor network for the study of disordered and inhomogeneous quantum many-body systems.
We compute the ground state of the two-dimensional quantum Ising model in the presence of quenched random disorder and frustration.
arXiv Detail & Related papers (2021-11-24T10:32:28Z)
- T-Basis: a Compact Representation for Neural Networks
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, which is often seen in Neural Networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.