Visualization of Entanglement Geometry by Structural Optimization of
Tree Tensor Network
- URL: http://arxiv.org/abs/2401.16000v1
- Date: Mon, 29 Jan 2024 09:39:24 GMT
- Title: Visualization of Entanglement Geometry by Structural Optimization of
Tree Tensor Network
- Authors: Toshiya Hikihara, Hiroshi Ueda, Kouichi Okunishi, Kenji Harada,
Tomotoshi Nishino
- Abstract summary: We propose a structural optimization algorithm for tree-tensor networks.
We show that the algorithm can successfully visualize the spatial pattern of spin-singlet pairs in the ground state.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In tensor-network analysis of quantum many-body systems, it is of crucial
importance to employ a tensor network with a spatial structure suitable for
representing the state of interest. In the previous work [Hikihara et al.,
Phys. Rev. Research 5, 013031 (2023)], we proposed a structural optimization
algorithm for tree-tensor networks. In this paper, we apply the algorithm to
the Rainbow-chain model, which has a product state of singlet pairs between
spins separated by various distances as an approximate ground state. We then
demonstrate that the algorithm can successfully visualize the spatial pattern
of spin-singlet pairs in the ground state.
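
As a minimal illustration of the entanglement pattern the abstract refers to (a sketch under stated assumptions, not the authors' algorithm or code): the snippet below builds the rainbow state as a product of singlet pairs between mirror-symmetric sites of a small spin-1/2 chain and prints the von Neumann entropy of every contiguous left block. The linear growth of the entropy toward the center of the chain is the signature of the long-distance singlet pairs that the structural optimization of the tree tensor network is designed to expose. The chain length and all function names are illustrative choices, written in Python/NumPy.

import numpy as np
from itertools import product

def rainbow_state(n_pairs):
    # Product of singlet pairs (|01> - |10>)/sqrt(2) between mirror-symmetric
    # sites i and 2*n_pairs - 1 - i of a spin-1/2 chain (the "rainbow" pattern).
    n = 2 * n_pairs
    singlet = np.array([[0.0, 1.0], [-1.0, 0.0]]) / np.sqrt(2.0)
    psi = np.zeros((2,) * n)
    for config in product((0, 1), repeat=n):
        amp = 1.0
        for i in range(n_pairs):
            amp *= singlet[config[i], config[n - 1 - i]]
        psi[config] = amp
    return psi

def block_entropy(psi, cut):
    # Von Neumann entanglement entropy of the first `cut` sites.
    n = psi.ndim
    m = psi.reshape(2 ** cut, 2 ** (n - cut))
    p = np.linalg.svd(m, compute_uv=False) ** 2
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

n_pairs = 4
psi = rainbow_state(n_pairs)
for cut in range(1, 2 * n_pairs):
    # Each singlet crossing the cut contributes ln(2), so the entropy rises
    # linearly to n_pairs * ln(2) at the center of the chain.
    print(f"cut = {cut}: S = {block_entropy(psi, cut):.4f}")

This entropy profile is exactly what makes a spatially uniform ansatz inefficient for the rainbow chain and what a structurally optimized tree network can accommodate by connecting the paired sites directly.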
Related papers
- Hallmarks of Optimization Trajectories in Neural Networks: Directional Exploration and Redundancy [75.15685966213832]
We analyze the rich directional structure of optimization trajectories represented by their pointwise parameters.
We show that training only the scalar batch-norm parameters from partway into training matches the performance of training the entire network.
arXiv Detail & Related papers (2024-03-12T07:32:47Z) - Tensor Network Representation and Entanglement Spreading in Many-Body
Localized Systems: A Novel Approach [0.0]
A novel method has been devised to compute the Local Integrals of Motion for a one-dimensional many-body localized system.
A class of optimal unitary transformations is deduced in a tensor-network formalism to diagonalize the Hamiltonian of the specified system.
The efficiency of the method was assessed, and it was found to be both fast and nearly accurate.
arXiv Detail & Related papers (2023-12-13T14:28:45Z) - Entanglement bipartitioning and tree tensor networks [0.0]
We propose an entanglement bipartitioning approach to design an optimal network structure of the tree-tensor-network (TTN) for quantum many-body systems.
We demonstrate that entanglement bipartitioning of up to 16 sites gives rise to nontrivial tree network structures for $S=1/2$ Heisenberg models in one and two dimensions.
arXiv Detail & Related papers (2022-10-21T05:36:03Z) - Automatic structural optimization of tree tensor networks [0.0]
We propose a TTN algorithm that enables us to automatically optimize the network structure by local reconnections of isometries.
We demonstrate that the entanglement structure embedded in the ground state of the system can be efficiently visualized as a perfect binary tree in the optimized TTN.
arXiv Detail & Related papers (2022-09-07T14:51:39Z) - DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks [6.5361928329696335]
We present novel embedding methods for dynamic networks based on higher-order tensor decompositions of their tensorial representations.
We demonstrate the power and efficiency of our approach by comparing our algorithms' performance on the link prediction task against an array of current baseline methods.
arXiv Detail & Related papers (2021-03-12T04:36:42Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Connecting Weighted Automata, Tensor Networks and Recurrent Neural
Networks through Spectral Learning [58.14930566993063]
We present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks.
We introduce the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors.
arXiv Detail & Related papers (2020-10-19T15:28:00Z) - A Deep-Unfolded Reference-Based RPCA Network For Video
Foreground-Background Separation [86.35434065681925]
This paper proposes a new deep-unfolding-based network design for the problem of Robust Principal Component Analysis (RPCA).
Unlike existing designs, our approach focuses on modeling the temporal correlation between the sparse representations of consecutive video frames.
Experimentation using the moving MNIST dataset shows that the proposed network outperforms a recently proposed state-of-the-art RPCA network in the task of video foreground-background separation.
arXiv Detail & Related papers (2020-10-02T11:40:09Z) - Optimization schemes for unitary tensor-network circuit [0.0]
We discuss the variational optimization of a unitary tensor-network circuit with different network structures.
The ansatz is based on a generalization of the well-developed multi-scale entanglement renormalization algorithm.
We present the benchmarking calculations for different network structures.
arXiv Detail & Related papers (2020-09-05T21:57:28Z) - Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z) - Understanding Graph Neural Networks with Generalized Geometric
Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.