Matrix Product States with Backflow correlations
- URL: http://arxiv.org/abs/2201.00810v1
- Date: Mon, 3 Jan 2022 18:57:29 GMT
- Title: Matrix Product States with Backflow correlations
- Authors: Guglielmo Lami, Giuseppe Carleo, Mario Collura
- Abstract summary: We introduce a novel tensor network ansatz which extends the Matrix Product State representation of a quantum many-body wave function.
We benchmark the new ansatz against spin models both in one and two dimensions, demonstrating high accuracy and precision.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: By taking inspiration from the backflow transformation for correlated
systems, we introduce a novel tensor network ansatz which extends the
well-established Matrix Product State representation of a quantum many-body
wave function. This new structure provides enough resources to ensure that
states in dimensions greater than or equal to one obey an area law for
entanglement. It can be efficiently manipulated to address the ground-state
search problem by means of an optimization scheme which mixes tensor-network
and variational Monte Carlo algorithms. We benchmark the new ansatz against
spin models both in one and two dimensions, demonstrating high accuracy and
precision. We finally employ our approach to study the challenging $S=1/2$
two-dimensional $J_1 - J_2$ model, demonstrating that it is competitive with
state-of-the-art methods in 2D.
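To make the ansatz concrete, here is a minimal sketch in formulas. The first line is the standard MPS; the second is the backflow-inspired generalization described in the abstract, where each local tensor acquires a dependence on the full configuration. The additive correction $f_i^{s_i}$ is our illustrative notation, not necessarily the exact parametrization used in the paper.

```latex
% Standard MPS: the amplitude of a spin configuration (s_1, ..., s_N)
% is a product of configuration-independent local matrices:
\psi_{\mathrm{MPS}}(s_1,\dots,s_N)
  = \mathrm{Tr}\!\left[ A_1^{s_1} A_2^{s_2} \cdots A_N^{s_N} \right].

% Backflow-inspired generalization (schematic): each matrix also depends
% on the whole configuration s, in analogy with backflow orbitals that
% depend on all particle positions:
\psi_{\mathrm{BF}}(s_1,\dots,s_N)
  = \mathrm{Tr}\!\left[ A_1^{s_1}(\mathbf{s}) \cdots A_N^{s_N}(\mathbf{s}) \right],
\qquad
A_i^{s_i}(\mathbf{s}) = A_i^{s_i} + f_i^{s_i}(\mathbf{s}).
```

Since configuration-dependent tensors spoil exact sequential contraction, expectation values are estimated stochastically over sampled configurations, which is why the ground-state search mixes tensor-network updates with variational Monte Carlo.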
Related papers
- Two dimensional quantum lattice models via mode optimized hybrid CPU-GPU density matrix renormalization group method [0.0]
We present a hybrid numerical approach to simulate quantum many-body problems on two-dimensional quantum lattice models.
We demonstrate for the two-dimensional spinless fermion model and for the Hubbard model on torus geometry that several orders of magnitude in computational time can be saved.
arXiv Detail & Related papers (2023-11-23T17:07:47Z)
- Generating function for projected entangled-pair states [0.1759252234439348]
We extend the generating function approach for tensor network diagrammatic summation.
We show that excited states taking the form of a one-particle excitation can be computed efficiently in the generating function formalism.
We conclude with a discussion on generalizations to multi-particle excitations.
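In symbols, a minimal sketch of the generating-function trick (our notation; we assume, as is standard for such ansätze, that the excitation is created by replacing one local tensor $A$ with a perturbation $B$):

```latex
% Let |\Psi(A)> be a tensor network state built from copies of A.
% Shifting every tensor by \lambda B and differentiating at \lambda = 0
% yields the sum over all single-site insertion diagrams at once:
\left.
  \frac{\partial}{\partial\lambda}\,
  \bigl|\Psi(A+\lambda B)\bigr\rangle
\right|_{\lambda=0}
= \sum_{x}
  \bigl|\Psi(A,\dots,\underbrace{B}_{\text{site }x},\dots,A)\bigr\rangle.
```

One contraction of the generating state thus replaces an extensive number of separate diagram evaluations.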
arXiv Detail & Related papers (2023-07-16T15:49:37Z)
- Two Dimensional Isometric Tensor Networks on an Infinite Strip [1.2569180784533303]
We introduce the class of isometric tensor network states (isoTNS) for efficient simulation of 2D systems on finite square lattices.
We iteratively transform an infinite MPS representation of a 2D quantum state into a strip isoTNS and investigate the entanglement properties of the resulting state.
Finally, we introduce an infinite time-evolving block decimation algorithm (iTEBD$^2$) and use it to approximate the ground state of the 2D transverse field Ising model on lattices of infinite strip geometry.
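For reference, the 2D transverse field Ising model targeted here has the standard Hamiltonian (sign and coupling conventions vary across papers):

```latex
H = -J \sum_{\langle i,j \rangle} \sigma_i^{z}\sigma_j^{z}
    - g \sum_{i} \sigma_i^{x},
```

where $\langle i,j \rangle$ runs over nearest-neighbor pairs of the square lattice and $g$ is the transverse field.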
arXiv Detail & Related papers (2022-11-25T19:00:06Z)
- Joint Spatial-Temporal and Appearance Modeling with Transformer for Multiple Object Tracking [59.79252390626194]
We propose a novel solution named TransSTAM, which leverages Transformer to model both the appearance features of each object and the spatial-temporal relationships among objects.
The proposed method is evaluated on multiple public benchmarks including MOT16, MOT17, and MOT20, and it achieves a clear performance improvement in both IDF1 and HOTA.
arXiv Detail & Related papers (2022-05-31T01:19:18Z)
- Efficient Simulation of Dynamics in Two-Dimensional Quantum Spin Systems with Isometric Tensor Networks [0.0]
We investigate the computational power of the recently introduced class of isometric tensor network states (isoTNS).
We discuss several technical details regarding the implementation of isoTNS-based algorithms and compare different disentanglers.
We compute the dynamical spin structure factor of 2D quantum spin systems for two paradigmatic models.
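For reference, the dynamical spin structure factor is the space-time Fourier transform of the spin-spin correlation function; a standard convention (normalizations differ between papers) is

```latex
S^{\alpha\beta}(\mathbf{k},\omega)
  = \frac{1}{N}\sum_{i,j}
    e^{-i\mathbf{k}\cdot(\mathbf{r}_i-\mathbf{r}_j)}
    \int_{-\infty}^{\infty} \frac{\mathrm{d}t}{2\pi}\,
    e^{i\omega t}\,
    \bigl\langle S_i^{\alpha}(t)\, S_j^{\beta}(0) \bigr\rangle,
```

with $N$ the number of sites and $\alpha,\beta$ the spin components; the real-time evolution entering the correlator is the computational bottleneck that isoTNS are used to tame.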
arXiv Detail & Related papers (2021-12-15T19:00:05Z)
- ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks [86.37110868126548]
In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on an Euler discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
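A toy sketch of the core numerical step, not the authors' implementation: forward Euler integration of the flow equation dx/dt = v_t(x), where each Euler step plays the role of one residual block (x ← x + h·v(x)). The `rotation_field` stand-in for a learned velocity field is purely illustrative.

```python
import numpy as np

def euler_flow(points, velocity_fields, h):
    """Integrate dx/dt = v_t(x) with forward Euler.

    points:          (n, 3) array of 3D shape points.
    velocity_fields: list of callables v_k(x) -> (n, 3), one per time
                     step, mirroring the residual blocks of the network.
    h:               step size of the time discretization.
    """
    x = points
    for v in velocity_fields:
        # One Euler step == one residual block: x_{k+1} = x_k + h * v_k(x_k)
        x = x + h * v(x)
    return x

# Toy usage: a rotation field about the z-axis as a stand-in for a
# learned velocity field (purely illustrative).
def rotation_field(x):
    return np.stack([-x[:, 1], x[:, 0], np.zeros(len(x))], axis=1)

pts = np.random.rand(100, 3)
warped = euler_flow(pts, [rotation_field] * 10, h=0.05)
```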
arXiv Detail & Related papers (2021-02-16T04:07:13Z)
- Efficient Tensor Network ansatz for high-dimensional quantum many-body problems [0.0]
We introduce a novel tensor network structure augmenting the well-established Tree Tensor Network representation of a quantum many-body wave function.
We benchmark this novel approach against paradigmatic two-dimensional spin models demonstrating unprecedented precision and system sizes.
arXiv Detail & Related papers (2020-11-16T19:00:04Z)
- Connecting Weighted Automata, Tensor Networks and Recurrent Neural Networks through Spectral Learning [58.14930566993063]
We present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks.
We introduce the first provable learning algorithm for linear second-order RNNs (2-RNNs) defined over sequences of continuous input vectors.
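As background, a compact numpy sketch of the spectral-learning primitive underlying these connections: recover a weighted finite automaton from a rank-factorized Hankel matrix. The toy target f(x) = 0.5^|x| and all names here are our own illustration; the paper's provable algorithm for linear 2-RNNs over continuous vectors involves more machinery.

```python
import numpy as np

SIGMA = ["a", "b"]

# Toy target: f(x) = 0.5^|x|, realizable by a rank-1 weighted automaton.
def f(word):
    return 0.5 ** len(word)

# Hankel block H[p, s] = f(ps) over short prefixes and suffixes, plus
# the "shifted" Hankels H_sigma[p, s] = f(p sigma s).
prefixes = [""] + SIGMA
suffixes = [""] + SIGMA
H = np.array([[f(p + s) for s in suffixes] for p in prefixes])
H_sigma = {c: np.array([[f(p + c + s) for s in suffixes] for p in prefixes])
           for c in SIGMA}

# Rank-n truncated SVD gives the factorization H = (U D) Vt.
n = 1
U, sv, Vt = np.linalg.svd(H)
U, D, Vt = U[:, :n], np.diag(sv[:n]), Vt[:n, :]

# Standard spectral recovery of the automaton parameters.
P_pinv = np.linalg.pinv(U @ D)          # left-factor pseudoinverse
S_pinv = np.linalg.pinv(Vt)             # right-factor pseudoinverse
A = {c: P_pinv @ H_sigma[c] @ S_pinv for c in SIGMA}
alpha = (H[0:1, :] @ S_pinv).ravel()    # empty-prefix row -> initial weights
beta = (P_pinv @ H[:, 0:1]).ravel()     # empty-suffix column -> final weights

def f_hat(word):
    v = alpha
    for c in word:
        v = v @ A[c]
    return float(v @ beta)

print(f("ab"), f_hat("ab"))  # both ~0.25
```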
arXiv Detail & Related papers (2020-10-19T15:28:00Z)
- Controllable Orthogonalization in Training DNNs [96.1365404059924]
Orthogonality is widely used for training deep neural networks (DNNs) due to its ability to maintain all singular values of the Jacobian close to 1.
This paper proposes a computationally efficient and numerically stable orthogonalization method using Newton's iteration (ONI).
We show that our method improves the performance of image classification networks by effectively controlling the orthogonality to provide an optimal tradeoff between optimization benefits and representational capacity reduction.
We also show that ONI stabilizes the training of generative adversarial networks (GANs) by maintaining the Lipschitz continuity of a network, similar to spectral normalization.
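A minimal numpy sketch of the Newton–Schulz-type iteration that such orthogonalization builds on (our simplification, not the paper's exact algorithm): pre-scale the weight matrix so the iteration converges, iterate toward (VVᵀ)^(-1/2), and return the orthogonalized rows.

```python
import numpy as np

def orthogonalize_newton(W, n_iters=7):
    """Approximately orthogonalize the rows of W via Newton-Schulz.

    Computes (V V^T)^{-1/2} V for V = W / ||W||_F. The Frobenius-norm
    pre-scaling keeps the eigenvalues of S = V V^T in (0, 1], so the
    iteration B_{t+1} = 1.5 B_t - 0.5 B_t^3 S converges to S^{-1/2}.
    """
    V = W / np.linalg.norm(W)           # Frobenius-norm pre-scaling
    S = V @ V.T
    B = np.eye(S.shape[0])
    for _ in range(n_iters):
        B = 1.5 * B - 0.5 * (B @ B @ B) @ S
    return B @ V                        # rows are (nearly) orthonormal

W = np.random.randn(4, 16)
Q = orthogonalize_newton(W)
print(np.round(Q @ Q.T, 4))  # ~ identity
```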
arXiv Detail & Related papers (2020-04-02T10:14:27Z)
- Multi-Objective Matrix Normalization for Fine-grained Visual Recognition [153.49014114484424]
Bilinear pooling achieves great success in fine-grained visual recognition (FGVC).
Recent methods have shown that the matrix power normalization can stabilize the second-order information in bilinear features.
We propose an efficient Multi-Objective Matrix Normalization (MOMN) method that can simultaneously normalize a bilinear representation in terms of square-root, low-rank, and sparsity.
arXiv Detail & Related papers (2020-03-30T08:40:35Z)
- Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime of $O(mp^2)$ in the case of $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
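For intuition, a generic sketch of the multiplicative-weights primitive (the classic Hedge update) that such algorithms build on; the paper's specific estimator for graphical-model neighborhoods involves considerably more, so treat this purely as background.

```python
import numpy as np

def hedge(losses, eta=0.5):
    """Multiplicative weight updates over p experts.

    losses: (T, p) array; losses[t, i] in [0, 1] is expert i's loss at
            round t. Returns the final normalized weight vector.
    """
    T, p = losses.shape
    w = np.ones(p)
    for t in range(T):
        # Each weight is multiplied by exp(-eta * loss); repeated
        # application concentrates mass on low-loss experts.
        w *= np.exp(-eta * losses[t])
        w /= w.sum()  # normalize online: one O(p) update per round
    return w

rng = np.random.default_rng(0)
L = rng.random((100, 5))
L[:, 2] *= 0.2        # expert 2 is consistently better
print(hedge(L))       # weight concentrates on index 2
```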
arXiv Detail & Related papers (2020-02-20T10:50:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.