Generating Function for Tensor Network Diagrammatic Summation
- URL: http://arxiv.org/abs/2101.03935v2
- Date: Fri, 28 May 2021 15:41:01 GMT
- Title: Generating Function for Tensor Network Diagrammatic Summation
- Authors: Wei-Lin Tu, Huan-Kuang Wu, Norbert Schuch, Naoki Kawashima, Ji-Yao
Chen
- Abstract summary: We introduce a set of generating functions, which encode the diagrammatic summations as leading-order series-expansion coefficients.
We illustrate this scheme by computing variational excited states and the dynamical structure factor of a quantum spin chain.
- Score: 0.24499092754102875
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The understanding of complex quantum many-body systems has been vastly
boosted by tensor network (TN) methods. Among other applications, excitation
spectra and long-range interacting systems can be studied using TNs, where one,
however, confronts an intricate summation over an extensive number of tensor
diagrams. Here, we introduce a set of generating functions, which encode the
diagrammatic summations as leading-order series-expansion coefficients. Combined
with automatic differentiation, the generating functions allow us to solve the
problem of TN diagrammatic summation. We illustrate this scheme by computing
variational excited states and the dynamical structure factor of a quantum spin
chain, and by further investigating entanglement properties of excited states.
Extensions to infinite-size systems and higher dimensions are outlined.
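The core trick can be sketched in miniature (a toy illustration only, with all names and numbers invented here; this is not the authors' tensor-network code): a sum of "replace one tensor at a time" diagrams is exactly the first-order Taylor coefficient of a generating function G(λ) = ∏_k (a_k + λ b_k), which forward-mode automatic differentiation extracts in a single pass.

```python
# Toy sketch: encode a diagrammatic sum as a first-order series
# coefficient and extract it with forward-mode autodiff (dual numbers).

class Dual:
    """Dual number val + eps*ε with ε² = 0; eps tracks the first derivative."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    __rmul__ = __mul__

def generating_function(lam, factors):
    # G(lam) = prod_k (a_k + lam * b_k).  Its derivative at lam = 0 is
    # sum_j b_j * prod_{k != j} a_k: every way of inserting one "b"
    # factor into the product, summed -- the structure of the
    # diagrammatic sums the paper targets.
    out = 1.0
    for a, b in factors:
        out = out * (a + lam * b)
    return out

# Scalar stand-ins for tensors: (a_k, b_k) pairs, chosen arbitrarily.
factors = [(2.0, 1.0), (3.0, 5.0), (4.0, 7.0)]
g = generating_function(Dual(0.0, 1.0), factors)  # expand around lam = 0
print(g.val)  # zeroth order: 2*3*4 = 24.0
print(g.eps)  # first order:  1*3*4 + 5*2*4 + 7*2*3 = 94.0
```

In the paper the factors are tensors in a network rather than scalars, but the mechanism is the same: one autodiff evaluation of the generating function replaces the explicit sum over an extensive number of diagrams.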
Related papers
- Dynamics and Geometry of Entanglement in Many-Body Quantum Systems [0.0]
A new framework is formulated to study entanglement dynamics in many-body quantum systems.
The Quantum Correlation Transfer Function (QCTF) is transformed into a new space of complex functions with isolated singularities.
The QCTF-based geometric description offers the prospect of theoretically revealing aspects of many-body entanglement.
arXiv Detail & Related papers (2023-08-18T19:16:44Z)
- Generating function for projected entangled-pair states [0.1759252234439348]
We extend the generating function approach for tensor network diagrammatic summation.
Taking the form of a one-particle excitation, we show that the excited state can be computed efficiently in the generating function formalism.
We conclude with a discussion on generalizations to multi-particle excitations.
arXiv Detail & Related papers (2023-07-16T15:49:37Z)
- Convolutions Through the Lens of Tensor Networks [2.6397379133308214]
We provide a new perspective on convolutions through tensor networks (TNs).
TNs allow reasoning about the underlying tensor multiplications by drawing diagrams and manipulating them to perform function transformations, sub-tensor access, and fusion.
We demonstrate this expressive power by deriving the diagrams of various autodiff operations and popular approximations of second-order information.
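The claim that convolutions are tensor contractions can be made concrete with a minimal sketch (assuming NumPy; the signal and kernel here are invented, not taken from the paper): a 1-D "valid" cross-correlation, out[o] = Σ_k x[o+k]·w[k], is an einsum over a sliding-window view of the input.

```python
import numpy as np

# Minimal sketch: a 1-D "valid" cross-correlation written as a tensor
# contraction, out[o] = sum_k x[o + k] * w[k].
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # input signal (invented)
w = np.array([1.0, 0.0, -1.0])            # kernel (invented)

# View the input as a (num_windows, kernel_size) tensor of sliding
# windows, then contract the shared kernel index k with einsum.
windows = np.lib.stride_tricks.sliding_window_view(x, w.shape[0])
out = np.einsum('ok,k->o', windows, w)

print(out)                               # [-2. -2. -2.]
print(np.correlate(x, w, mode='valid'))  # same result
```

Drawn as a diagram, `windows` and `w` are two nodes joined by the contracted leg `k`; higher-dimensional convolutions follow the same pattern with more legs.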
arXiv Detail & Related papers (2023-07-05T13:19:41Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- A tensor network representation of path integrals: Implementation and analysis [0.0]
We introduce a novel tensor network-based decomposition of path integral simulations involving the Feynman-Vernon influence functional.
The finite temporally non-local interactions introduced by the influence functional can be captured very efficiently using a matrix product state representation.
The flexibility of the AP-TNPI framework makes it a promising new addition to the family of path integral methods for non-equilibrium quantum dynamics.
arXiv Detail & Related papers (2021-06-23T16:41:54Z)
- Tensor Representations for Action Recognition [54.710267354274194]
Human actions in sequences are characterized by the complex interplay between spatial features and their temporal dynamics.
We propose novel tensor representations for capturing higher-order relationships between visual features for the task of action recognition.
We use higher-order tensors and so-called Eigenvalue Power Normalization (EPN), which has long been speculated to perform spectral detection of higher-order occurrences.
arXiv Detail & Related papers (2020-12-28T17:27:18Z)
- Continuous-in-Depth Neural Networks [107.47887213490134]
We first show that ResNets fail to be meaningful dynamical systems in this richer sense.
We then demonstrate that neural network models can learn to represent continuous dynamical systems.
We introduce ContinuousNet as a continuous-in-depth generalization of ResNet architectures.
arXiv Detail & Related papers (2020-08-05T22:54:09Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Highly entangled spin chains and 2D quantum gravity [0.0]
Motzkin and Fredkin spin chains exhibit an extraordinary amount of entanglement, scaling as the square root of the volume.
We introduce large-N matrix models with so-called ABAB interactions, in which correlation functions reproduce the entanglement scaling in tree and planar Feynman diagrams.
arXiv Detail & Related papers (2020-05-01T07:43:57Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.