Fermionic tensor network methods
- URL: http://arxiv.org/abs/2404.14611v2
- Date: Thu, 21 Nov 2024 16:14:26 GMT
- Title: Fermionic tensor network methods
- Authors: Quinten Mortier, Lukas Devos, Lander Burgelman, Bram Vanhecke, Nick Bultinck, Frank Verstraete, Jutho Haegeman, Laurens Vanderstraeten,
- Abstract summary: We show how fermionic statistics can be naturally incorporated in tensor networks on arbitrary graphs through the use of graded Hilbert spaces.
This formalism allows tensor network methods to be applied to fermionic lattice systems in a local way, avoiding the need for a Jordan-Wigner transformation or the explicit tracking of leg crossings by swap gates in 2D tensor networks.
- Abstract: We show how fermionic statistics can be naturally incorporated in tensor networks on arbitrary graphs through the use of graded Hilbert spaces. This formalism allows tensor network methods to be applied to fermionic lattice systems in a local way, avoiding the need for a Jordan-Wigner transformation or the explicit tracking of leg crossings by swap gates in 2D tensor networks. The graded Hilbert spaces can be readily integrated with other internal and lattice symmetries in tensor networks, and require only minor extensions to an existing tensor network software package. We review and benchmark the fermionic versions of common algorithms for matrix product states and projected entangled-pair states.
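To make the grading concrete, here is a minimal numpy sketch (an illustration of the idea only, not the authors' implementation, which extends a full tensor network package): every tensor leg carries a parity label, and exchanging two legs attaches the fermionic sign (-1)^{|a||b|}, i.e. a minus sign exactly on odd-odd index pairs.

```python
import numpy as np

# Z2-graded leg: each basis state of a leg carries a parity label
# (0 = even, 1 = odd). For one fermionic mode with basis {|0>, |1>}:
parity = np.array([0, 1])

def fswap(T, ax1, ax2, p):
    """Exchange two legs of a fermionic tensor: transpose the axes and
    attach the sign (-1)^{p_a p_b}, i.e. a minus sign on odd-odd pairs."""
    T = np.swapaxes(T, ax1, ax2)
    sign = (-1.0) ** np.outer(p, p)            # (-1)^{p_a p_b}
    Tm = np.moveaxis(T, (ax1, ax2), (0, 1))    # bring the two legs to front
    Tm = Tm * sign.reshape(sign.shape + (1,) * (Tm.ndim - 2))
    return np.moveaxis(Tm, (0, 1), (ax1, ax2))

# Check: the doubly occupied state |11> = a1+ a2+ |00> picks up a minus
# sign when the two modes are exchanged, as anticommutation requires.
psi = np.zeros((2, 2))
psi[1, 1] = 1.0
print(fswap(psi, 0, 1, parity)[1, 1])          # -> -1.0
```

In a graded framework such signs are generated automatically whenever legs are permuted or contracted, which is what removes the need for global Jordan-Wigner strings or hand-placed swap gates.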
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, the tensor cumulants, which provide an explicit, near-orthogonal basis for invariants of a given degree.
This basis also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - One-step replica symmetry breaking in the language of tensor networks [0.913755431537592]
We develop an exact mapping between the one-step replica symmetry breaking cavity method and tensor networks.
The two schemes come with complementary mathematical and numerical toolboxes that could be leveraged to improve the respective states of the art.
arXiv Detail & Related papers (2023-06-26T18:42:51Z) - Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
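To illustrate what representing data "beyond meshgrid" means (a sketch in which interpolated grid factors stand in for the paper's neural-network-parameterized factor functions), a low-rank tensor function f(x, y) = sum_r u_r(x) v_r(y) can be queried at arbitrary off-grid coordinates:

```python
import numpy as np

# A rank-2 "tensor function": f(x, y) = sum_r u_r(x) * v_r(y).
# LRTFR learns the factor functions with neural networks; as a stand-in
# we store each factor on a coarse grid and interpolate, which already
# gives a representation defined at arbitrary (off-meshgrid) coordinates.
grid = np.linspace(0.0, 1.0, 16)                 # coarse sampling grid
U = np.stack([np.sin(np.pi * grid), grid**2])    # u_r sampled on the grid
V = np.stack([np.cos(np.pi * grid), 1.0 - grid]) # v_r sampled on the grid

def f(x, y):
    """Evaluate the low-rank tensor function at arbitrary coordinates by
    interpolating each factor function and summing over the rank."""
    u = np.array([np.interp(x, grid, U[r]) for r in range(U.shape[0])])
    v = np.array([np.interp(y, grid, V[r]) for r in range(V.shape[0])])
    return np.sum(u * v)

# The same object yields both the on-grid tensor and off-grid values:
T = np.einsum('ri,rj->ij', U, V)      # the discrete 16x16 tensor
print(T[3, 5], f(grid[3], grid[5]))   # identical on grid points
print(f(0.123, 0.777))                # arbitrary-resolution query
```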
arXiv Detail & Related papers (2022-12-01T04:00:38Z) - Random tensor networks with nontrivial links [1.9440833697222828]
We initiate a systematic study of the entanglement properties of random tensor networks.
We employ tools from free probability, random matrix theory, and one-shot quantum information theory.
We draw connections to previous work on split transfer protocols, entanglement negativity in random tensor networks, and Euclidean path integrals in quantum gravity.
arXiv Detail & Related papers (2022-06-21T15:49:29Z) - Unified Fourier-based Kernel and Nonlinearity Design for Equivariant
Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods, which treat features as Fourier coefficients in the stabilizer subgroup, are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z) - Quantum Annealing Algorithms for Boolean Tensor Networks [0.0]
We introduce and analyze three general algorithms for Boolean tensor networks.
We show that these can be expressed as quadratic unconstrained binary optimization (QUBO) problems suitable for solving on a quantum annealer.
We demonstrate that tensors with up to millions of elements can be decomposed efficiently using a D-Wave 2000Q quantum annealer.
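To illustrate the flavor of such a mapping (a toy construction, not one of the paper's three algorithms): rank-1 Boolean matrix factorization X ≈ outer(a, b) already reduces to a QUBO, since for binary variables (X_ij - a_i b_j)^2 = X_ij + (1 - 2 X_ij) a_i b_j, so the cost is a constant plus pairwise couplings. A brute-force search stands in for the annealer below.

```python
import itertools
import numpy as np

# Toy QUBO for rank-1 Boolean matrix factorization X ~ outer(a, b):
# for binary entries, sum_ij (X_ij - a_i b_j)^2
#   = sum_ij X_ij + sum_ij (1 - 2 X_ij) a_i b_j,
# i.e. a constant plus quadratic couplings between a_i and b_j.
X = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 0]])
m, n = X.shape
Q = 1 - 2 * X          # coupling Q[i, j] between variables a_i and b_j

def cost(a, b):
    """QUBO energy: reconstruction error up to the constant sum(X)."""
    return int(a @ Q @ b)

# Brute force over all 2^(m+n) assignments (stand-in for the annealer).
energy, a, b = min(
    (cost(np.array(a), np.array(b)), a, b)
    for a in itertools.product([0, 1], repeat=m)
    for b in itertools.product([0, 1], repeat=n)
)
print("a =", a, "b =", b, "error =", energy + X.sum())   # error = 0
```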
arXiv Detail & Related papers (2021-07-28T22:38:18Z) - Optimization at the boundary of the tensor network variety [2.1839191255085995]
Tensor network states form a variational ansatz class widely used in the study of quantum many-body systems.
Recent work has shown that states on the boundary of this variety can yield more efficient representations for states of physical interest.
We show how to optimize over this class in order to find ground states of local Hamiltonians.
arXiv Detail & Related papers (2020-06-30T16:58:55Z) - Solving frustrated Ising models using tensor networks [0.0]
We develop a framework to study frustrated Ising models in terms of infinite tensor networks.
We show that optimizing the choice of clusters, including the weight on shared bonds, is crucial for the contractibility of the tensor networks.
We illustrate the power of the method by computing the residual entropy of a frustrated Ising spin system on the kagome lattice with next-next-nearest neighbour interactions.
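As a minimal illustration of the underlying construction (a standard textbook example, not the paper's cluster-based method), the partition function of a classical Ising model is a contraction of local Boltzmann-weight tensors; for a 1D ring it is the trace of a product of bond matrices, which can be checked against the transfer-matrix result.

```python
import numpy as np

# Partition function of a ferromagnetic Ising ring of N spins at
# coupling J and inverse temperature beta, written as a tensor network:
# one 2x2 Boltzmann-weight tensor per bond, contracted around the ring.
beta, J, N = 0.7, 1.0, 10
s = np.array([1.0, -1.0])
M = np.exp(beta * J * np.outer(s, s))   # M[s, s'] = exp(beta*J*s*s')

# Contract the ring: Z = Tr(M^N).
T = np.eye(2)
for _ in range(N):
    T = T @ M
Z_network = np.trace(T)

# Transfer-matrix check: Z = lambda_1^N + lambda_2^N.
lam = np.linalg.eigvalsh(M)
print(Z_network, np.sum(lam ** N))      # the two values agree
```

Frustrated 2D models such as the kagome system require genuinely two-dimensional contraction and the careful cluster-weight choices the abstract mentions; this 1D example only shows how Boltzmann weights become tensors.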
arXiv Detail & Related papers (2020-06-25T12:39:42Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric
graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.