Reduced Contraction Costs of Corner-Transfer Methods for PEPS
- URL: http://arxiv.org/abs/2306.08212v1
- Date: Wed, 14 Jun 2023 02:54:12 GMT
- Title: Reduced Contraction Costs of Corner-Transfer Methods for PEPS
- Authors: Wangwei Lan, Glen Evenbly
- Abstract summary: We propose a pair of approximations that allows the leading order computational cost of contracting an infinite projected entangled-pair state to be reduced.
The improvement in computational cost enables us to perform large bond dimension calculations, extending its potential to solve challenging problems.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a pair of approximations that allows the leading order
computational cost of contracting an infinite projected entangled-pair state
(iPEPS) to be reduced from $\mathcal{O}(\chi^3D^6)$ to $\mathcal{O}(\chi^3D^3)$
when using a corner-transfer approach. The first approximation (i) reduces the
environment needed for truncation of the boundary tensors, while the second
(ii) relies on the sequential contraction and truncation of bra and ket
indices, rather than doing both together as with the established algorithm. To
verify the algorithm, we perform benchmark simulations of the square-lattice
Heisenberg model and obtain results that are comparable to those of the
standard iPEPS algorithm.
The improvement in computational cost enables us to perform large bond
dimension calculations, extending its potential to solve challenging problems.
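To illustrate the flavor of approximation (ii), below is a minimal NumPy sketch. It is a toy example with invented tensor names and shapes, not the paper's actual corner-transfer update: it only shows why absorbing the ket and bra layers of a PEPS tensor into a boundary tensor one after the other keeps the intermediates small, whereas contracting a fused double-layer index of dimension $D^2$ in a single step does not.

```python
import numpy as np

# Toy dimensions (hypothetical, kept small so the example runs instantly):
# chi -- environment/boundary bond dimension, D -- PEPS bond dimension, d -- physical dimension
chi, D, d = 6, 3, 2

rng = np.random.default_rng(0)
B = rng.normal(size=(chi, D, D))  # toy boundary tensor: (environment, ket bond, bra bond)
A = rng.normal(size=(D, D, d))    # toy single-layer PEPS tensor: (bond in, bond out, physical)

# Fused route ("both together"): build the double-layer tensor first, then
# contract it with the boundary in one step.
#   double-layer build ~ O(D^4 d) ops, boundary contraction ~ O(chi D^4) ops
a2 = np.einsum('kKs,bBs->kbKB', A, np.conj(A))        # shape (D, D, D, D): a fused D^2 x D^2 object
fused = np.einsum('ckb,kbKB->cKB', B, a2)

# Sequential route: absorb the ket layer, then the bra layer.
#   each step ~ O(chi D^3 d) ops; the largest intermediate holds only chi * D^2 * d entries
t1 = np.einsum('ckb,kKs->cKbs', B, A)                 # ket layer absorbed
sequential = np.einsum('cKbs,bBs->cKB', t1, np.conj(A))  # bra layer absorbed

# Both orderings produce the same updated boundary tensor, but the sequential
# one never forms a D^2-dimensional fused index.
assert np.allclose(fused, sequential)
print(fused.shape)  # (chi, D, D)
```

In a genuine iPEPS environment contraction the tensors carry more legs and the dimensions are larger, so the analogous savings are correspondingly bigger; the paper's specific truncation scheme and its $\mathcal{O}(\chi^3D^3)$ scaling are detailed in the full text.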
Related papers
- Obtaining Lower Query Complexities through Lightweight Zeroth-Order Proximal Gradient Algorithms [65.42376001308064]
We propose two variance reduced ZO estimators for complex gradient problems.
We improve the state-of-the-art function complexities from $\mathcal{O}\left(\min\left\{\frac{d n^{1/2}}{\epsilon^2}, \frac{d}{\epsilon^3}\right\}\right)$ to $\tilde{\mathcal{O}}\left(\frac{d}{\epsilon^2}\right)$.
arXiv Detail & Related papers (2024-10-03T15:04:01Z)
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we present a robust phase retrieval problem where the task is to recover an unknown signal.
Our proposed oracle avoids the need for a computationally expensive spectral method, using a simple gradient step and handling outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- $A^*$ for Graphs of Convex Sets [7.9756690088226145]
We present a novel algorithm that fuses the existing convex-programming based approach with heuristic information to obtain optimality guarantees.
Our method, inspired by $A^*$, initiates a best-first-like procedure from a designated subset of vertices and iteratively expands it until further growth is neither possible nor beneficial.
arXiv Detail & Related papers (2024-07-24T16:48:32Z)
- Efficiently Escaping Saddle Points for Non-Convex Policy Optimization [40.0986936439803]
Policy gradient (PG) is widely used in reinforcement learning due to its scalability and good performance.
We propose a variance-reduced second-order method that uses second-order information in the form of Hessian vector products (HVP) and converges to an approximate second-order stationary point (SOSP) with sample complexity of $\tilde{O}(\epsilon^{-3})$.
arXiv Detail & Related papers (2023-11-15T12:36:45Z)
- Stochastic Optimization for Non-convex Problem with Inexact Hessian Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics (ARC) have proven to have some very appealing theoretical properties.
We show that TR and ARC methods can simultaneously allow for inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z)
- Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions serves as an algorithm subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
arXiv Detail & Related papers (2023-01-30T15:46:39Z)
- An Accelerated Stochastic Algorithm for Solving the Optimal Transport Problem [2.1485350418225244]
A primal-dual accelerated stochastic gradient descent with variance reduction algorithm (PDASGD) is proposed to solve linearly constrained optimization problems.
PDASGD enjoys the best-known computational complexity -- $\widetilde{\mathcal{O}}(n^2/\epsilon)$, where $n$ is the number of atoms and $\epsilon>0$ is the accuracy.
arXiv Detail & Related papers (2022-03-02T01:16:10Z)
- A stochastic linearized proximal method of multipliers for convex stochastic optimization with expectation constraints [8.133190610747974]
We present a computable approximation-type algorithm, namely the stochastic linearized proximal method of multipliers.
Some preliminary numerical results demonstrate the performance of the proposed algorithm.
arXiv Detail & Related papers (2021-06-22T07:24:17Z)
- Correcting Momentum with Second-order Information [50.992629498861724]
We develop a new algorithm for non-convex stochastic optimization that finds an $\epsilon$-critical point using the optimal number of stochastic gradient and Hessian-vector product computations.
We validate our results on a variety of large-scale deep learning benchmarks and architectures.
arXiv Detail & Related papers (2021-03-04T19:01:20Z)
- Lagrangian Decomposition for Neural Network Verification [148.0448557991349]
A fundamental component of neural network verification is the computation of bounds on the values their outputs can take.
We propose a novel approach based on Lagrangian Decomposition.
We show that we obtain bounds comparable with off-the-shelf solvers in a fraction of their running time.
arXiv Detail & Related papers (2020-02-24T17:55:10Z)