Computing the renormalization group flow of two-dimensional $\phi^4$ theory with tensor networks
- URL: http://arxiv.org/abs/2003.12993v1
- Date: Sun, 29 Mar 2020 10:31:26 GMT
- Title: Computing the renormalization group flow of two-dimensional $\phi^4$ theory with tensor networks
- Authors: Clement Delcamp, Antoine Tilloy
- Abstract summary: We study the renormalization group flow of $\phi^4$ theory in two dimensions.
Regularizing space into a fine-grained lattice and discretizing the scalar field in a controlled way, we rewrite the partition function of the theory as a tensor network.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the renormalization group flow of $\phi^4$ theory in two dimensions.
Regularizing space into a fine-grained lattice and discretizing the scalar
field in a controlled way, we rewrite the partition function of the theory as a
tensor network. Combining local truncations and a standard coarse-graining
scheme, we obtain the renormalization group flow of the theory as a map in a
space of tensors. Aside from qualitative insights, we verify the scaling
dimensions at criticality and extrapolate the critical coupling constant
$f_{\rm c} = \lambda / \mu ^2$ to the continuum to find $f^{\rm cont.}_{\rm c}
= 11.0861(90)$, which compares favorably with estimates from alternative methods.
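As a concrete illustration of the pipeline the abstract describes, the following minimal numpy sketch builds a rank-4 site tensor for a lattice scalar field with a quartic potential and applies Levin-Nave TRG coarse-graining steps with local SVD truncations. The uniform field grid, the bond/site splitting of the Boltzmann weight, and all parameter values are illustrative stand-ins; the paper's controlled field discretization and its truncation scheme differ in detail.

```python
import numpy as np

def site_tensor(n_phi=8, phi_max=3.0, mu2=0.1, lam=0.05):
    """Rank-4 tensor T[u, l, d, r] for a 2D lattice scalar field digitized to
    n_phi grid points (a crude stand-in for a controlled quadrature
    discretization).  The nearest-neighbour bond weight
    exp(-(phi - phi')^2 / 2) is split symmetrically via an eigendecomposition
    and the on-site weight is attached to the site."""
    phi = np.linspace(-phi_max, phi_max, n_phi)
    bond = np.exp(-0.5 * (phi[:, None] - phi[None, :]) ** 2)
    evals, evecs = np.linalg.eigh(bond)              # bond = A @ A.T
    A = evecs * np.sqrt(np.clip(evals, 0, None))     # clip tiny negative evals
    w = (phi[1] - phi[0]) * np.exp(-0.5 * mu2 * phi**2 - lam * phi**4)
    return np.einsum('p,pu,pl,pd,pr->uldr', w, A, A, A, A)

def trg_step(T, chi=24):
    """One Levin-Nave TRG step: split T along its two diagonals with truncated
    SVDs, then contract the four half-tensors around a plaquette into a
    coarse tensor on a 45-degree rotated lattice."""
    D = T.shape[0]

    def split(M):
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        k = min(chi, s.size)
        return ((U[:, :k] * np.sqrt(s[:k])).reshape(D, D, k),
                (np.sqrt(s[:k])[:, None] * Vh[:k]).reshape(k, D, D))

    S1, S2 = split(T.reshape(D * D, D * D))                        # (u,l)|(d,r)
    S3, S4 = split(T.transpose(1, 2, 3, 0).reshape(D * D, D * D))  # (l,d)|(r,u)
    # New legs point toward the plaquette corners; relabel as (u, l, d, r).
    return np.einsum('iab,bcj,cek,mea->imkj', S2, S3, S1, S4, optimize=True)

T = site_tensor()
for _ in range(8):                # each step halves the number of sites
    T /= np.abs(T).max()          # rescale; accumulate logs for free energies
    T = trg_step(T)
```

Iterating this map and watching where the rescaled tensor flows is what "the renormalization group flow as a map in a space of tensors" refers to; the local truncation keeps the bond dimension, and hence the cost, bounded.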
Related papers
- Field digitization scaling in a $\mathbb{Z}_N \subset U(1)$ symmetric model [0.0]
We propose to analyze field digitization by interpreting the parameter $N$ as a coupling in the renormalization group sense. Using effective field theory, we derive generalized scaling hypotheses involving the field-digitization parameter $N$. We analytically prove that our calculations for the 2D classical-statistical $\mathbb{Z}_N$ clock model are directly related to the quantum physics in the ground state of a (2+1)D $\mathbb{Z}_N$ lattice gauge theory.
arXiv Detail & Related papers (2025-07-30T18:00:02Z)
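The field digitization in the entry above has a compact tensor-network expression: restrict the U(1) angle to $N$ values and build the same kind of rank-4 site tensor as in the sketch for the main paper. The snippet below does this for the 2D classical $\mathbb{Z}_N$ clock model; the resulting tensor can be fed directly to a coarse-graining routine like the `trg_step` sketched earlier, with $N$ playing the role of the digitization parameter ($N \to \infty$ recovering the U(1)-symmetric XY model). This is a generic construction, not the paper's specific setup.

```python
import numpy as np

def clock_tensor(N, beta):
    """Rank-4 site tensor T[u, l, d, r] for the 2D classical Z_N clock model.
    The field is digitized to the angles 2*pi*k/N; the bond weight
    exp(beta * cos(theta - theta')) is positive semidefinite, so it splits
    symmetrically exactly like the scalar-field bond above."""
    theta = 2 * np.pi * np.arange(N) / N
    bond = np.exp(beta * np.cos(theta[:, None] - theta[None, :]))
    evals, evecs = np.linalg.eigh(bond)
    A = evecs * np.sqrt(np.clip(evals, 0, None))   # bond = A @ A.T
    return np.einsum('pu,pl,pd,pr->uldr', A, A, A, A)

T = clock_tensor(N=5, beta=1.1)   # N is the digitization "coupling"
print(T.shape)                    # (5, 5, 5, 5)
```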
- Spiral renormalization group flow and universal entanglement spectrum of the non-Hermitian 5-state Potts model [0.06597195879147556]
We show that tensor network algorithms are still capable of simulating non-Hermitian theories. We reconstruct the full boundary CCFT spectrum through the entanglement Hamiltonian encoded in the ground state.
arXiv Detail & Related papers (2025-07-19T19:46:16Z)
- Approximation of diffeomorphisms for quantum state transfers [49.1574468325115]
We seek to combine two emerging standpoints in control theory. We numerically find control laws driving state transitions in small time in a bilinear Schrödinger PDE posed on the torus.
arXiv Detail & Related papers (2025-03-18T17:28:59Z)
- How DNNs break the Curse of Dimensionality: Compositionality and Symmetry Learning [9.302851743819339]
We show that deep neural networks (DNNs) can efficiently learn any composition of functions with bounded $F_1$-norm.
We compute scaling laws empirically and observe phase transitions depending on whether $g$ or $h$ (the two factors of the composition) is harder to learn.
arXiv Detail & Related papers (2024-07-08T06:59:29Z)
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space in which to model functions represented by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
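For a concrete handle on the kind of norm constraint the entry above is about, the sketch below evaluates a width-$m$ two-layer ReLU network together with its path norm $\sum_i |a_i| \, \|w_i\|_2$, one standard complexity measure for this architecture (related to, but not necessarily identical with, the norm the paper studies). All names and scalings are illustrative.

```python
import numpy as np

def two_layer(x, W, a):
    """f(x) = sum_i a_i * relu(<w_i, x>), a width-m two-layer network."""
    return np.maximum(W @ x, 0.0) @ a

def path_norm(W, a):
    """sum_i |a_i| * ||w_i||_2, a standard norm on two-layer ReLU networks;
    it stays O(1) under mean-field scaling even as the width grows."""
    return float(np.sum(np.abs(a) * np.linalg.norm(W, axis=1)))

rng = np.random.default_rng(0)
d, m = 10, 4096                        # input dimension, width
W = rng.normal(size=(m, d)) / np.sqrt(d)
a = rng.normal(size=m) / m             # mean-field scaling keeps the norm O(1)
x = rng.normal(size=d)
print(two_layer(x, W, a), path_norm(W, a))
```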
- Renormalization group for measurement and entanglement phase transitions [0.0]
We analyze the renormalization-group (RG) flows of two effective Lagrangians.
We show that the theory for the random tensor network formally possesses a dimensional reduction property analogous to that of the random-field Ising model.
arXiv Detail & Related papers (2023-03-14T12:40:03Z)
- Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and Group Convolution [90.67482899242093]
A wide range of techniques have been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of both methods and their equivalence and relate the two constructions to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z)
- Symmetry-resolved entanglement entropy in critical free-fermion chains [0.0]
The symmetry-resolved Rényi entanglement entropy is known to have rich theoretical connections to conformal field theory.
We consider a class of critical quantum chains with a microscopic U(1) symmetry.
For the reduced density matrix $\rho_A$ of subsystems of $L$ neighbouring sites, we calculate the leading terms in the large-$L$ expansion of the symmetry-resolved Rényi entanglement entropies.
arXiv Detail & Related papers (2022-02-23T19:00:03Z)
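Such symmetry-resolved entropies are directly computable for a concrete critical chain. The sketch below uses Peschel's correlation-matrix method and the standard free-fermion charged-moment formula for a half-filled tight-binding chain; the specific model and the flux-grid bookkeeping are illustrative choices, not necessarily those of the paper.

```python
import numpy as np

def subsystem_correlations(L):
    """<c_x^dag c_y> restricted to L neighbouring sites of an infinite,
    half-filled tight-binding chain (a standard critical free-fermion model):
    C_xy = sin(pi (x - y) / 2) / (pi (x - y)), with C_xx = 1/2."""
    d = np.arange(L)[:, None] - np.arange(L)[None, :]
    C = np.sin(0.5 * np.pi * d) / (np.pi * np.where(d == 0, 1, d))
    np.fill_diagonal(C, 0.5)
    return C

def symmetry_resolved_renyi(L, n=2):
    """Charge-resolved Renyi entropies S_n(q) from the charged moments
    Z_m(alpha) = prod_k [(1 - nu_k)^m + nu_k^m e^{i alpha}], where nu_k are
    eigenvalues of the subsystem correlation matrix, q is the subsystem U(1)
    charge, and S_n(q) = log(Z_n(q) / Z_1(q)^n) / (1 - n)."""
    nu = np.clip(np.linalg.eigvalsh(subsystem_correlations(L)), 1e-12, 1 - 1e-12)
    M = L + 1                                  # enough flux points for q = 0..L
    alpha = 2 * np.pi * np.arange(M) / M
    Z = lambda m: np.prod((1 - nu)**m + nu**m * np.exp(1j * alpha[:, None]), axis=1)
    q = np.arange(M)
    ft = lambda z: (z * np.exp(-1j * q[:, None] * alpha)).mean(axis=1).real
    Znq, Z1q = ft(Z(n)), ft(Z(1))
    keep = (Z1q > 1e-10) & (Znq > 0)           # charge sectors with real weight
    return q[keep], np.log(Znq[keep] / Z1q[keep]**n) / (1 - n)

qs, S2q = symmetry_resolved_renyi(40, n=2)
print(dict(zip(qs.tolist(), np.round(S2q, 3))))  # near-equipartition across q
```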
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations [0.0]
We construct deep operator networks (ONets) between infinite-dimensional spaces that emulate the coefficient-to-solution map of elliptic second-order PDEs with an exponential rate of convergence.
In particular, we consider problems set in $d$-dimensional periodic domains, $d = 1, 2, \dots$, and with analytic right-hand sides and coefficients.
We prove that the neural networks in the ONet have size $\mathcal{O}(\left|\log(\varepsilon)\right|^\kappa)$ for some $\kappa$, where $\varepsilon$ is the approximation accuracy.
arXiv Detail & Related papers (2021-12-15T13:56:28Z)
- Entanglement scaling for $\lambda\phi_2^4$ [0.0]
We show that the order parameter $\phi$, the correlation length $\xi$, and quantities like $\phi^3$ and the entanglement entropy exhibit useful double scaling properties.
We find the value $\alpha_c = 11.09698(31)$ for the critical point, improving on previous results.
arXiv Detail & Related papers (2021-04-21T14:43:12Z)
- Estimating 2-Sinkhorn Divergence between Gaussian Processes from Finite-Dimensional Marginals [4.416484585765028]
We study the convergence of estimating the 2-Sinkhorn divergence between Gaussian processes (GPs) using their finite-dimensional marginal distributions.
We show almost sure convergence of the divergence when the marginals are sampled according to some base measure.
arXiv Detail & Related papers (2021-02-05T16:17:55Z)
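As a simplified stand-in for the 2-Sinkhorn divergence of the entry above, the sketch below computes the exact 2-Wasserstein distance (the unregularized limit; Sinkhorn adds an entropic term) between finite-dimensional Gaussian marginals of two GPs on finer and finer grids, using the closed-form Bures expression. The kernels, grids, and quadrature scaling are illustrative choices.

```python
import numpy as np

def psd_sqrt(S):
    """Square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.sqrt(np.clip(w, 0, None))) @ V.T

def w2_gaussians(m1, S1, m2, S2):
    """Closed-form squared 2-Wasserstein distance between N(m1,S1), N(m2,S2):
    ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S1^1/2 S2 S1^1/2)^1/2)."""
    R = psd_sqrt(S1)
    cross = psd_sqrt(R @ S2 @ R)
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2 * cross))

def rbf_marginal(ts, ell):
    """Covariance of the GP marginal on grid ts, with quadrature weight 1/n so
    the matrix approximates the covariance operator on L^2([0, 1])."""
    d = ts[:, None] - ts[None, :]
    return np.exp(-0.5 * (d / ell) ** 2) / len(ts)

# Marginals of two centred GPs with different lengthscales: the distance
# between finite-dimensional marginals stabilizes as the grid is refined,
# the kind of convergence-from-marginals statement the entry is about.
for n in (10, 50, 250):
    ts = np.linspace(0.0, 1.0, n)
    z = np.zeros(n)
    print(n, w2_gaussians(z, rbf_marginal(ts, 0.2), z, rbf_marginal(ts, 0.5)))
```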
- Topological Quantum Gravity of the Ricci Flow [62.997667081978825]
We present a family of topological quantum gravity theories associated with the geometric theory of the Ricci flow.
First, we use BRST quantization to construct a "primitive" topological Lifshitz-type theory for only the spatial metric.
We extend the primitive theory by gauging foliation-preserving spacetime symmetries.
arXiv Detail & Related papers (2020-10-29T06:15:30Z)
- Beyond Lazy Training for Over-parameterized Tensor Decomposition [69.4699995828506]
We show that gradient descent on an over-parameterized objective can go beyond the lazy training regime and utilize certain low-rank structure in the data.
arXiv Detail & Related papers (2020-10-22T00:32:12Z)
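The phenomenon in the entry above can be reproduced in miniature: run plain gradient descent from a small initialization on an over-parameterized symmetric rank-1 tensor decomposition and watch it lock onto the low-rank structure rather than stay in the lazy (linearized) regime. Dimensions, step size, and initialization scale below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r_over = 8, 10                      # dimension; components >> true rank 1

a_star = rng.normal(size=d)
a_star /= np.linalg.norm(a_star)
T = np.einsum('i,j,k->ijk', a_star, a_star, a_star)   # rank-1 target tensor

A = 0.05 * rng.normal(size=(r_over, d))  # small init: outside the lazy regime
lr = 0.05
for _ in range(3000):
    E = np.einsum('ri,rj,rk->ijk', A, A, A) - T       # residual tensor
    G = 6 * np.einsum('ijk,ri,rj->rk', E, A, A)       # gradient of ||E||_F^2
    A -= lr * G

approx = np.einsum('ri,rj,rk->ijk', A, A, A)
print(np.linalg.norm(approx - T))      # small: low-rank structure recovered
```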
- A deep network construction that adapts to intrinsic dimensionality beyond the domain [79.23797234241471]
We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation.
We focus on two intuitive and practically relevant choices for $\phi$: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets.
arXiv Detail & Related papers (2020-08-06T09:50:29Z)
- Linear Time Sinkhorn Divergences using Positive Features [51.50788603386766]
Solving optimal transport with an entropic regularization requires computing an $n \times n$ kernel matrix that is repeatedly applied to a vector.
We propose to use instead ground costs of the form $c(x,y) = -\log\langle \varphi(x), \varphi(y) \rangle$ where $\varphi$ is a map from the ground space onto the positive orthant $\mathbb{R}^r_+$, with $r \ll n$.
arXiv Detail & Related papers (2020-06-12T10:21:40Z)
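The point of the cost in the entry above is that, at unit regularization, the Gibbs kernel $e^{-c(x,y)} = \langle \varphi(x), \varphi(y) \rangle$ is exactly the rank-$r$ matrix $\Phi_x \Phi_y^\top$, so each Sinkhorn update reduces to two thin matrix-vector products. The sketch below uses an arbitrary entrywise-positive feature map (softmax of random projections) purely for illustration; the paper constructs specific features.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 2000, 1500, 32

def positive_features(X, W):
    """Entrywise-positive features (softmax of random projections); purely
    illustrative -- any map into the positive orthant R^r_+ works here."""
    Z = X @ W
    Z = np.exp(Z - Z.max(axis=1, keepdims=True))
    return Z / Z.sum(axis=1, keepdims=True)

X, Y = rng.normal(size=(n, 5)), rng.normal(size=(m, 5))
W = rng.normal(size=(5, r))
Phi_x, Phi_y = positive_features(X, W), positive_features(Y, W)

a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)   # source/target marginals
u, v = np.ones(n), np.ones(m)
for _ in range(300):
    # With c(x,y) = -log<phi(x), phi(y)>, the Gibbs kernel is K = Phi_x Phi_y^T,
    # so K @ v and K.T @ u cost O((n + m) r) instead of O(n m).
    u = a / (Phi_x @ (Phi_y.T @ v))
    v = b / (Phi_y @ (Phi_x.T @ u))

row = u * (Phi_x @ (Phi_y.T @ v))    # row marginals of the transport plan
print(np.abs(row - a).max())         # ~0 at convergence
```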
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
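Not the paper's stochastic-flow algorithms, but the standard geometric ingredient such methods build on fits in a few lines: a gradient step mapped back onto $O(d)$ with a Cayley transform, so the iterates stay exactly orthogonal. The toy loss and step size are illustrative.

```python
import numpy as np

def cayley_step(X, G, lr):
    """One descent step on the orthogonal group O(d): project the Euclidean
    gradient G to a skew-symmetric direction A, then retract with the Cayley
    transform, which keeps X orthogonal up to round-off."""
    A = G @ X.T - X @ G.T                                     # skew-symmetric
    I = np.eye(X.shape[0])
    Q = np.linalg.solve(I + 0.5 * lr * A, I - 0.5 * lr * A)   # orthogonal
    return Q @ X

# Toy problem: the closest orthogonal matrix to M (loss = ||X - M||_F^2).
rng = np.random.default_rng(1)
d = 6
M = rng.normal(size=(d, d))
X = np.eye(d)
for _ in range(500):
    X = cayley_step(X, 2.0 * (X - M), lr=0.05)   # 2(X - M) = Euclidean grad
print(np.linalg.norm(X @ X.T - np.eye(d)))       # ~1e-14: still on O(d)
```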
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.