Lecture Notes on Normalizing Flows for Lattice Quantum Field Theories
- URL: http://arxiv.org/abs/2504.18126v1
- Date: Fri, 25 Apr 2025 07:22:11 GMT
- Title: Lecture Notes on Normalizing Flows for Lattice Quantum Field Theories
- Authors: Miranda C. N. Cheng, Niki Stratikopoulou
- Abstract summary: Notes aim to give a brief account of lattice field theories, normalizing flows, and how the latter can be applied to study the former. The notes are based on the lectures given by the first author in various recent research schools.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Numerical simulations of quantum field theories on lattices serve as a fundamental tool for studying the non-perturbative regime of the theories, where analytic tools often fall short. Challenges arise when one takes the continuum limit or as the system approaches a critical point, especially in the presence of non-trivial topological structures in the theory. Rapid recent advances in machine learning provide a promising avenue for progress in this area. These lecture notes aim to give a brief account of lattice field theories, normalizing flows, and how the latter can be applied to study the former. The notes are based on the lectures given by the first author in various recent research schools.
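The core idea summarized in the abstract, using a normalizing flow as a proposal distribution for lattice field configurations, can be illustrated with a minimal sketch. The following toy example (a hypothetical illustration, not the authors' code) implements a single affine coupling layer acting on a flattened scalar field on a small lattice: half the sites are transformed conditioned on the other half, so the Jacobian is triangular and its log-determinant is just the sum of the log-scales, making the model density tractable via the change-of-variables formula. The "network" weights here are random placeholders; in practice they would be trained to match the lattice action's Boltzmann distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8  # number of lattice sites (flattened)

# Frozen parameters of the coupling layer (random placeholders here;
# in a real flow these come from a trained neural network).
W_s = 0.1 * rng.standard_normal((N // 2, N // 2))
W_t = 0.1 * rng.standard_normal((N // 2, N // 2))

def coupling_forward(z):
    """Map latent z -> field phi; return phi and log|det J|."""
    z_a, z_b = z[: N // 2], z[N // 2:]
    s = np.tanh(W_s @ z_a)          # log-scales, conditioned on z_a
    t = W_t @ z_a                   # shifts, conditioned on z_a
    phi_b = z_b * np.exp(s) + t     # affine update of the second half
    phi = np.concatenate([z_a, phi_b])
    return phi, s.sum()             # triangular Jacobian => sum of log-scales

def sample(n):
    """Draw n field configurations with their model log-densities log q(phi)."""
    samples, logq = [], []
    for _ in range(n):
        z = rng.standard_normal(N)
        logp_z = -0.5 * (z @ z) - 0.5 * N * np.log(2 * np.pi)  # N(0, I) prior
        phi, logdet = coupling_forward(z)
        samples.append(phi)
        logq.append(logp_z - logdet)  # change of variables
    return np.array(samples), np.array(logq)

phis, logq = sample(100)
```

The resulting log-densities log q(phi) are what make flow-based sampling useful on the lattice: comparing them against the action S(phi) in a Metropolis-Hastings accept/reject step yields asymptotically unbiased samples even when the flow is imperfect.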
Related papers
- Sheaf theory: from deep geometry to deep learning [0.3749861135832073]
This paper provides an overview of the applications of sheaf theory in deep learning, data science, and computer science. We describe intuitions and motivations underlying sheaf theory shared by both theoretical researchers and practitioners. We present a new algorithm to compute sheaf cohomology on arbitrary finite posets.
arXiv Detail & Related papers (2025-02-21T14:00:25Z)
- Foundations and Frontiers of Graph Learning Theory [81.39078977407719]
Recent advancements in graph learning have revolutionized the way to understand and analyze data with complex structures.
Graph Neural Networks (GNNs), i.e. neural network architectures designed for learning graph representations, have become a popular paradigm.
This article provides a comprehensive summary of the theoretical foundations and breakthroughs concerning the approximation and learning behaviors intrinsic to prevalent graph learning models.
arXiv Detail & Related papers (2024-07-03T14:07:41Z)
- Deep learning lattice gauge theories [0.0]
We use neural network quantum states to accurately compute the ground state of lattice gauge theories in $2+1$ dimensions.
Our findings suggest that neural network quantum states are a promising method for precise studies of lattice gauge theory.
arXiv Detail & Related papers (2024-05-23T17:46:49Z)
- Relaxation of first-class constraints and the quantization of gauge theories: from "matter without matter" to the reappearance of time in quantum gravity [72.27323884094953]
We give a conceptual overview of an approach to the initial-value problem in canonical gauge theories.
We stress how the first-class phase-space constraints may be relaxed if we interpret them as fixing the values of new degrees of freedom.
arXiv Detail & Related papers (2024-02-19T19:00:02Z)
- A simple theory for quantum quenches in the ANNNI model [0.0]
Signatures of proximate quantum critical points can be observed at early and intermediate times after certain quantum quenches.
We construct a time-dependent mean-field theory that allows us to obtain a quantitatively accurate description of these quenches at short times.
arXiv Detail & Related papers (2023-01-10T16:47:26Z)
- Holographic tensor network models and quantum error correction: A topical review [78.28647825246472]
Recent progress in studies of holographic dualities has led to a confluence with concepts and techniques from quantum information theory.
A particularly successful approach has involved capturing holographic properties by means of tensor networks.
arXiv Detail & Related papers (2021-02-04T14:09:21Z)
- Developing Constrained Neural Units Over Time [81.19349325749037]
This paper focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by means of a special class of constraints that are extended also to the interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)
- On the Enumerative Structures in Quantum Field Theory [0.0]
Chord diagrams appear in quantum field theory in the context of Dyson-Schwinger equations.
In another direction, we study the action of point field diffeomorphisms on a free theory.
arXiv Detail & Related papers (2020-08-26T16:38:20Z)
- A Chain Graph Interpretation of Real-World Neural Networks [58.78692706974121]
We propose an alternative interpretation that identifies NNs as chain graphs (CGs) and feed-forward as an approximate inference procedure.
The CG interpretation specifies the nature of each NN component within the rich theoretical framework of probabilistic graphical models.
We demonstrate with concrete examples that the CG interpretation can provide novel theoretical support and insights for various NN techniques.
arXiv Detail & Related papers (2020-06-30T14:46:08Z)
- From a quantum theory to a classical one [117.44028458220427]
We present and discuss a formal approach for describing the quantum to classical crossover.
The method was originally introduced by L. Yaffe in 1982 for tackling large-$N$ quantum field theories.
arXiv Detail & Related papers (2020-04-01T09:16:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.