Parametric models and information geometry on W*-algebras
- URL: http://arxiv.org/abs/2207.09396v1
- Date: Tue, 19 Jul 2022 16:44:54 GMT
- Title: Parametric models and information geometry on W*-algebras
- Authors: Florio M. Ciaglia, Fabio Di Nocera, Jürgen Jost, Lorenz Schwachhöfer
- Abstract summary: We introduce the notion of smooth parametric model of normal positive linear functionals on possibly infinite-dimensional W*-algebras.
We then use the Jordan product naturally available in this context to define a metric tensor on parametric models satisfying suitable regularity conditions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce the notion of smooth parametric model of normal positive linear
functionals on possibly infinite-dimensional W*-algebras generalizing the
notions of parametric models used in classical and quantum information
geometry. We then use the Jordan product naturally available in this context in
order to define a Riemannian metric tensor on parametric models satisfying
suitable regularity conditions. This Riemannian metric tensor reduces to the
Fisher-Rao metric tensor, or to the Fubini-Study metric tensor, or to the
Bures-Helstrom metric tensor when suitable choices for the W*-algebra and the
models are made.
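For orientation, the classical limit the abstract refers to is the Fisher-Rao metric on a parametric family of probability densities $p(x;\theta)$. The standard textbook expression for that special case (written here for context, not quoted from the paper) is:

```latex
g_{ij}(\theta) = \int p(x;\theta)\,
  \frac{\partial \log p(x;\theta)}{\partial \theta^{i}}\,
  \frac{\partial \log p(x;\theta)}{\partial \theta^{j}}\,
  \mathrm{d}x
```

The quantum choices of W*-algebra and model instead recover the Fubini-Study and Bures-Helstrom tensors mentioned in the abstract.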
Related papers
- Lie Algebra Canonicalization: Equivariant Neural Operators under arbitrary Lie Groups [11.572188414440436]
We propose Lie aLgebrA Canonicalization (LieLAC), a novel approach that exploits only the action of infinitesimal generators of the symmetry group.
Operating within the framework of canonicalization, LieLAC can easily be integrated with unconstrained pre-trained models.
arXiv Detail & Related papers (2024-10-03T17:21:30Z) - Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM)
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
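The baseline that ALEMs extend, the Log-Euclidean Metric, admits a closed-form distance between SPD matrices: the Frobenius norm of the difference of their matrix logarithms. A minimal sketch of that baseline (the helper names are illustrative, not from the paper, which proposes adaptive variants):

```python
import numpy as np

def spd_log(M):
    """Matrix logarithm of a symmetric positive-definite matrix,
    computed via its eigendecomposition: M = V diag(w) V^T."""
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: Frobenius norm of log(A) - log(B)."""
    return np.linalg.norm(spd_log(A) - spd_log(B), "fro")

# Diagonal SPD matrices, so the matrix log reduces to log of the diagonal.
A = np.diag([1.0, 4.0])
B = np.diag([1.0, 1.0])
print(log_euclidean_distance(A, B))  # log(4) ~ 1.386
```

Because the log map flattens the SPD cone into a vector space, means and distances under this metric reduce to ordinary Euclidean operations on the log-matrices.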
arXiv Detail & Related papers (2023-03-26T18:31:52Z) - Log-density gradient covariance and automatic metric tensors for Riemann
manifold Monte Carlo methods [0.0]
The metric tensor is built from symmetric positive semidefinite log-density gradient covariance matrices.
The proposed methodology is highly automatic and allows for exploitation of any sparsity associated with the model in question.
arXiv Detail & Related papers (2022-11-03T12:22:20Z) - Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstruction of the thermodynamic functions and phase boundaries in two-parametric statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
arXiv Detail & Related papers (2022-05-18T17:11:23Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data; however, the associated operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Operator-valued formulas for Riemannian Gradient and Hessian and
families of tractable metrics [0.0]
We provide a formula for a quotient of a manifold embedded in an inner product space with a non-constant metric function.
We extend the list of potential metrics that could be used in manifold optimization and machine learning.
arXiv Detail & Related papers (2020-09-21T20:15:57Z) - Riemannian optimization of isometric tensor networks [0.0]
We show how gradient-based optimization methods can be used to optimize tensor networks of isometries to represent e.g. ground states of 1D quantum Hamiltonians.
We apply these methods in the context of infinite MPS and MERA, and show benchmark results in which they outperform the best previously-known optimization methods.
arXiv Detail & Related papers (2020-07-07T17:19:05Z)
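Gradient-based optimization over isometries follows the same project-step-retract pattern as simpler Riemannian optimizers. A toy sketch on the unit sphere, a much simpler manifold than the MPS/MERA setting of the paper (all names and the learning rate are illustrative):

```python
import numpy as np

def riemannian_grad_step(x, egrad, lr=0.1):
    """One Riemannian gradient step on the unit sphere:
    project the Euclidean gradient onto the tangent space at x,
    take a step, then retract back onto the sphere by normalizing."""
    rgrad = egrad - np.dot(x, egrad) * x   # tangent-space projection
    y = x - lr * rgrad                     # step along the tangent direction
    return y / np.linalg.norm(y)           # retraction onto the sphere

# Toy problem: minimize f(x) = x^T A x over the sphere, whose minimizer
# is the eigenvector of A with the smallest eigenvalue.
A = np.diag([3.0, 2.0, 1.0])
x = np.ones(3) / np.sqrt(3.0)
for _ in range(500):
    x = riemannian_grad_step(x, 2.0 * A @ x)
print(x)  # converges (up to sign) toward the eigenvector (0, 0, 1)
```

The projection keeps updates on the manifold's tangent space and the retraction restores the constraint exactly, which is the same structural idea the paper applies to tensor networks of isometries.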
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.