Typical perturbation theory: conditions, accuracy and comparison with a mesoscopic case
- URL: http://arxiv.org/abs/2207.05502v2
- Date: Fri, 29 Jul 2022 09:15:19 GMT
- Title: Typical perturbation theory: conditions, accuracy and comparison with a mesoscopic case
- Authors: Mats H. Lamann and Jochen Gemmer
- Abstract summary: The perturbation theory based on typicality introduced in Ref. [1] and further refined in Refs. [2, 3] is tested on three spin-based models.
The following criteria are taken into focus: the fulfillment of the conditions, the accuracy of the predicted dynamics and the relevance of the results with respect to a mesoscopic case.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The perturbation theory based on typicality introduced in Ref. [1] and further refined in Refs. [2, 3] provides a powerful tool, since it is intended to be applicable to a wide range of scenarios while relying on only a few parameters. Although the authors present various examples to demonstrate the effectiveness of the theory, the conditions used in its derivation are often not thoroughly checked; it is argued, without analytical reasoning, that this is justified by the robustness of the theory. In the paper at hand, said perturbation theory is tested on three spin-based models. The following criteria are taken into focus: the fulfillment of the conditions, the accuracy of the predicted dynamics, and the relevance of the results with respect to a mesoscopic case.
Related papers
- An Effective Theory of Bias Amplification [18.648588509429167]
Machine learning models may capture and amplify biases present in data, leading to disparate test performance across social groups.
We propose a precise analytical theory in the context of ridge regression, which models neural networks in a simplified regime.
Our theory offers a unified and rigorous explanation of machine learning bias, providing insights into phenomena such as bias amplification and minority-group bias.
arXiv Detail & Related papers (2024-10-07T08:43:22Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Simultaneous inference for generalized linear models with unmeasured confounders [0.0]
We propose a unified statistical estimation and inference framework that harnesses structures and integrates linear projections into three key stages.
We show effective Type-I error control of $z$-tests as sample and response sizes approach infinity.
arXiv Detail & Related papers (2023-09-13T18:53:11Z)
- Derivation of Standard Quantum Theory via State Discrimination [53.64687146666141]
General Probabilistic Theories (GPTs) are a new information-theoretic approach to single out standard quantum theory.
We focus on the bound of the performance for an information task called state discrimination in general models.
We characterize standard quantum theory out of general models in GPTs by the bound of the performance for state discrimination.
arXiv Detail & Related papers (2023-07-21T00:02:11Z)
- Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z)
- Deep Grey-Box Modeling With Adaptive Data-Driven Models Toward Trustworthy Estimation of Theory-Driven Models [88.63781315038824]
We present a framework that enables us to analyze a regularizer's behavior empirically with a slight change in the neural net's architecture and the training objective.
arXiv Detail & Related papers (2022-10-24T10:42:26Z)
- Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z)
- Can convolutional ResNets approximately preserve input distances? A frequency analysis perspective [31.897568775099558]
We show that the theoretical link between the regularisation scheme used and bi-Lipschitzness is only valid under conditions which do not hold in practice.
We present a simple constructive algorithm to search for counterexamples to the distance-preservation condition.
arXiv Detail & Related papers (2021-06-04T13:12:42Z)
- General Probabilistic Theories with a Gleason-type Theorem [0.0]
Gleason-type theorems for quantum theory allow one to recover the quantum state space.
We identify the class of general probabilistic theories which also admit Gleason-type theorems.
arXiv Detail & Related papers (2020-05-28T17:29:29Z)
- Marginal likelihood computation for model selection and hypothesis testing: an extensive review [66.37504201165159]
This article provides a comprehensive study of the state-of-the-art of the topic.
We highlight limitations, benefits, connections and differences among the different techniques.
Problems and possible solutions with the use of improper priors are also described.
arXiv Detail & Related papers (2020-05-17T18:31:58Z)
- Equivariant online predictions of non-stationary time series [0.0]
We analyze the theoretical predictive properties of statistical methods under model misspecification.
We show that a specific class of dynamic models, random walk dynamic linear models, produces exact minimax predictive densities.
arXiv Detail & Related papers (2019-11-20T01:46:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.