Statistics of Green's functions on a disordered Cayley tree and the
validity of forward scattering approximation
- URL: http://arxiv.org/abs/2108.10326v2
- Date: Wed, 3 Nov 2021 16:42:10 GMT
- Title: Statistics of Green's functions on a disordered Cayley tree and the
validity of forward scattering approximation
- Authors: P. A. Nosov, I. M. Khaymovich, A. Kudlis and V. E. Kravtsov
- Abstract summary: The accuracy of the forward scattering approximation for two-point Green's functions of the Anderson localization model on the Cayley tree is studied.
A relationship between the moments of the Green's function and the largest eigenvalue of the linearized transfer-matrix equation is proved.
A new large-disorder approximation for this eigenvalue is derived and its accuracy is established.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The accuracy of the forward scattering approximation for two-point Green's
functions of the Anderson localization model on the Cayley tree is studied. A
relationship between the moments of the Green's function and the largest
eigenvalue of the linearized transfer-matrix equation is proved in the
framework of the supersymmetric functional-integral method. A new
large-disorder approximation for this eigenvalue is derived and its accuracy is
established. Using this approximation, the probability distribution of the
two-point Green's function is found and compared with that in the forward
scattering approximation (FSA). It is shown that the FSA overestimates the role
of resonances and thus the probability for the Green's function to be
significantly larger than its typical value. The error of the FSA grows with
the distance between the two points of the Green's function.
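For orientation, the FSA keeps only the contribution of the unique path connecting the two sites on the tree, G_FSA(0,R;E) ≈ (E - ε_0)^{-1} ∏_{i=1..R} V/(E - ε_i), with independent random on-site energies ε_i. The following Monte Carlo sketch illustrates only this FSA side of the comparison; it does not reproduce the paper's supersymmetric or transfer-matrix calculation, and the energy E, hopping V, disorder width W and path lengths R are illustrative values, not taken from the paper.

```python
# A minimal Monte Carlo sketch (not the paper's derivation): statistics of
# ln|G| in the forward scattering approximation, where the two-point Green's
# function along the unique path of R hops on the Cayley tree is approximated
# by a product of "locators",
#     G_FSA(0, R; E) ~ 1/(E - eps_0) * prod_{i=1..R} V/(E - eps_i),
# with i.i.d. box-distributed on-site energies eps_i in [-W/2, W/2].
# E, V, W and the values of R below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

def ln_abs_G_fsa(R, W, E=0.0, V=1.0, n_samples=200_000):
    """ln|G_FSA| over n_samples disorder realizations for a path with R hops."""
    # R + 1 on-site energies eps_0 .. eps_R along the unique path
    eps = rng.uniform(-W / 2.0, W / 2.0, size=(n_samples, R + 1))
    # ln|G_FSA| = R*ln|V| - sum_i ln|E - eps_i|
    return R * np.log(abs(V)) - np.sum(np.log(np.abs(E - eps)), axis=1)

for R in (5, 10, 20):
    lnG = ln_abs_G_fsa(R, W=20.0)
    typical = np.median(lnG)              # ln of the typical value of |G|
    tail = np.mean(lnG > typical + 5.0)   # weight of the resonant tail
    print(f"R={R:2d}  ln|G|_typ = {typical:8.2f}  P(ln|G| > typ + 5) = {tail:.3f}")
```

The script reports the typical (median) value of ln|G_FSA| and the weight of the tail a fixed offset above it; this far tail is the resonance-dominated region where, according to the abstract, the FSA overestimates the probability.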
Related papers
- Statistical Inference for Temporal Difference Learning with Linear Function Approximation [62.69448336714418]
Temporal Difference (TD) learning, arguably the most widely used algorithm for policy evaluation, serves as a natural framework for this purpose.
In this paper, we study the consistency properties of TD learning with Polyak-Ruppert averaging and linear function approximation, and obtain three significant improvements over existing results.
arXiv Detail & Related papers (2024-10-21T15:34:44Z)
- Amortized SHAP values via sparse Fourier function approximation [38.818224762845624]
SHAP values are a popular local feature-attribution method widely used in interpretable and explainable AI.
We propose a two-stage approach for estimating SHAP values.
Our algorithm's first step harnesses recent results showing that many real-world predictors have a spectral bias.
arXiv Detail & Related papers (2024-10-08T19:05:50Z) - Statistical Inference of Optimal Allocations I: Regularities and their Implications [3.904240476752459]
We first derive Hadamard differentiability of the value function through a detailed analysis of the general properties of the sorting operator.
Building on our Hadamard differentiability results, we demonstrate how the functional delta method can be used to directly derive the properties of the value function process.
arXiv Detail & Related papers (2024-03-27T04:39:13Z)
- Learning Domain-Independent Green's Function For Elliptic Partial Differential Equations [0.0]
The Green's function characterizes a partial differential equation (PDE) and represents its solution over the entire domain as an integral.
We propose a novel boundary integral network to learn the domain-independent Green's function, referred to as BIN-G.
We demonstrate that our numerical scheme enables fast training and accurate evaluation of the Green's function for PDEs with variable coefficients.
arXiv Detail & Related papers (2024-01-30T17:00:22Z)
- Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions for the first time with symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the sample regime and in the finite regime.
arXiv Detail & Related papers (2022-10-03T06:09:01Z)
- Data-Driven Influence Functions for Optimization-Based Causal Inference [105.5385525290466]
We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite differencing.
We study the case where probability distributions are not known a priori but need to be estimated from data.
arXiv Detail & Related papers (2022-08-29T16:16:22Z)
- A Novel Approach to Radiometric Identification [68.8204255655161]
This paper demonstrates that highly accurate radiometric identification is possible using the CAPoNeF feature engineering method.
We tested basic ML classification algorithms on experimental data gathered by a software-defined radio (SDR).
arXiv Detail & Related papers (2020-12-02T10:54:44Z)
- Optimal Bounds between $f$-Divergences and Integral Probability Metrics [8.401473551081748]
Families of $f$-divergences and Integral Probability Metrics are widely used to quantify similarity between probability distributions.
We systematically study the relationship between these two families from the perspective of convex duality.
We obtain new bounds while also recovering in a unified manner well-known results, such as Hoeffding's lemma.
arXiv Detail & Related papers (2020-06-10T17:39:11Z)
- The covariance matrix of Green's functions and its application to machine learning [0.0]
We first survey the Green's function for the Dirichlet boundary value problem of a 2nd-order linear ordinary differential equation.
We consider a covariance matrix composed of the normalized Green's function, which is regarded as a probability density function.
arXiv Detail & Related papers (2020-04-14T13:26:01Z)
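For the last entry, a worked example may help fix ideas. This is a minimal illustration, assuming the simplest Dirichlet problem -u''(x) = f(x) on [0,1] with u(0) = u(1) = 0; the summary does not specify which second-order operator or normalization that paper uses.

```latex
% Green's function of -u'' = f on [0,1] with u(0) = u(1) = 0 (illustrative case)
\[
  G(x,s) \;=\;
  \begin{cases}
    x\,(1-s), & 0 \le x \le s \le 1,\\
    s\,(1-x), & 0 \le s \le x \le 1,
  \end{cases}
  \qquad
  u(x) \;=\; \int_0^1 G(x,s)\, f(s)\,\mathrm{d}s .
\]
% On grid points x_1 < ... < x_n this kernel yields a symmetric positive
% semidefinite matrix, i.e. a valid covariance matrix:
\[
  K_{ij} \;=\; G(x_i, x_j) \;=\; \min(x_i, x_j)\,\bigl(1 - \max(x_i, x_j)\bigr).
\]
```

This kernel coincides with the covariance of a Brownian bridge, which is one way to see why a suitably normalized Green's function can serve as a covariance matrix in machine-learning applications, as the entry describes.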
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.