An Improved Uniform Convergence Bound with Fat-Shattering Dimension
- URL: http://arxiv.org/abs/2307.06644v1
- Date: Thu, 13 Jul 2023 09:20:57 GMT
- Title: An Improved Uniform Convergence Bound with Fat-Shattering Dimension
- Authors: Roberto Colomboni, Emmanuel Esposito, Andrea Paudice
- Abstract summary: The state-of-the-art upper bounds feature a multiplicative squared logarithmic factor on the sample complexity, leaving an open gap with the existing lower bound.
We provide an improved uniform convergence bound that closes this gap.
- Score: 4.5349436061325425
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The fat-shattering dimension characterizes the uniform convergence property
of real-valued functions. The state-of-the-art upper bounds feature a
multiplicative squared logarithmic factor on the sample complexity, leaving an
open gap with the existing lower bound. We provide an improved uniform
convergence bound that closes this gap.
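For orientation, the shapes involved can be sketched as follows (a schematic reconstruction with illustrative constants and log arguments, not the paper's exact statement); write $d$ for the fat-shattering dimension of the class at a scale proportional to $\varepsilon$:

```latex
% Schematic only: prior upper bounds on the sample complexity of
% eps-uniform convergence carry a multiplicative squared-log factor,
% while the known lower bound has no such factor.
m_{\mathrm{prev}}(\varepsilon,\delta)
  = O\!\Big(\tfrac{1}{\varepsilon^{2}}\big(d\,\log^{2}\tfrac{1}{\varepsilon}
      + \log\tfrac{1}{\delta}\big)\Big),
\qquad
m_{\mathrm{low}}(\varepsilon,\delta)
  = \Omega\!\Big(\tfrac{1}{\varepsilon^{2}}\big(d + \log\tfrac{1}{\delta}\big)\Big).
```

Closing the gap means proving an upper bound without the multiplicative squared-logarithmic factor, so that upper and lower bounds share the same shape.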
Related papers
- Algorithmic Stability in Infinite Dimensions: Characterizing Unconditional Convergence in Banach Spaces [0.0]
A distinction between conditional, unconditional, and absolute convergence in infinite-dimensional spaces has fundamental implications for computational algorithms. We present a comprehensive characterization theorem unifying seven equivalent conditions for unconditional convergence. Our work bridges classical functional analysis with contemporary computational practice, providing rigorous foundations for order-independent and numerically robust processes.
arXiv Detail & Related papers (2026-01-13T12:51:58Z) - A Gapped Scale-Sensitive Dimension and Lower Bounds for Offset Rademacher Complexity [72.82374764881489]
We study gapped scale-sensitive dimensions of a function class in both sequential and non-sequential settings. We show that the gapped dimensions lead to lower bounds on offset Rademacher averages.
arXiv Detail & Related papers (2025-09-24T23:49:53Z) - Exponential convergence rate for Iterative Markovian Fitting [58.760054965084656]
The Iterative Markovian Fitting (IMF) algorithm converges in Kullback-Leibler divergence to the ground-truth solution. We establish for the first time that IMF exhibits exponential convergence with an explicit contraction factor.
arXiv Detail & Related papers (2025-08-04T09:48:21Z) - Entropic Mirror Descent for Linear Systems: Polyak's Stepsize and Implicit Bias [55.72269695392027]
This paper focuses on applying entropic mirror descent to solve linear systems. The main challenge for the convergence analysis stems from the unboundedness of the domain. To overcome this without imposing restrictive assumptions, we introduce a variant of Polyak-type stepsizes.
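A minimal illustration of the general idea (a hypothetical sketch, not the paper's algorithm: it uses a fixed stepsize rather than the Polyak-type variant, and recasts the linear system as least squares over the positive orthant):

```python
import numpy as np

# Entropic mirror descent (exponentiated gradient) for A x = b with x > 0,
# via the surrogate objective f(x) = 0.5 * ||A x - b||^2.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x_star = np.array([1.0, 0.5])          # planted positive solution
b = A @ x_star

x = np.ones(2)                          # positive initial point
eta = 0.05                              # fixed stepsize (illustrative only)
loss0 = 0.5 * np.sum((A @ x - b) ** 2)
for _ in range(500):
    grad = A.T @ (A @ x - b)
    x = x * np.exp(-eta * grad)         # mirror step under the entropy potential
loss = 0.5 * np.sum((A @ x - b) ** 2)
```

The multiplicative update keeps every iterate strictly positive without any projection, which is exactly why the entropy potential is natural on this domain; the unbounded domain is what complicates the general analysis.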
arXiv Detail & Related papers (2025-05-05T12:33:18Z) - Dimension-free uniform concentration bound for logistic regression [0.0]
We provide a novel dimension-free uniform concentration bound for the risk function of constrained logistic regression.
Our bound yields a milder sufficient condition for a uniform law of large numbers than conditions derived by the Rademacher complexity argument and McDiarmid's inequality.
arXiv Detail & Related papers (2024-05-28T11:09:39Z) - Subgradient Convergence Implies Subdifferential Convergence on Weakly Convex Functions: With Uniform Rates Guarantees [2.719510212909501]
In nonsmooth optimization, understanding the uniform convergence of subdifferential mappings is crucial for analyzing stationary points of sample average approximations of the risk.
This work connects the convergence of subgradient mappings to the uniform convergence of subdifferential mappings.
We derive uniform convergence rates for subdifferentials of convex composites, measured in the Hausdorff metric.
arXiv Detail & Related papers (2024-05-16T17:49:46Z) - Incremental Quasi-Newton Methods with Faster Superlinear Convergence
Rates [50.36933471975506]
We consider the finite-sum optimization problem, where each component function is strongly convex and has Lipschitz continuous gradient and Hessian.
The recently proposed incremental quasi-Newton method is based on BFGS update and achieves a local superlinear convergence rate.
This paper proposes a more efficient quasi-Newton method by incorporating the symmetric rank-1 update into the incremental framework.
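The symmetric rank-1 (SR1) update at the core of this scheme can be sketched as follows (a hypothetical standalone illustration of the plain SR1 formula, not the paper's incremental algorithm). On a quadratic with Hessian H, the secant pair satisfies y = H s exactly, and after n updates along linearly independent steps (with nonzero denominators) the SR1 approximation recovers H:

```python
import numpy as np

H = np.array([[4.0, 1.0, 0.0],      # Hessian of the quadratic f(x) = 0.5 x^T H x
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

def sr1_update(B, s, y):
    """One symmetric rank-1 correction enforcing the secant equation B_new s = y."""
    v = y - B @ s
    return B + np.outer(v, v) / (v @ s)

B = np.eye(3)                        # initial Hessian approximation
for k in range(3):
    s = np.eye(3)[k]                 # coordinate steps: linearly independent
    y = H @ s                        # exact secant pair on the quadratic
    B = sr1_update(B, s, y)
```

Unlike BFGS, the SR1 correction is rank 1 rather than rank 2 and need not preserve positive definiteness, which is part of what makes it cheaper per update in an incremental framework.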
arXiv Detail & Related papers (2024-02-04T05:54:51Z) - Comparing Comparators in Generalization Bounds [17.808906879967353]
We derive generic information-theoretic and PAC-Bayesian generalization bounds involving an arbitrary convex comparator function.
We show that the tightest possible bound is obtained with the comparator being the convex conjugate of the CGF of the bounding distribution.
This confirms the near-optimality of known bounds for bounded and sub-Gaussian losses and leads to novel bounds under other bounding distributions.
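As background for why the convex conjugate of the CGF is the natural comparator, recall the standard Cramér–Chernoff calculation (textbook material, not the paper's derivation):

```latex
\psi(\lambda) \;=\; \log \mathbb{E}\, e^{\lambda (X - \mathbb{E} X)},
\qquad
\mathbb{P}\big(X - \mathbb{E} X \ge t\big)
  \;\le\; \inf_{\lambda \ge 0} e^{-(\lambda t - \psi(\lambda))}
  \;=\; e^{-\psi^{*}(t)},
```

where $\psi^{*}$ is the convex (Legendre) conjugate of the CGF $\psi$; optimizing over $\lambda$ is exactly what makes $\psi^{*}$ the tightest achievable exponent.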
arXiv Detail & Related papers (2023-10-16T16:00:58Z) - Convex Bounds on the Softmax Function with Applications to Robustness
Verification [69.09991317119679]
The softmax function is a ubiquitous component at the output of neural networks and increasingly in intermediate layers as well.
This paper provides convex lower bounds and concave upper bounds on the softmax function, which are compatible with convex optimization formulations for characterizing neural networks and other ML models.
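For intuition, here is the classical interval-arithmetic bound on softmax over a box (a hypothetical baseline illustration; the paper's convex/concave bounds are tighter constructions, not this one). Since softmax_i(x) is increasing in x_i and decreasing in each x_j with j != i, its maximum over the box l <= x <= u is attained at the corner x_i = u_i, x_j = l_j:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())          # shift for numerical stability
    return e / e.sum()

def box_upper_bound(l, u, i):
    """Upper bound on softmax_i(x) over the box l <= x <= u, by monotonicity."""
    corner = l.copy()
    corner[i] = u[i]                 # maximizing corner of the box
    return softmax(corner)[i]

rng = np.random.default_rng(0)
l = rng.normal(size=5)
u = l + rng.uniform(0.1, 1.0, size=5)
bounds = np.array([box_upper_bound(l, u, i) for i in range(5)])
```

Bounds of this shape are sound but loose; compatibility with convex optimization formulations is what the paper's lower/upper bounding functions add on top.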
arXiv Detail & Related papers (2023-03-03T05:07:02Z) - Approximation of optimization problems with constraints through kernel
Sum-Of-Squares [77.27820145069515]
We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
arXiv Detail & Related papers (2023-01-16T10:30:04Z) - Polynomial convergence of iterations of certain random operators in
Hilbert space [2.732936573198251]
We study the convergence of random iterative sequences of a family of operators on infinite-dimensional spaces, inspired by the SGD algorithm.
We demonstrate that its convergence rate depends on the initial state, while randomness plays a role only in the choice of the best constant factor.
arXiv Detail & Related papers (2022-02-04T17:48:29Z) - Conformal field theory from lattice fermions [77.34726150561087]
We provide a rigorous lattice approximation of conformal field theories given in terms of lattice fermions in 1+1-dimensions.
We show how these results lead to explicit error estimates pertaining to the quantum simulation of conformal field theories.
arXiv Detail & Related papers (2021-07-29T08:54:07Z) - Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable
Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretizations.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z) - The Convergence Indicator: Improved and completely characterized
parameter bounds for actual convergence of Particle Swarm Optimization [68.8204255655161]
We introduce a new convergence indicator that can be used to calculate whether the particles will finally converge to a single point or diverge.
Using this convergence indicator we provide the actual bounds completely characterizing parameter regions that lead to a converging swarm.
arXiv Detail & Related papers (2020-06-06T19:08:05Z) - Variational-Correlations Approach to Quantum Many-body Problems [1.9336815376402714]
We investigate an approach for studying the ground state of a quantum many-body Hamiltonian.
The challenge set by the exponentially-large Hilbert space is circumvented by approximating the positivity of the density matrix.
We demonstrate the ability of this approach to produce long-range correlations, and a ground-state energy that converges to the exact result.
arXiv Detail & Related papers (2020-01-17T19:52:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.