Adaptive Step Sizes for Preconditioned Stochastic Gradient Descent
- URL: http://arxiv.org/abs/2311.16956v2
- Date: Wed, 18 Sep 2024 15:47:10 GMT
- Title: Adaptive Step Sizes for Preconditioned Stochastic Gradient Descent
- Authors: Frederik Köhne, Leonie Kreis, Anton Schiela, Roland Herzog
- Abstract summary: This paper proposes a novel approach to adaptive step sizes in stochastic gradient descent (SGD), using quantities identified as numerically traceable: the Lipschitz constant for gradients and a concept of the local variance in search directions.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This paper proposes a novel approach to adaptive step sizes in stochastic gradient descent (SGD) by utilizing quantities that we have identified as numerically traceable -- the Lipschitz constant for gradients and a concept of the local variance in search directions. Our findings yield a nearly hyperparameter-free algorithm for stochastic optimization, which has provable convergence properties and exhibits truly problem-adaptive behavior on classical image classification tasks. Our framework is set in a general Hilbert space and thus enables the potential inclusion of a preconditioner through the choice of the inner product.
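As an illustration only, here is a minimal NumPy sketch of a step-size rule built from the two quantities the abstract names: a local Lipschitz estimate for the gradients and the variance of the stochastic search directions. The secant-based Lipschitz estimate, the variance-adjusted 1/L step size, and the least-squares test problem below are all assumptions of this sketch, not the authors' published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test problem: f(x) = 0.5/n * ||A x - b||^2 (least squares).
n, d = 200, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def per_sample_grads(x, idx):
    """Per-sample gradients g_i = a_i (a_i^T x - b_i) on a minibatch."""
    residual = A[idx] @ x - b[idx]
    return A[idx] * residual[:, None]

x = np.zeros(d)
x_prev, g_prev = None, None
L = 1.0  # initial Lipschitz guess; refined by the secant estimate below

for k in range(500):
    idx = rng.choice(n, size=32, replace=False)
    G = per_sample_grads(x, idx)      # minibatch of search directions
    g = G.mean(axis=0)                # averaged stochastic gradient

    # Secant-style local Lipschitz estimate: L ~ ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||.
    if g_prev is not None:
        L = max(np.linalg.norm(g - g_prev) /
                (np.linalg.norm(x - x_prev) + 1e-12), 1e-8)

    # Variance-adjusted descent-lemma step size:
    # eta = ||g_bar||^2 / (L * E||g_i||^2), i.e. 1/L shrunk by the
    # relative variance of the search directions.
    mean_sq = (np.linalg.norm(G, axis=1) ** 2).mean()
    eta = g @ g / (L * mean_sq + 1e-12)

    x_prev, g_prev = x.copy(), g
    x = x - eta * g

print("final loss:", 0.5 / n * np.linalg.norm(A @ x - b) ** 2)
```

The abstract's Hilbert-space framing would replace the plain Euclidean norms and dot products above with preconditioned inner products; the NumPy operations here simply stand in for that general inner-product choice.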