Fractal Dimension Generalization Measure
- URL: http://arxiv.org/abs/2012.12384v1
- Date: Tue, 22 Dec 2020 22:04:32 GMT
- Title: Fractal Dimension Generalization Measure
- Authors: Valeri Alexiev
- Abstract summary: This paper is part of the "Predicting Generalization in Deep Learning" competition.
We analyse the complexity of decision boundaries using the concept of fractal dimension and develop a generalization measure based on that technique.
- Score: 0.2635832975589208
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Developing a robust generalization measure for the performance of machine
learning models is an important and challenging task. A lot of recent research
in the area focuses on the model decision boundary when predicting
generalization. In this paper, as part of the "Predicting Generalization in
Deep Learning" competition, we analyse the complexity of decision boundaries
using the concept of fractal dimension and develop a generalization measure
based on that technique.
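The core quantity behind such a measure, the box-counting dimension of a decision boundary, can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation; the function name and the toy line data are ours:

```python
import numpy as np

def box_counting_dimension(boundary_points, box_sizes):
    """Estimate the box-counting dimension of a 2-D point set.

    boundary_points: (N, 2) array of points sampled on a decision boundary.
    box_sizes: decreasing box edge lengths.
    Returns the slope of log(occupied boxes) vs log(1 / box size).
    """
    mins = boundary_points.min(axis=0)
    counts = []
    for size in box_sizes:
        # Map each point to its grid cell and count distinct occupied cells.
        cells = np.floor((boundary_points - mins) / size).astype(int)
        counts.append(len({tuple(c) for c in cells}))
    log_inv_size = np.log(1.0 / np.asarray(box_sizes))
    slope, _intercept = np.polyfit(log_inv_size, np.log(counts), 1)
    return slope

# Sanity check: points on a straight line should give dimension close to 1.
t = np.linspace(0.0, 1.0, 20000)
line = np.column_stack([t, 0.5 * t])
dim = box_counting_dimension(line, [0.1, 0.05, 0.025, 0.0125])
```

A smooth boundary yields an estimate near the intrinsic dimension (1 for a curve in the plane), while a more convoluted boundary pushes it higher, which is the intuition such a measure exploits.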
Related papers
- On the Limitations of Fractal Dimension as a Measure of Generalization [18.257634786946397]
We show that fractal dimension fails to predict generalization of models trained from poor initializations.
We also show that the $\ell_2$ norm of the final parameter iterate, one of the simplest complexity measures in learning theory, correlates more strongly with the generalization gap than these notions of fractal dimension.
This work lays the ground for a deeper investigation of the causal relationships between fractal geometry, topological data analysis, and neural network optimization.
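For contrast, the $\ell_2$-norm baseline cited above is almost a one-liner; a hedged sketch, with layer shapes invented for illustration:

```python
import numpy as np

def l2_param_norm(param_arrays):
    """l2 norm of the final parameter iterate, concatenated over all layers."""
    return float(np.sqrt(sum(np.sum(np.asarray(p, dtype=np.float64) ** 2)
                             for p in param_arrays)))

# Two made-up "layers": a 3x3 weight matrix of ones and a bias of twos.
params = [np.ones((3, 3)), np.full(2, 2.0)]
norm = l2_param_norm(params)  # sqrt(9 * 1 + 2 * 4) = sqrt(17)
```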
arXiv Detail & Related papers (2024-06-04T11:56:19Z)
- Consciousness-Inspired Spatio-Temporal Abstractions for Better Generalization in Reinforcement Learning [83.41487567765871]
Skipper is a model-based reinforcement learning framework.
It automatically decomposes the given task into smaller, more manageable subtasks.
It enables sparse decision-making and focused abstractions on the relevant parts of the environment.
arXiv Detail & Related papers (2023-09-30T02:25:18Z)
- PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization [48.26492774959634]
We develop a compression approach based on quantizing neural network parameters in a linear subspace.
We find large models can be compressed to a much greater extent than previously known, encapsulating Occam's razor.
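The compression idea can be caricatured with uniform scalar quantization: fewer levels mean fewer bits, and a description-length-based bound tightens as storage shrinks. This toy version ignores the paper's linear-subspace construction; all names are ours:

```python
import numpy as np

def quantize_uniform(params, num_levels):
    """Snap each parameter to the nearest of `num_levels` evenly spaced
    levels and report the storage cost in bits (size * ceil(log2(levels)))."""
    levels = np.linspace(params.min(), params.max(), num_levels)
    idx = np.argmin(np.abs(params[:, None] - levels[None, :]), axis=1)
    bits = int(params.size * np.ceil(np.log2(num_levels)))
    return levels[idx], bits

w = np.array([-1.0, -0.4, 0.1, 0.5, 1.0])
quantized, bits = quantize_uniform(w, num_levels=4)  # 5 params * 2 bits = 10 bits
```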
arXiv Detail & Related papers (2022-11-24T13:50:16Z)
- The Two Dimensions of Worst-case Training and the Integrated Effect for Out-of-domain Generalization [95.34898583368154]
We propose a new, simple yet effective, method to train machine learning models for out-of-domain generalization.
We name our method W2D, following the concept of "Worst-case along Two Dimensions".
arXiv Detail & Related papers (2022-04-09T04:14:55Z)
- Towards Principled Disentanglement for Domain Generalization [90.9891372499545]
A fundamental challenge for machine learning models is generalizing to out-of-distribution (OOD) data.
We first formalize the OOD generalization problem as a constrained optimization, called Disentanglement-constrained Domain Generalization (DDG).
Based on this formulation, we propose a primal-dual algorithm for joint representation disentanglement and domain generalization.
arXiv Detail & Related papers (2021-11-27T07:36:32Z)
- Generalization Bounds For Meta-Learning: An Information-Theoretic Analysis [8.028776552383365]
We propose a generic understanding of both the conventional learning-to-learn framework and the modern model-agnostic meta-learning algorithms.
We provide a data-dependent generalization bound for a variant of MAML, which is non-vacuous for deep few-shot learning.
arXiv Detail & Related papers (2021-09-29T17:45:54Z)
- Evaluation of Complexity Measures for Deep Learning Generalization in Medical Image Analysis [77.34726150561087]
PAC-Bayes flatness-based and path norm-based measures produce the most consistent explanation for the combination of models and data.
We also investigate the use of a multi-task classification and segmentation approach for breast images.
arXiv Detail & Related papers (2021-03-04T20:58:22Z)
- Robustness to Augmentations as a Generalization metric [0.0]
Generalization is the ability of a model to predict accurately on unseen domains.
We propose a method to predict the generalization performance of a model, based on the idea that models robust to augmentations generalize better than those that are not.
The proposed method was the first runner-up solution in the NeurIPS competition on Predicting Generalization in Deep Learning.
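The idea reduces to measuring how often predictions survive augmentation. A schematic version, where a threshold classifier and additive noise stand in for a trained network and a real augmentation policy:

```python
import numpy as np

def augmentation_robustness(predict, inputs, augment, rng):
    """Fraction of inputs whose predicted label is unchanged by augmentation;
    a higher score is taken as a proxy for better generalization."""
    original = predict(inputs)
    perturbed = predict(augment(inputs, rng))
    return float(np.mean(original == perturbed))

# Toy stand-ins: a sign classifier and small additive Gaussian noise.
predict = lambda x: (x[:, 0] > 0.0).astype(int)
augment = lambda x, rng: x + rng.normal(scale=0.01, size=x.shape)
rng = np.random.default_rng(0)
x = np.array([[1.0, 0.0], [-1.0, 0.0], [2.0, 1.0]])
score = augmentation_robustness(predict, x, augment, rng)  # 1.0 here: all
# points sit far from the decision threshold
```

In practice one would average over many augmentation draws and a held-out batch rather than a single perturbation.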
arXiv Detail & Related papers (2021-01-16T15:36:38Z)
- Predicting Generalization in Deep Learning via Local Measures of Distortion [7.806155368334511]
We study generalization in deep learning by appealing to complexity measures originally developed in approximation and information theory.
We show that simple vector quantization approaches such as PCA, GMMs, and SVMs capture their spirit when applied layer-wise to deep extracted features.
arXiv Detail & Related papers (2020-12-13T05:46:46Z)
- Representation Based Complexity Measures for Predicting Generalization in Deep Learning [0.0]
Deep Neural Networks can generalize despite being significantly overparametrized.
Recent research has tried to examine this phenomenon from various view points.
We provide an interpretation of generalization from the perspective of quality of internal representations.
arXiv Detail & Related papers (2020-12-04T18:53:44Z)
- In Search of Robust Measures of Generalization [79.75709926309703]
We develop bounds on generalization error, optimization error, and excess risk.
When evaluated empirically, most of these bounds are numerically vacuous.
We argue that generalization measures should instead be evaluated within the framework of distributional robustness.
arXiv Detail & Related papers (2020-10-22T17:54:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.