Mixup Augmentation with Multiple Interpolations
- URL: http://arxiv.org/abs/2406.01417v1
- Date: Mon, 3 Jun 2024 15:16:09 GMT
- Title: Mixup Augmentation with Multiple Interpolations
- Authors: Lifeng Shen, Jincheng Yu, Hansi Yang, James T. Kwok
- Abstract summary: We propose a simple yet effective extension called multi-mix, which generates multiple interpolations from a sample pair.
With an ordered sequence of generated samples, multi-mix can better guide the training process than standard mixup.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mixup and its variants form a popular class of data augmentation techniques. Using a random sample pair, it generates a new sample by linear interpolation of the inputs and labels. However, generating only one single interpolation may limit its augmentation ability. In this paper, we propose a simple yet effective extension called multi-mix, which generates multiple interpolations from a sample pair. With an ordered sequence of generated samples, multi-mix can better guide the training process than standard mixup. Moreover, theoretically, this can also reduce the stochastic gradient variance. Extensive experiments on a number of synthetic and large-scale data sets demonstrate that multi-mix outperforms various mixup variants and non-mixup-based baselines in terms of generalization, robustness, and calibration.
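The core idea can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact algorithm: following the abstract, it draws several mixing coefficients from a Beta distribution (as in standard mixup) and sorts them so the generated samples form an ordered sequence between the two endpoints of a sample pair. The function name `multi_mix` and the `k`/`alpha` parameters are assumptions for illustration.

```python
import numpy as np

def multi_mix(x1, y1, x2, y2, k=4, alpha=1.0, rng=None):
    """Generate k mixup interpolations from one sample pair.

    Standard mixup draws a single lambda ~ Beta(alpha, alpha) and
    returns one interpolated sample. This sketch instead draws k
    lambdas and sorts them, so the k generated samples form an
    ordered sequence along the segment between (x1, y1) and (x2, y2).
    """
    rng = np.random.default_rng() if rng is None else rng
    # k mixing coefficients in [0, 1], sorted to order the sequence
    lams = np.sort(rng.beta(alpha, alpha, size=k))
    # convex combinations of both inputs and (one-hot) labels
    xs = np.stack([lam * x1 + (1 - lam) * x2 for lam in lams])
    ys = np.stack([lam * y1 + (1 - lam) * y2 for lam in lams])
    return xs, ys, lams
```

In training, the loss would be averaged over the k interpolated samples instead of a single one, which is the mechanism the abstract credits with reducing stochastic gradient variance.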