A Survey on Mixup Augmentations and Beyond
- URL: http://arxiv.org/abs/2409.05202v1
- Date: Sun, 8 Sep 2024 19:32:22 GMT
- Title: A Survey on Mixup Augmentations and Beyond
- Authors: Xin Jin, Hongyu Zhu, Siyuan Li, Zedong Wang, Zicheng Liu, Chang Yu, Huafeng Qin, Stan Z. Li
- Abstract summary: Mixup and relevant data-mixing methods that convexly combine selected samples and the corresponding labels are widely adopted.
This survey presents a comprehensive review of foundational mixup methods and their applications.
- Score: 59.578288906956736
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As Deep Neural Networks have achieved remarkable breakthroughs over the past decade, data augmentations have garnered increasing attention as regularization techniques when massive labeled data are unavailable. Among existing augmentations, Mixup and related data-mixing methods that convexly combine selected samples and their corresponding labels are widely adopted because they yield strong performance by generating data-dependent virtual samples while migrating easily across domains. This survey presents a comprehensive review of foundational mixup methods and their applications. We first elaborate on the training pipeline with mixup augmentations as a unified framework composed of modules; this reformulated framework can accommodate various mixup methods and provides intuitive operational procedures. Then, we systematically investigate the applications of mixup augmentations to downstream vision tasks and various data modalities, along with theoretical analyses of mixup. Finally, we summarize the current status and limitations of mixup research and point out directions for future work on effective and efficient mixup augmentations. This survey provides researchers with the current state of the art in mixup methods and offers insights and guidance for the mixup arena. An online project accompanying this survey is available at https://github.com/Westlake-AI/Awesome-Mixup.
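The convex combination of samples and labels described in the abstract is the core of vanilla mixup and can be sketched in a few lines. The function name, signature, and default `alpha` below are illustrative choices, not taken from the survey itself:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Vanilla mixup: convexly combine two samples and their one-hot labels.

    The mixing coefficient lam is drawn from Beta(alpha, alpha); smaller
    alpha concentrates lam near 0 or 1 (weak mixing), larger alpha pushes
    lam toward 0.5 (strong mixing).
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)
    # The same coefficient interpolates both the inputs and the labels,
    # producing a data-dependent virtual training sample.
    x_mix = lam * x1 + (1.0 - lam) * x2
    y_mix = lam * y1 + (1.0 - lam) * y2
    return x_mix, y_mix, lam
```

In practice the pair (x2, y2) is usually a random permutation of the current mini-batch, so mixing adds essentially no overhead to the training loop.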
Related papers
- ProxiMix: Enhancing Fairness with Proximity Samples in Subgroups [17.672299431705262]
Linear mixup alone, used as a data augmentation technique for bias mitigation, can still retain biases present in dataset labels.
We propose a novel pre-processing strategy in which both an existing mixup method and our new bias mitigation algorithm can be utilized.
ProxiMix keeps both pairwise and proximity relationships for fairer data augmentation.
arXiv Detail & Related papers (2024-10-02T00:47:03Z)
- PowMix: A Versatile Regularizer for Multimodal Sentiment Analysis [71.8946280170493]
This paper introduces PowMix, a versatile embedding space regularizer that builds upon the strengths of unimodal mixing-based regularization approaches.
PowMix is integrated before the fusion stage of multimodal architectures and facilitates intra-modal mixing, such as mixing text with text, to act as a regularizer.
arXiv Detail & Related papers (2023-12-19T17:01:58Z)
- MiAMix: Enhancing Image Classification through a Multi-stage Augmented Mixed Sample Data Augmentation Method [0.5919433278490629]
We introduce a novel mixup method called MiAMix, which stands for Multi-stage Augmented Mixup.
MiAMix integrates image augmentation into the mixup framework, utilizes multiple diversified mixing methods concurrently, and improves the mixing method by randomly selecting mixing mask augmentation methods.
While recent methods rely on saliency information, MiAMix is also designed for computational efficiency, reducing additional overhead and offering easy integration into existing training pipelines.
arXiv Detail & Related papers (2023-08-05T06:29:46Z)
- MixupE: Understanding and Improving Mixup from Directional Derivative Perspective [86.06981860668424]
We propose an improved version of Mixup, theoretically justified to deliver better generalization performance than vanilla Mixup.
Our results show that the proposed method improves Mixup across multiple datasets using a variety of architectures.
arXiv Detail & Related papers (2022-12-27T07:03:52Z)
- A Survey of Mix-based Data Augmentation: Taxonomy, Methods, Applications, and Explainability [29.40977854491399]
Data augmentation (DA) is indispensable in modern machine learning and deep neural networks.
This survey comprehensively reviews a crucial subset of DA techniques, namely Mix-based Data Augmentation (MixDA).
In contrast to traditional DA approaches that operate on single samples or entire datasets, MixDA stands out due to its effectiveness, simplicity, flexibility, computational efficiency, theoretical foundation, and broad applicability.
arXiv Detail & Related papers (2022-12-21T09:58:14Z)
- Learning with MISELBO: The Mixture Cookbook [62.75516608080322]
We present the first ever mixture of variational approximations for a normalizing flow-based hierarchical variational autoencoder (VAE) with VampPrior and a PixelCNN decoder network.
We explain this cooperative behavior by drawing a novel connection between VI and adaptive importance sampling.
We obtain state-of-the-art results among VAE architectures in terms of negative log-likelihood on the MNIST and FashionMNIST datasets.
arXiv Detail & Related papers (2022-09-30T15:01:35Z)
- OpenMixup: Open Mixup Toolbox and Benchmark for Visual Representation Learning [53.57075147367114]
We introduce OpenMixup, the first mixup augmentation toolbox and benchmark for visual representation learning.
We train 18 representative mixup baselines from scratch and rigorously evaluate them across 11 image datasets.
We also open-source our modular backbones, including a collection of popular vision backbones, optimization strategies, and analysis toolkits.
arXiv Detail & Related papers (2022-09-11T12:46:01Z)
- RandoMix: A mixed sample data augmentation method with multiple mixed modes [12.466162659083697]
RandoMix is a mixed-sample data augmentation method designed to address robustness and diversity challenges.
We evaluate the effectiveness of RandoMix on diverse datasets, including CIFAR-10/100, Tiny-ImageNet, ImageNet, and Google Speech Commands.
arXiv Detail & Related papers (2022-05-18T05:31:36Z)
- MixAugment & Mixup: Augmentation Methods for Facial Expression Recognition [4.273075747204267]
We propose a new data augmentation strategy which is based on Mixup, called MixAugment.
We conduct an extensive experimental study that proves the effectiveness of MixAugment over Mixup and various state-of-the-art methods.
arXiv Detail & Related papers (2022-05-09T17:43:08Z)
- Harnessing Hard Mixed Samples with Decoupled Regularizer [69.98746081734441]
Mixup is an efficient data augmentation approach that improves the generalization of neural networks by smoothing the decision boundary with mixed data.
In this paper, we propose an efficient mixup objective function with a decoupled regularizer, named Decoupled Mixup (DM).
DM can adaptively utilize hard mixed samples to mine discriminative features without losing the original smoothness of mixup.
arXiv Detail & Related papers (2022-03-21T07:12:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.