Towards a Robust Differentiable Architecture Search under Label Noise
- URL: http://arxiv.org/abs/2110.12197v1
- Date: Sat, 23 Oct 2021 11:31:06 GMT
- Title: Towards a Robust Differentiable Architecture Search under Label Noise
- Authors: Christian Simon, Piotr Koniusz, Lars Petersson, Yan Han, Mehrtash
Harandi
- Abstract summary: We show that vanilla NAS algorithms suffer from a performance loss if class labels are noisy.
Our empirical evaluations show that the noise injecting operation does not degrade the performance of the NAS algorithm if the data is indeed clean.
- Score: 44.86506257979988
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) has been a game changer in designing robust
neural architectures. Architectures designed by NAS outperform or compete with
the best manual network designs in terms of accuracy, size, memory footprint
and FLOPs. That said, previous studies focus on developing NAS algorithms for
clean high quality data, a restrictive and somewhat unrealistic assumption. In
this paper, focusing on the differentiable NAS algorithms, we show that vanilla
NAS algorithms suffer from a performance loss if class labels are noisy. To
combat this issue, we make use of the principle of information bottleneck as a
regularizer. This leads us to develop a noise injecting operation that is
included during the learning process, preventing the network from learning from
noisy samples. Our empirical evaluations show that the noise injecting
operation does not degrade the performance of the NAS algorithm if the data is
indeed clean. In contrast, if the data is noisy, the architecture learned by
our algorithm comfortably outperforms algorithms specifically equipped with
sophisticated mechanisms to learn in the presence of label noise. Unlike many
algorithms designed to work in the presence of noisy labels, our algorithm
requires no prior knowledge about the properties or characteristics of the
noise.
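The abstract does not include an implementation, but the core idea of injecting noise into a differentiable NAS cell during the search can be illustrated with a minimal PyTorch sketch; the NoisyMixedOp class, the placement of the Gaussian noise, and the noise_std value below are assumptions made for illustration, not the authors' exact formulation.

```python
# Illustrative sketch only: a DARTS-style mixed operation with a Gaussian
# noise-injection step during search, loosely following the information-
# bottleneck idea described in the abstract. The class name (NoisyMixedOp),
# the noise placement, and noise_std are assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoisyMixedOp(nn.Module):
    def __init__(self, candidate_ops, noise_std=0.1):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)                      # candidate operations on one edge
        self.alpha = nn.Parameter(torch.zeros(len(candidate_ops)))   # architecture parameters
        self.noise_std = noise_std

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)                       # continuous relaxation of the choice
        out = sum(w * op(x) for w, op in zip(weights, self.ops))
        if self.training and self.noise_std > 0:
            # Inject Gaussian noise while searching; the perturbation limits how
            # much information about (possibly noisy) labels the features can carry.
            out = out + self.noise_std * torch.randn_like(out)
        return out


# Toy usage: an edge mixing identity and a 3x3 convolution on 16-channel features.
edge = NoisyMixedOp([nn.Identity(), nn.Conv2d(16, 16, 3, padding=1)])
y = edge(torch.randn(2, 16, 8, 8))
```

In a full search, every edge of the cell would use such a mixed operation and the architecture parameters alpha would be optimized on validation data, as in DARTS.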
Related papers
- Delta-NAS: Difference of Architecture Encoding for Predictor-based Evolutionary Neural Architecture Search [5.1331676121360985]
We craft an algorithm capable of performing fine-grained NAS at a low cost.
We propose projecting the problem to a lower-dimensional space by predicting the difference in accuracy between a pair of similar networks.
arXiv Detail & Related papers (2024-11-21T02:43:32Z)
- NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks [76.8112416450677]
Siamese networks are among the most popular methods for self-supervised visual representation learning (SSL).
NASiam is a novel approach that, for the first time, uses differentiable NAS to improve the multilayer perceptron projector and predictor (encoder/predictor pair).
NASiam reaches competitive performance on both small-scale (i.e., CIFAR-10/CIFAR-100) and large-scale (i.e., ImageNet) image classification datasets while costing only a few GPU hours.
arXiv Detail & Related papers (2023-01-31T19:48:37Z)
- Generalization Properties of NAS under Activation and Skip Connection Search [66.8386847112332]
We study the generalization properties of Neural Architecture Search (NAS) under a unifying framework.
We derive the lower (and upper) bounds of the minimum eigenvalue of the Neural Tangent Kernel (NTK) under the (in)finite-width regime.
We show how the derived results can guide NAS to select the top-performing architectures, even without training.
arXiv Detail & Related papers (2022-09-15T12:11:41Z)
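As a rough illustration of such training-free guidance, the sketch below scores an untrained network by the minimum eigenvalue of its empirical NTK Gram matrix on a small batch; the toy models, batch size, and the choice of summed logits as the scalar output are assumptions for illustration, not the setup analyzed in the paper above.

```python
# Illustrative sketch: rank untrained candidate networks by the minimum
# eigenvalue of their empirical NTK Gram matrix on a small batch. The toy
# models, batch size, and summed-logits scalar output are assumptions made
# for illustration only.
import torch
import torch.nn as nn


def ntk_min_eigenvalue(model, inputs):
    """K[i, j] = <g_i, g_j>, where g_i is the gradient of a scalar output for
    sample i w.r.t. all parameters; returns the smallest eigenvalue of K."""
    grads = []
    for i in range(inputs.shape[0]):
        model.zero_grad()
        model(inputs[i : i + 1]).sum().backward()   # scalar output per sample
        grads.append(torch.cat([p.grad.flatten() for p in model.parameters()]))
    jac = torch.stack(grads)                        # shape: (batch, num_params)
    kernel = jac @ jac.T                            # empirical NTK Gram matrix
    return torch.linalg.eigvalsh(kernel).min().item()


# Score two toy candidates at initialization, without any training.
x = torch.randn(8, 32)
candidates = {
    "shallow_mlp": nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)),
    "deep_mlp": nn.Sequential(nn.Linear(32, 64), nn.ReLU(),
                              nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 10)),
}
print({name: ntk_min_eigenvalue(m, x) for name, m in candidates.items()})
```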
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z)
- Heed the Noise in Performance Evaluations in Neural Architecture Search [0.0]
Neural Architecture Search (NAS) has recently become a topic of great interest.
There is a potentially impactful issue within NAS that remains largely unrecognized: noise.
We propose to reduce this noise by averaging architecture evaluation scores over multiple network training runs.
arXiv Detail & Related papers (2022-02-04T11:20:46Z)
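A minimal sketch of that averaging step, assuming a hypothetical train_and_evaluate routine (not part of the paper) that trains an architecture with a given seed and returns a validation score:

```python
# Minimal sketch: reduce evaluation noise by averaging validation scores over
# several independent training runs. train_and_evaluate() is a hypothetical
# stand-in for the user's own training-and-validation routine.
import statistics


def evaluate_architecture(architecture, train_and_evaluate, num_runs=3):
    scores = [train_and_evaluate(architecture, seed=run) for run in range(num_runs)]
    return statistics.mean(scores)
```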
- NAS-FCOS: Efficient Search for Object Detection Architectures [113.47766862146389]
We propose an efficient method to obtain better object detectors by searching for the feature pyramid network (FPN) and the prediction head of a simple anchor-free object detector.
With a carefully designed search space, search algorithms, and strategies for evaluating network quality, we are able to find top-performing detection architectures within 4 days using 8 V100 GPUs.
arXiv Detail & Related papers (2021-10-24T12:20:04Z)
- Generic Neural Architecture Search via Regression [27.78105839644199]
We propose a novel and generic neural architecture search (NAS) framework, termed Generic NAS (GenNAS).
GenNAS does not use task-specific labels but instead adopts regression on a set of manually designed synthetic signal bases for architecture evaluation.
We then propose an automatic task search to optimize the combination of synthetic signals using limited downstream-task-specific labels.
arXiv Detail & Related papers (2021-08-04T08:21:12Z)
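A rough, label-free evaluation in that spirit might regress a small synthetic signal basis and rank candidates by the final regression loss; the sine basis, step count, and toy model below are illustrative assumptions rather than GenNAS's actual signal design.

```python
# Illustrative sketch of label-free architecture evaluation in the spirit of
# GenNAS: briefly regress a manually designed synthetic signal (here a small
# sine basis) and use the final MSE as the score. The basis, step count, and
# toy model are assumptions, not GenNAS's actual signal design.
import torch
import torch.nn as nn


def regression_score(model, inputs, steps=50, lr=1e-2):
    freqs = torch.linspace(1.0, 4.0, model[-1].out_features)
    target = torch.sin(inputs[:, :1] * freqs)          # synthetic signal bases
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(inputs), target)
        loss.backward()
        optimizer.step()
    return loss.item()                                  # lower is better


x = torch.randn(64, 16)
candidate = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
print(regression_score(candidate, x))
```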
- Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z)
- Exploring the Loss Landscape in Neural Architecture Search [15.830099254570959]
We show that the simplest hill-climbing algorithm is a powerful baseline for NAS.
We also show that the number of local minima is substantially reduced as the noise decreases.
arXiv Detail & Related papers (2020-05-06T17:09:16Z)
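A hill-climbing NAS baseline of the kind referenced above fits in a few lines; random_architecture, neighbors, and evaluate are hypothetical hooks that a concrete search space would have to provide.

```python
# Minimal hill-climbing sketch for NAS: repeatedly move to the best-scoring
# neighbouring architecture until no neighbour improves on the current one.
# random_architecture(), neighbors(), and evaluate() are hypothetical hooks
# supplied by a concrete search space and evaluation routine.
def hill_climb(random_architecture, neighbors, evaluate, max_steps=100):
    current = random_architecture()
    current_score = evaluate(current)
    for _ in range(max_steps):
        scored = [(evaluate(n), n) for n in neighbors(current)]
        if not scored:
            break
        best_score, best = max(scored, key=lambda pair: pair[0])
        if best_score <= current_score:      # local optimum reached
            break
        current, current_score = best, best_score
    return current, current_score
```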
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods while the accuracy is currently state-of-the-art, at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.