On the Privacy Risks of Cell-Based NAS Architectures
- URL: http://arxiv.org/abs/2209.01688v1
- Date: Sun, 4 Sep 2022 20:24:04 GMT
- Title: On the Privacy Risks of Cell-Based NAS Architectures
- Authors: Hai Huang, Zhikun Zhang, Yun Shen, Michael Backes, Qi Li, Yang Zhang
- Abstract summary: We systematically measure the privacy risks of NAS architectures.
We shed light on how to design robust NAS architectures against privacy attacks.
We offer a general methodology to understand the hidden correlation between the NAS-searched architectures and other privacy risks.
- Score: 28.71028000150282
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Existing studies on neural architecture search (NAS) mainly focus on
efficiently and effectively searching for network architectures with better
performance. Little progress has been made toward systematically understanding
whether NAS-searched architectures are robust to privacy attacks, even though
abundant work has already shown that human-designed architectures are prone to
them. In this paper, we fill this gap and systematically measure the privacy
risks of NAS architectures. Leveraging the insights from our measurement study,
we further explore the cell patterns of cell-based NAS architectures and
evaluate how the cell patterns affect the privacy risks of NAS-searched
architectures. Through extensive experiments, we shed light on how to design
robust NAS architectures against privacy attacks, and also offer a general
methodology to understand the hidden correlation between the NAS-searched
architectures and other privacy risks.
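The "cell patterns" mentioned above can be made concrete with a small sketch. The genotype format and operation names below are illustrative assumptions modeled on DARTS-style cell encodings, not the paper's actual implementation:

```python
# Illustrative sketch: a cell-based NAS architecture is a small DAG whose
# edges carry candidate operations; the full network stacks many copies of
# the searched cell. The genotype format and op names are assumptions
# (DARTS-style), not taken from the paper.

# A genotype: each entry is (operation, input_node) for one edge of the cell.
genotype = [
    ("sep_conv_3x3", 0), ("skip_connect", 1),
    ("sep_conv_3x3", 1), ("skip_connect", 0),
    ("max_pool_3x3", 2), ("sep_conv_3x3", 3),
]

def op_histogram(cell):
    """Count how often each operation appears in a cell; distributions of
    operations like this are the kind of 'cell pattern' a measurement study
    can correlate with privacy risk."""
    hist = {}
    for op, _ in cell:
        hist[op] = hist.get(op, 0) + 1
    return hist

hist = op_histogram(genotype)
```

A study in this vein would compute such pattern statistics over many searched cells and compare them against, e.g., membership-inference attack success rates.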
Related papers
- Generalizable Lightweight Proxy for Robust NAS against Diverse Perturbations [59.683234126055694]
Recent neural architecture search (NAS) frameworks have been successful in finding optimal architectures for given conditions.
We propose a novel lightweight robust zero-cost proxy that considers the consistency across features, parameters, and gradients of both clean and perturbed images.
Our approach facilitates an efficient and rapid search for neural architectures capable of learning generalizable features that exhibit robustness across diverse perturbations.
arXiv Detail & Related papers (2023-06-08T08:34:26Z)
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z)
- On Redundancy and Diversity in Cell-based Neural Architecture Search [44.337381243798085]
We conduct an empirical analysis of architectures from the popular cell-based search spaces.
We find that the architecture performance is minimally sensitive to changes at large parts of the cells.
By explicitly constraining cells to include these patterns, randomly sampled architectures can match or even outperform the state of the art.
arXiv Detail & Related papers (2022-03-16T18:59:29Z)
- Towards Less Constrained Macro-Neural Architecture Search [2.685668802278155]
Neural Architecture Search (NAS) networks achieve state-of-the-art performance in a variety of tasks.
Most NAS methods rely heavily on human-defined assumptions that constrain the search.
We present experiments showing that LCMNAS, the proposed less-constrained macro-NAS method, generates state-of-the-art architectures from scratch with minimal GPU computation.
arXiv Detail & Related papers (2022-03-10T17:53:03Z) - Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z) - Memory-Efficient Hierarchical Neural Architecture Search for Image
Restoration [68.6505473346005]
We propose HiNAS, a memory-efficient hierarchical NAS framework, for image denoising and image super-resolution tasks.
With a single GTX 1080 Ti GPU, it takes only about 1 hour to search for the denoising network on BSD500 and 3.5 hours for the super-resolution structure on DIV2K.
arXiv Detail & Related papers (2020-12-24T12:06:17Z) - Neural Network Design: Learning from Neural Architecture Search [3.9430294028981763]
Neural Architecture Search (NAS) aims to optimize deep neural networks' architecture for better accuracy or smaller computational cost.
Despite various successful approaches proposed to solve the NAS task, its landscape and properties are rarely investigated.
arXiv Detail & Related papers (2020-11-01T15:02:02Z) - Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z) - Breaking the Curse of Space Explosion: Towards Efficient NAS with
Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
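The curriculum idea, growing the search space in stages while carrying the best candidate forward, can be sketched in a few lines. The operation names, stage schedule, and scoring function below are stand-ins for illustration, not CNAS's actual method:

```python
import random

# Illustrative sketch of curriculum-style search: begin with a small set of
# candidate operations and progressively enlarge it, keeping the best
# architecture found so far across stages. The op names and the toy
# scoring function are assumptions, not taken from the CNAS paper.

def score(arch):
    # Stand-in evaluation: reward architectures with more conv ops.
    return sum(op.startswith("conv") for op in arch)

def curriculum_search(stages, arch_len=4, trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for ops in stages:  # each stage exposes a strictly larger search space
        for _ in range(trials):
            cand = [rng.choice(ops) for _ in range(arch_len)]
            if best is None or score(cand) > score(best):
                best = cand
    return best

stages = [
    ["conv_3x3", "skip"],                      # small initial space
    ["conv_3x3", "skip", "conv_5x5"],          # expanded
    ["conv_3x3", "skip", "conv_5x5", "pool"],  # full space
]
best = curriculum_search(stages)
```

The point of the staging is that early, cheap stages prune the search before the combinatorially larger full space is ever sampled.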
arXiv Detail & Related papers (2020-07-07T02:29:06Z)
- Learning Architectures from an Extended Search Space for Language Modeling [37.79977691127229]
We present a general approach to learning both intra-cell and inter-cell architectures in neural architecture search (NAS).
For recurrent neural language modeling, it outperforms a strong baseline significantly on the PTB and WikiText data, with a new state-of-the-art on PTB.
The learned architectures show good transferability to other systems.
arXiv Detail & Related papers (2020-05-06T05:02:33Z)
- Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating point operations, memory requirements and inference time.
arXiv Detail & Related papers (2020-04-23T14:16:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.