NASS: Optimizing Secure Inference via Neural Architecture Search
- URL: http://arxiv.org/abs/2001.11854v3
- Date: Sun, 16 Feb 2020 10:54:21 GMT
- Title: NASS: Optimizing Secure Inference via Neural Architecture Search
- Authors: Song Bian, Weiwen Jiang, Qing Lu, Yiyu Shi, Takashi Sato
- Abstract summary: We propose NASS, an integrated framework to search for tailored NN architectures designed specifically for secure inference (SI).
We show that we can achieve the best of both worlds by using NASS, where the prediction accuracy can be improved from 81.6% to 84.6%, while the inference runtime is reduced by 2x and communication bandwidth by 1.9x on the CIFAR-10 dataset.
- Score: 21.72469549507192
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to increasing privacy concerns, neural network (NN) based secure
inference (SI) schemes that simultaneously hide the client inputs and server
models have attracted major research interest. While existing works focus on
developing secure protocols for NN-based SI, in this work, we take a different
approach. We propose NASS, an integrated framework to search for tailored NN
architectures designed specifically for SI. In particular, we propose to model
cryptographic protocols as design elements with associated reward functions.
The characterized models are then used in a joint optimization with predicted
hyperparameters to identify the best NN architectures that balance prediction
accuracy and execution efficiency. In the experiments, we demonstrate that we
can achieve the best of both worlds by using NASS: the prediction accuracy is
improved from 81.6% to 84.6%, while the inference runtime is reduced by 2x and
the communication bandwidth by 1.9x on the CIFAR-10 dataset.
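The abstract frames secure-inference cost as part of the search objective: cryptographic protocols are characterized as design elements with reward functions, and candidate architectures are scored on both accuracy and execution efficiency. The sketch below is a minimal illustration of such a combined reward; the linear-penalty form, the budget values, and all names (SICostEstimate, si_reward, alpha, beta) are assumptions for illustration, not the paper's actual reward model.

```python
from dataclasses import dataclass

@dataclass
class SICostEstimate:
    """Estimated secure-inference cost of a candidate architecture.

    Placeholder fields; NASS derives such costs from characterized
    cryptographic protocols used to evaluate each layer.
    """
    runtime_s: float      # estimated end-to-end inference latency
    bandwidth_mb: float   # estimated client-server communication volume

def si_reward(accuracy: float, cost: SICostEstimate,
              runtime_budget_s: float = 10.0,
              bandwidth_budget_mb: float = 500.0,
              alpha: float = 0.5, beta: float = 0.5) -> float:
    """Toy reward balancing accuracy against secure-inference cost.

    Higher accuracy increases the reward; exceeding the runtime or
    bandwidth budgets decreases it. The penalty shape and the weights
    alpha/beta are illustrative assumptions only.
    """
    runtime_penalty = max(0.0, cost.runtime_s / runtime_budget_s - 1.0)
    bandwidth_penalty = max(0.0, cost.bandwidth_mb / bandwidth_budget_mb - 1.0)
    return accuracy - alpha * runtime_penalty - beta * bandwidth_penalty

# Example: a candidate at 84.6% accuracy that stays within both budgets
# keeps its full accuracy as the reward.
print(si_reward(0.846, SICostEstimate(runtime_s=8.2, bandwidth_mb=420.0)))
```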
Related papers
- Neural Architecture Search using Particle Swarm and Ant Colony Optimization [0.0]
This paper focuses on training and optimizing CNNs using the Swarm Intelligence (SI) components of OpenNAS.
A system integrating open source tools for Neural Architecture Search (OpenNAS) has been developed for image classification.
arXiv Detail & Related papers (2024-03-06T15:23:26Z)
- SONATA: Self-adaptive Evolutionary Framework for Hardware-aware Neural Architecture Search [0.7646713951724011]
HW-aware Neural Architecture Search (HW-aware NAS) emerges as an attractive strategy to automate the design of NNs.
We propose SONATA, a self-adaptive evolutionary algorithm for HW-aware NAS.
Our method leverages adaptive evolutionary operators guided by the learned importance of NN design parameters.
arXiv Detail & Related papers (2024-02-20T18:15:11Z)
- LitE-SNN: Designing Lightweight and Efficient Spiking Neural Network through Spatial-Temporal Compressive Network Search and Joint Optimization [48.41286573672824]
Spiking Neural Networks (SNNs) mimic the information-processing mechanisms of the human brain and are highly energy-efficient.
We propose a new approach named LitE-SNN that incorporates both spatial and temporal compression into the automated network design process.
arXiv Detail & Related papers (2024-01-26T05:23:11Z)
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
- DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models [56.584561770857306]
We propose a novel conditional Neural Architecture Generation (NAG) framework based on diffusion models, dubbed DiffusionNAG.
Specifically, we consider the neural architectures as directed graphs and propose a graph diffusion model for generating them.
We validate the effectiveness of DiffusionNAG through extensive experiments in two predictor-based NAS scenarios: Transferable NAS and Bayesian Optimization (BO)-based NAS.
When integrated into a BO-based algorithm, DiffusionNAG outperforms existing BO-based NAS approaches, particularly in the large MobileNetV3 search space on the ImageNet 1K dataset.
arXiv Detail & Related papers (2023-05-26T13:58:18Z)
- HQNAS: Auto CNN deployment framework for joint quantization and architecture search [30.45926484863791]
We propose a novel neural network design framework called Hardware-aware Quantized Neural Architecture Search (HQNAS).
It takes only 4 GPU hours to discover an outstanding NN policy on CIFAR10.
It also takes only 10% of the GPU time to generate a comparable model on ImageNet.
arXiv Detail & Related papers (2022-10-16T08:32:18Z)
- RoHNAS: A Neural Architecture Search Framework with Conjoint Optimization for Adversarial Robustness and Hardware Efficiency of Convolutional and Capsule Networks [10.946374356026679]
RoHNAS is a novel framework that jointly optimizes for the adversarial robustness and hardware efficiency of Deep Neural Networks (DNNs).
To reduce exploration time, RoHNAS analyzes and selects appropriate adversarial-perturbation values for each dataset to employ in the NAS flow.
arXiv Detail & Related papers (2022-10-11T09:14:56Z)
- Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z)
- BRP-NAS: Prediction-based NAS using GCNs [21.765796576990137]
BRP-NAS is an efficient hardware-aware NAS enabled by an accurate performance predictor based on a graph convolutional network (GCN).
We show that our proposed method outperforms all prior methods on NAS-Bench-101 and NAS-Bench-201.
We also release LatBench -- a latency dataset of NAS-Bench-201 models running on a broad range of devices.
arXiv Detail & Related papers (2020-07-16T21:58:43Z)
- Accuracy Prediction with Non-neural Model for Neural Architecture Search [185.0651567642238]
We study an alternative approach which uses a non-neural model for accuracy prediction.
We leverage a gradient boosting decision tree (GBDT) as the predictor for neural architecture search (NAS).
Experiments on NASBench-101 and ImageNet demonstrate the effectiveness of using GBDT as the predictor for NAS; a minimal illustrative sketch of this predictor idea appears after this list.
arXiv Detail & Related papers (2020-07-09T13:28:49Z)
- Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery [54.60327265077322]
We study two important problems in the automated design of neural networks through the lens of sparse recovery methods.
In the first part of this paper, we establish a novel connection between HPO and structured sparse recovery.
In the second part of this paper, we establish a connection between NAS and structured sparse recovery.
arXiv Detail & Related papers (2020-07-07T00:57:09Z)
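As referenced in the GBDT entry above, the sketch below illustrates the general predictor-based NAS idea: encode each candidate architecture as a fixed-length feature vector, fit a gradient-boosted tree regressor on architectures with known accuracies, and rank unevaluated candidates by predicted accuracy. The encoding, synthetic data, and hyperparameters are assumptions for illustration and do not reflect the cited paper's actual setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Placeholder encoding: each architecture is a fixed-length vector of
# discrete choices (e.g., operator type per layer, channel multiplier).
n_trained, n_candidates, n_features = 200, 1000, 16
trained_archs = rng.integers(0, 5, size=(n_trained, n_features)).astype(float)

# Synthetic "measured" accuracies standing in for real training results.
trained_acc = 0.7 + 0.02 * trained_archs.mean(axis=1) + rng.normal(0, 0.01, n_trained)

# Fit the GBDT accuracy predictor on (architecture encoding, accuracy) pairs.
predictor = GradientBoostingRegressor(n_estimators=200, max_depth=3)
predictor.fit(trained_archs, trained_acc)

# Rank a pool of unevaluated candidates by predicted accuracy and keep the
# top few for actual (expensive) training and evaluation.
candidates = rng.integers(0, 5, size=(n_candidates, n_features)).astype(float)
predicted_acc = predictor.predict(candidates)
top_k = candidates[np.argsort(predicted_acc)[::-1][:10]]
print("Top candidate encodings:\n", top_k)
```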
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.