Effects of Exponential Gaussian Distribution on (Double Sampling) Randomized Smoothing
- URL: http://arxiv.org/abs/2406.02309v2
- Date: Wed, 5 Jun 2024 12:10:16 GMT
- Title: Effects of Exponential Gaussian Distribution on (Double Sampling) Randomized Smoothing
- Authors: Youwei Shu, Xi Xiao, Derui Wang, Yuxin Cao, Siji Chen, Jason Xue, Linyi Li, Bo Li
- Abstract summary: We study the effect of two families of distributions, named Exponential Standard Gaussian (ESG) and Exponential General Gaussian (EGG) distributions, on Randomized Smoothing and Double Sampling Randomized Smoothing (DSRS).
Our experiments on real-world datasets confirm our theoretical analysis of the ESG distributions: they provide almost the same certification under different exponents $\eta$ for both RS and DSRS.
Compared to the primitive DSRS, the increase in certified accuracy provided by EGG is prominent, up to 6.4% on ImageNet.
- Score: 21.618349628349115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Randomized Smoothing (RS) is currently a scalable certified defense method providing robustness certification against adversarial examples. Although significant progress has been achieved in providing defenses against $\ell_p$ adversaries, the interaction between the smoothing distribution and the robustness certification still remains vague. In this work, we comprehensively study the effect of two families of distributions, named Exponential Standard Gaussian (ESG) and Exponential General Gaussian (EGG) distributions, on Randomized Smoothing and Double Sampling Randomized Smoothing (DSRS). We derive an analytic formula for ESG's certified radius, which converges to the original formula of RS as the dimension $d$ increases. Additionally, we prove that EGG can provide tighter constant factors than DSRS in providing $\Omega(\sqrt{d})$ lower bounds of $\ell_2$ certified radius, and thus further addresses the curse of dimensionality in RS. Our experiments on real-world datasets confirm our theoretical analysis of the ESG distributions, that they provide almost the same certification under different exponents $\eta$ for both RS and DSRS. In addition, EGG brings a significant improvement to the DSRS certification, but the mechanism can be different when the classifier properties are different. Compared to the primitive DSRS, the increase in certified accuracy provided by EGG is prominent, up to 6.4% on ImageNet.
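For reference, the "original formula of RS" is the standard Gaussian-smoothing certificate of Cohen et al. (2019): given a lower bound $p_A$ on the top-class probability under noise, the certified $\ell_2$ radius is $\sigma\,\Phi^{-1}(p_A)$. Below is a minimal Monte Carlo sketch of that baseline certification; the base classifier `f` and its batched-label interface are assumptions, not part of the paper.

```python
import numpy as np
from scipy.stats import beta, norm

def certify_rs(f, x, sigma=0.5, n=10000, alpha=0.001):
    """Monte Carlo certification for Gaussian randomized smoothing.

    f: base classifier mapping a batch of inputs to integer labels.
    Returns (top class, certified l2 radius), or (None, 0.0) on abstain.
    """
    noise = sigma * np.random.randn(n, *x.shape)
    preds = f(x[None, ...] + noise)             # labels under noise, shape (n,)
    counts = np.bincount(preds)
    c_hat = int(counts.argmax())
    k = int(counts[c_hat])
    # One-sided Clopper-Pearson lower confidence bound on p_A.
    p_lower = beta.ppf(alpha, k, n - k + 1)
    if p_lower <= 0.5:
        return None, 0.0                         # abstain
    return c_hat, sigma * norm.ppf(p_lower)      # Cohen et al. (2019) radius
```

Under ESG/EGG, the noise distribution changes, but the sample-then-bound structure above is presumably analogous.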
Related papers
- Theoretical Convergence Guarantees for Variational Autoencoders [2.8167997311962942]
Variational Autoencoders (VAE) are popular generative models used to sample from complex data distributions.
This paper bridges the gap between empirical practice and theory by providing non-asymptotic convergence guarantees for VAE trained with both Gradient Descent and Adam algorithms.
Our theoretical analysis applies to both Linear VAE and Deep Gaussian VAE, as well as several VAE variants, including $\beta$-VAE and IWAE.
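For context, the training objective these guarantees concern is the standard evidence lower bound (ELBO); the $\beta$-VAE variant reweights the KL term (standard notation, not taken from the paper itself):

```latex
\mathcal{L}_{\beta}(\theta, \phi; x)
  = \mathbb{E}_{q_{\phi}(z \mid x)}\!\left[\log p_{\theta}(x \mid z)\right]
  - \beta \, \mathrm{KL}\!\left(q_{\phi}(z \mid x) \,\|\, p(z)\right)
```

Setting $\beta = 1$ recovers the vanilla VAE objective; IWAE instead tightens the bound with multiple importance-weighted samples.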
arXiv Detail & Related papers (2024-10-22T07:12:38Z) - Mitigating the Curse of Dimensionality for Certified Robustness via Dual Randomized Smoothing [48.219725131912355]
This paper explores the feasibility of providing $\ell_2$ certified robustness for high-dimensional input through the utilization of dual smoothing.
The proposed Dual Randomized Smoothing (DRS) down-samples the input image into two sub-images and smooths the two sub-images in lower dimensions.
Extensive experiments demonstrate the generalizability and effectiveness of DRS, which exhibits a notable capability to integrate with established methodologies.
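A minimal sketch of the dual-smoothing idea as summarized above: split an image into two lower-dimensional sub-images and add Gaussian noise to each. The interleaved row split below is an illustrative assumption, not necessarily the paper's exact down-sampling scheme.

```python
import numpy as np

def dual_smooth_samples(x, sigma, n):
    """Split an HxWxC image into two interleaved sub-images and
    add Gaussian smoothing noise to each in the lower-dimensional space."""
    sub_a = x[::2, ...]    # even rows -> first sub-image
    sub_b = x[1::2, ...]   # odd rows  -> second sub-image
    noisy_a = sub_a[None] + sigma * np.random.randn(n, *sub_a.shape)
    noisy_b = sub_b[None] + sigma * np.random.randn(n, *sub_b.shape)
    return noisy_a, noisy_b  # each batch is fed to its own base classifier
```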
arXiv Detail & Related papers (2024-04-15T08:54:33Z) - Soft Random Sampling: A Theoretical and Empirical Analysis [59.719035355483875]
Soft random sampling (SRS) is a simple yet effective approach for efficient training of deep neural networks when dealing with massive data.
It selects a subset uniformly at random, with replacement, from the full data set in each epoch.
It is shown to be a powerful and competitive strategy, with significant performance gains on real-world, industrial-scale applications.
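In code, the per-epoch selection reads as a single call. A minimal sketch; the `selection_rate` parameter name is an assumption.

```python
import numpy as np

def srs_epoch_indices(num_examples, selection_rate=0.8, rng=None):
    """Soft random sampling: draw a fraction of the data set
    uniformly at random *with replacement* each epoch."""
    rng = rng or np.random.default_rng()
    k = int(selection_rate * num_examples)
    return rng.choice(num_examples, size=k, replace=True)
```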
arXiv Detail & Related papers (2023-11-21T17:03:21Z) - The Lipschitz-Variance-Margin Tradeoff for Enhanced Randomized Smoothing [85.85160896547698]
Real-life applications of deep neural networks are hindered by their unsteady predictions when faced with noisy inputs and adversarial attacks.
We show how to design an efficient classifier with a certified radius by relying on noise injection into the inputs.
Our novel certification procedure allows us to use pre-trained models with randomized smoothing, effectively improving the current certification radius in a zero-shot manner.
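For orientation, the baseline trade-off the paper sharpens is the classical Lipschitz-margin bound (this is the textbook form, not the paper's refined variance-aware result): for a logit map $f$ that is $L$-Lipschitz from $(\mathbb{R}^d, \ell_2)$ to $(\mathbb{R}^K, \ell_2)$,

```latex
M(x) = f_y(x) - \max_{k \neq y} f_k(x), \qquad
\|\delta\|_2 < \frac{M(x)}{\sqrt{2}\, L}
\;\Longrightarrow\; \arg\max_k f_k(x + \delta) = y
```

so a larger margin or a smaller Lipschitz constant directly enlarges the certified radius.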
arXiv Detail & Related papers (2023-09-28T22:41:47Z) - [Re] Double Sampling Randomized Smoothing [2.6763498831034043]
This paper is a contribution to the reproducibility challenge in machine learning, specifically addressing the issue of certifying the robustness of neural networks (NNs) against adversarial perturbations.
The proposed Double Sampling Randomized Smoothing (DSRS) framework overcomes the limitations of existing methods by using an additional smoothing distribution to improve the robustness certification.
arXiv Detail & Related papers (2023-06-27T05:46:18Z) - DAPDAG: Domain Adaptation via Perturbed DAG Reconstruction [78.76115370275733]
We learn an auto-encoder that performs inference on population statistics given features and reconstructs a directed acyclic graph (DAG) as an auxiliary task.
The underlying DAG structure is assumed invariant among observed variables, whose conditional distributions are allowed to vary across domains governed by a latent environmental variable $E$.
We train the encoder and decoder jointly in an end-to-end manner and conduct experiments on synthetic and real datasets with mixed variables.
arXiv Detail & Related papers (2022-08-02T11:43:03Z) - Double Sampling Randomized Smoothing [19.85592163703077]
We propose a Double Sampling Randomized Smoothing (DSRS) framework.
It exploits the sampled probability from an additional smoothing distribution to tighten the robustness certification of the previous smoothed classifier.
We show that DSRS certifies larger robust radii than existing baselines consistently under different settings.
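A schematic of the extra information DSRS uses: estimate the top-class probability under the original smoothing distribution and under an additional one, then feed both estimates into the certification. The truncated-Gaussian choice matches the paper's additional distribution in spirit, but the rejection-sampling details below are our assumption.

```python
import numpy as np

def two_distribution_estimates(f, x, label, sigma, trunc_radius, n=10000):
    """Estimate P(f(x+eps) == label) under a Gaussian and under the
    same Gaussian truncated to an l2 ball of radius trunc_radius.
    DSRS-style certification combines both estimates."""
    eps = sigma * np.random.randn(n, *x.shape)
    p_gauss = np.mean(f(x[None] + eps) == label)
    norms = np.sqrt((eps.reshape(n, -1) ** 2).sum(axis=1))
    inside = norms <= trunc_radius            # rejection step for truncation
    p_trunc = (np.mean(f(x[None] + eps[inside]) == label)
               if inside.any() else 0.0)
    return p_gauss, p_trunc
```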
arXiv Detail & Related papers (2022-06-16T04:34:28Z) - Improving Differentially Private SGD via Randomly Sparsified Gradients [31.295035726077366]
Differentially private stochastic gradient descent (DP-SGD) has been widely adopted in deep learning to provide a rigorously defined privacy guarantee.
We propose to utilize random sparsification (RS) of gradients to reduce communication cost and strengthen the privacy bound.
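A minimal sketch of how random sparsification could slot into a DP-SGD step: zero out a random subset of gradient coordinates before clipping and noising. The parameter names and the exact placement of the sparsification step are assumptions based on the summary, not the paper's algorithm; real DP-SGD also clips per example rather than per batch.

```python
import numpy as np

def dpsgd_rs_step(grad, clip_norm=1.0, noise_mult=1.0, keep_rate=0.5,
                  rng=None):
    """One DP-SGD gradient update with random sparsification (RS)."""
    rng = rng or np.random.default_rng()
    mask = rng.random(grad.shape) < keep_rate    # keep each coord w.p. keep_rate
    g = grad * mask                               # randomly sparsified gradient
    g = g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))  # l2 clipping
    g += rng.normal(0.0, noise_mult * clip_norm, size=g.shape)  # Gaussian noise
    return g
```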
arXiv Detail & Related papers (2021-12-01T21:43:34Z) - ANCER: Anisotropic Certification via Sample-wise Volume Maximization [134.7866967491167]
We introduce ANCER, a framework for obtaining anisotropic certificates for a given test set sample via volume maximization.
Results demonstrate that ANCER improves certified accuracy on both CIFAR-10 and ImageNet at multiple radii, while certifying substantially larger regions in terms of volume.
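The anisotropic idea can be pictured as replacing a single $\sigma$ with a per-coordinate scale vector $\theta$ and comparing candidate certificates by a volume proxy. The $\sum_i \log \theta_i$ proxy and the helper names below are our illustrative assumptions.

```python
import numpy as np

def volume_proxy(theta):
    """Log-volume of an axis-aligned ellipsoid with semi-axes theta
    (up to an additive constant); larger means a bigger certified region."""
    return np.log(theta).sum()

def anisotropic_noise(x, theta, n, rng=None):
    """Sample n smoothing perturbations with per-coordinate scales theta."""
    rng = rng or np.random.default_rng()
    return x[None] + theta[None] * rng.standard_normal((n, *x.shape))
```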
arXiv Detail & Related papers (2021-07-09T17:42:38Z) - Black-Box Certification with Randomized Smoothing: A Functional Optimization Based Framework [60.981406394238434]
We propose a general framework of adversarial certification with non-Gaussian noise and for more general types of attacks.
Our proposed methods achieve better certification results than previous works and provide a new perspective on randomized smoothing certification.
arXiv Detail & Related papers (2020-02-21T07:52:47Z)