On Polynomial Approximations for Privacy-Preserving and Verifiable ReLU
Networks
- URL: http://arxiv.org/abs/2011.05530v4
- Date: Tue, 6 Feb 2024 08:29:25 GMT
- Title: On Polynomial Approximations for Privacy-Preserving and Verifiable ReLU
Networks
- Authors: Ramy E. Ali, Jinhyun So, A. Salman Avestimehr
- Abstract summary: We propose a degree-2 polynomial activation function with a first-order term and empirically show that it can lead to much better models.
Our proposed function improves the test accuracy by up to 10.4% compared to the square function.
- Score: 6.130998208629276
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Outsourcing deep neural network (DNN) inference tasks to an untrusted cloud
raises data privacy and integrity concerns. While there are many techniques to
ensure privacy and integrity for polynomial-based computations, DNNs involve
non-polynomial computations. To address these challenges, several
privacy-preserving and verifiable inference techniques have been proposed based
on replacing the non-polynomial activation functions such as the rectified
linear unit (ReLU) function with polynomial activation functions. Such
techniques usually require polynomials with integer coefficients or polynomials
over finite fields. Motivated by such requirements, several works proposed
replacing the ReLU function with the square function. In this work, we
empirically show that the square function is not the best degree-2 polynomial
that can replace the ReLU function even when restricting the polynomials to
have integer coefficients. We instead propose a degree-2 polynomial activation
function with a first order term and empirically show that it can lead to much
better models. Our experiments on the CIFAR and Tiny ImageNet datasets on
various architectures such as VGG-16 show that our proposed function improves
the test accuracy by up to 10.4% compared to the square function.
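As a concrete illustration, a minimal Python sketch of the two activation functions under comparison follows. The coefficients shown are illustrative placeholders: the paper selects the polynomial empirically, and only requires integer coefficients, so these particular values are an assumption.

    import numpy as np

    def square_activation(x):
        # Baseline used in several privacy-preserving works: a(x) = x^2.
        return x ** 2

    def degree2_activation(x, a=1, b=1, c=0):
        # Degree-2 polynomial with a first-order term: a*x^2 + b*x + c.
        # Integer coefficients keep the function compatible with
        # integer-arithmetic and finite-field pipelines; the defaults
        # here are illustrative, not the paper's tuned choice.
        return a * x ** 2 + b * x + c

    x = np.linspace(-2, 2, 5)
    print(square_activation(x))
    print(degree2_activation(x))  # x^2 + x with the defaults above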
Related papers
- Polynomial-Augmented Neural Networks (PANNs) with Weak Orthogonality Constraints for Enhanced Function and PDE Approximation [22.689531776611084]
We present polynomial-augmented neural networks (PANNs), a novel machine learning architecture that combines deep neural networks (DNNs) with a polynomial approximant.
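A hedged sketch of the basic idea of pairing a network with a polynomial term; the combination rule and the monomial basis below are assumptions for illustration, and the paper's actual PANN architecture with its weak orthogonality constraints is more involved.

    import numpy as np

    def poly_features(x, degree=3):
        # Simple monomial basis 1, x, x^2, ..., x^degree (PANNs constrain
        # the polynomial part via weak orthogonality; omitted here).
        return np.stack([x ** k for k in range(degree + 1)], axis=-1)

    def pann_forward(x, dnn, poly_coeffs):
        # Output = DNN(x) + polynomial approximant evaluated at x.
        return dnn(x) + poly_features(x, len(poly_coeffs) - 1) @ poly_coeffs

    dnn = lambda x: np.tanh(x)              # stand-in for a trained DNN
    coeffs = np.array([0.0, 1.0, 0.5, 0.1]) # placeholder coefficients
    print(pann_forward(np.array([0.5, 1.0]), dnn, coeffs))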
arXiv Detail & Related papers (2024-06-04T14:06:15Z)
- Exploring the Potential of Polynomial Basis Functions in Kolmogorov-Arnold Networks: A Comparative Study of Different Groups of Polynomials [0.0]
This paper presents a survey of 18 distinct polynomial families and their potential applications in Kolmogorov-Arnold Network (KAN) models.
The study aims to investigate the suitability of these polynomials as basis functions in KAN models for complex tasks like handwritten digit classification.
The performance metrics of the KAN models, including overall accuracy, Kappa, and F1 score, are evaluated and compared.
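To make "polynomial basis functions in KANs" concrete, here is a hedged sketch of a single learnable KAN edge function as a Chebyshev expansion, one of the many families the survey compares; the coefficients are placeholders standing in for learned parameters.

    import numpy as np
    from numpy.polynomial import chebyshev

    def kan_edge(x, coeffs):
        # A KAN edge is a learnable univariate function; here it is
        # parameterized by Chebyshev coefficients c_0..c_d and evaluated
        # with NumPy's stable Clenshaw-style recurrence.
        return chebyshev.chebval(x, coeffs)

    coeffs = np.array([0.2, -0.5, 0.3])  # placeholder learnable parameters
    print(kan_edge(np.linspace(-1, 1, 5), coeffs))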
arXiv Detail & Related papers (2024-05-30T20:40:16Z)
- A Simple Solution for Homomorphic Evaluation on Large Intervals [0.0]
Homomorphic encryption (HE) is a promising technique used for privacy-preserving computation.
Homomorphic evaluation of approximations for non-polynomial functions plays an important role in privacy-preserving machine learning.
We introduce a simple solution for approximating arbitrary functions, one that may have been overlooked by researchers.
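A minimal sketch of the general recipe behind such work: fit a polynomial to a non-polynomial function over the interval of interest, so that an HE scheme only ever evaluates additions and multiplications. The target function (ReLU), interval, and degree below are illustrative choices, not the paper's.

    import numpy as np

    # Approximate a non-polynomial function over a large interval by a
    # least-squares polynomial; the polynomial can then be evaluated
    # homomorphically using only additions and multiplications.
    lo, hi, degree = -10.0, 10.0, 8
    xs = np.linspace(lo, hi, 1001)
    coeffs = np.polyfit(xs, np.maximum(xs, 0.0), degree)

    approx = np.polyval(coeffs, xs)
    print("max abs error:", np.abs(approx - np.maximum(xs, 0.0)).max())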
arXiv Detail & Related papers (2024-05-24T04:13:22Z)
- Neural Fields with Thermal Activations for Arbitrary-Scale Super-Resolution [56.089473862929886]
We present a novel way to design neural fields such that points can be queried with an adaptive Gaussian point spread function (PSF).
With its theoretically guaranteed anti-aliasing, our method sets a new state of the art for arbitrary-scale single image super-resolution.
arXiv Detail & Related papers (2023-11-29T14:01:28Z)
- Piecewise Polynomial Regression of Tame Functions via Integer Programming [2.2499166814992435]
We consider tame functions, a class of nonsmooth functions that includes neural networks with all common activations, value functions of mixed-integer programs, and wave functions of small molecules.
arXiv Detail & Related papers (2023-11-22T17:37:42Z)
- Regularization of polynomial networks for image recognition [78.4786845859205]
Polynomial Networks (PNs) have emerged as an alternative method with promising performance and improved interpretability.
We introduce a class of PNs that matches the performance of ResNet across a range of six benchmarks.
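A hedged sketch of the kind of layer PNs build on: a generic degree-2 multiplicative interaction block. The exact parameterization and the regularization schemes studied in the paper differ; this is only meant to show how multiplicative terms replace activation functions.

    import numpy as np

    def poly_block(x, W1, W2):
        # Degree-2 interaction: the elementwise (Hadamard) product of two
        # linear maps injects multiplicative terms without any activation
        # function; stacking such blocks raises the polynomial degree.
        return (W1 @ x) * (W2 @ x) + x

    rng = np.random.default_rng(0)
    x = rng.normal(size=4)
    W1, W2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
    print(poly_block(x, W1, W2))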
arXiv Detail & Related papers (2023-03-24T10:05:22Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
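To illustrate the containment claim above, a hedged sketch showing that a 1D finite-difference stencil is a special case of message passing with a fixed, rather than learned, message function; the periodic grid and stencil choice are assumptions for the example.

    import numpy as np

    def fd_as_message_passing(u, dx):
        # Each grid node receives messages from its two neighbors; summing
        # them with the stencil weights (1, -2, 1)/dx^2 recovers the
        # classical second-order finite-difference Laplacian. A neural
        # solver replaces this fixed rule with a learned message function.
        left, right = np.roll(u, 1), np.roll(u, -1)
        return (left - 2 * u + right) / dx ** 2

    u = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
    print(fd_as_message_passing(u, 2 * np.pi / 64)[:4])  # approximately -u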
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Sisyphus: A Cautionary Tale of Using Low-Degree Polynomial Activations in Privacy-Preserving Deep Learning [2.5677092608889773]
Privacy concerns in client-server machine learning have given rise to private inference (PI), where neural inference occurs directly on encrypted inputs.
We ask: Is it feasible to substitute all ReLUs with low-degree polynomial activation functions for building deep, privacy-friendly neural networks?
We analyze the challenges of substituting ReLUs with low-degree polynomials, starting with simple drop-and-replace solutions and progressing to novel, more involved replace-and-retrain strategies.
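A minimal sketch of the "drop-and-replace" baseline in PyTorch: swap every ReLU in a trained model for a low-degree polynomial activation without retraining. The quadratic choice below is illustrative, and the paper's point is precisely that this naive swap breaks down in deep networks.

    import torch.nn as nn

    class Square(nn.Module):
        # Low-degree polynomial stand-in for ReLU.
        def forward(self, x):
            return x * x

    def drop_and_replace(model):
        # Recursively swap every ReLU for the polynomial activation,
        # with no retraining (the "drop-and-replace" baseline).
        for name, child in model.named_children():
            if isinstance(child, nn.ReLU):
                setattr(model, name, Square())
            else:
                drop_and_replace(child)
        return model

    net = drop_and_replace(nn.Sequential(nn.Linear(8, 8), nn.ReLU()))
    print(net)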
arXiv Detail & Related papers (2021-07-26T17:33:56Z)
- On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of states.
arXiv Detail & Related papers (2020-11-09T18:32:22Z)
- Computational Barriers to Estimation from Low-Degree Polynomials [81.67886161671379]
We study the power of low-degree polynomials for the task of detecting the presence of hidden structures.
For a large class of "signal plus noise" problems, we give a user-friendly lower bound for the best possible mean squared error achievable by any low-degree polynomial.
As applications, we give a tight characterization of the low-degree minimum mean squared error for the planted submatrix and planted dense subgraph problems.
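In symbols, the quantity being bounded is the degree-$D$ minimum mean squared error; this formulation follows the standard low-degree framework, and the notation here is ours rather than quoted from the paper:

$$\mathrm{MMSE}_{\le D} \;=\; \inf_{\deg f \le D} \; \mathbb{E}\big[(x - f(Y))^2\big],$$

where the infimum is over polynomials $f$ of degree at most $D$ in the entries of the observation $Y$, and $x$ is the hidden signal.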
arXiv Detail & Related papers (2020-08-05T17:52:10Z)
- From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models [82.95892656532696]
Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
arXiv Detail & Related papers (2020-06-01T22:20:45Z)