Sigma-Delta and Distributed Noise-Shaping Quantization Methods for
Random Fourier Features
- URL: http://arxiv.org/abs/2106.02614v1
- Date: Fri, 4 Jun 2021 17:24:47 GMT
- Title: Sigma-Delta and Distributed Noise-Shaping Quantization Methods for
Random Fourier Features
- Authors: Jinjie Zhang, Alexander Cloninger, Rayan Saab
- Abstract summary: We prove that our quantized RFFs allow a high accuracy approximation of the underlying kernels.
We show that the quantized RFFs can be further compressed, yielding an excellent trade-off between memory use and accuracy.
We empirically show, on several machine learning tasks, that our method compares favorably to other state-of-the-art quantization methods in this context.
- Score: 73.25551965751603
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the use of low bit-depth Sigma-Delta and distributed noise-shaping
methods for quantizing the Random Fourier features (RFFs) associated with
shift-invariant kernels. We prove that our quantized RFFs -- even in the case
of $1$-bit quantization -- allow a high accuracy approximation of the
underlying kernels, and the approximation error decays at least polynomially
fast as the dimension of the RFFs increases. We also show that the quantized
RFFs can be further compressed, yielding an excellent trade-off between memory
use and accuracy. Namely, the approximation error now decays exponentially as a
function of the bits used. Moreover, by testing our methods on several machine
learning tasks, we empirically show that they compare favorably to other
state-of-the-art quantization methods in this context.
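As a rough sketch of the two ingredients, the snippet below builds standard Gaussian-kernel RFFs and quantizes them to one bit with a first-order Sigma-Delta loop. It is illustrative only: the paper's actual kernel reconstruction uses a dedicated condensation operator and also covers higher-order and distributed noise-shaping schemes, and all constants here are assumptions.

```python
import numpy as np

def rff(x, W, b):
    # Random Fourier features for the Gaussian kernel exp(-||x-y||^2 / 2)
    # (Rahimi & Recht): z(x) = sqrt(2/m) * cos(W x + b), so z(x).z(y) ~ k(x, y).
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.cos(W @ x + b)

def sigma_delta_1bit(z):
    # First-order Sigma-Delta quantizer: map each coordinate to {-c, +c},
    # feeding the accumulated error back into the next coordinate.
    c = np.sqrt(2.0 / z.shape[0])   # match the RFF amplitude, so |z_i| <= c
    q = np.empty_like(z)
    u = 0.0                         # error state; stays bounded by c
    for i in range(z.shape[0]):
        q[i] = c if z[i] + u >= 0 else -c
        u = u + z[i] - q[i]
    return q

rng = np.random.default_rng(0)
d, m = 5, 4096
W = rng.standard_normal((m, d))     # spectral samples for the Gaussian kernel
b = rng.uniform(0.0, 2.0 * np.pi, m)
x = rng.standard_normal(d) * 0.3
y = rng.standard_normal(d) * 0.3

zx, zy = rff(x, W, b), rff(y, W, b)
qx, qy = sigma_delta_1bit(zx), sigma_delta_1bit(zy)

true_k = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
print(true_k, zx @ zy, qx @ qy)
```

The feedback state `u` remains bounded by the quantizer step, which is what lets the noise-shaping analysis push the quantization error away from the components the kernel estimate depends on.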
Related papers
- Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
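One classical coupling in this family is orthogonal random features for the Gaussian kernel. The small simulation below is a generic baseline, not the paper's construction: it compares the variance of the kernel estimate under i.i.d. frequencies versus orthogonally coupled frequencies with chi-distributed norms.

```python
import numpy as np

def kernel_estimate(W, x, y):
    # Unbiased RFF estimate of the Gaussian kernel exp(-||x-y||^2 / 2):
    # (1/m) * sum_i cos(w_i . (x - y)), written via cos/sin feature maps.
    m = W.shape[0]
    zx = np.concatenate([np.cos(W @ x), np.sin(W @ x)]) / np.sqrt(m)
    zy = np.concatenate([np.cos(W @ y), np.sin(W @ y)]) / np.sqrt(m)
    return zx @ zy

rng = np.random.default_rng(1)
d = 16
x = rng.standard_normal(d)
x /= np.linalg.norm(x)
y = x + rng.standard_normal(d) * 0.25    # a nearby query point

iid_vals, orf_vals = [], []
for _ in range(4000):
    # i.i.d. Gaussian frequencies: the standard RFF sampler.
    iid_vals.append(kernel_estimate(rng.standard_normal((d, d)), x, y))
    # Coupled frequencies: orthogonalize the rows, then restore
    # chi-distributed row norms so each row stays marginally Gaussian.
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    norms = np.linalg.norm(rng.standard_normal((d, d)), axis=1)
    orf_vals.append(kernel_estimate(norms[:, None] * Q, x, y))

true_k = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
print(true_k, np.var(iid_vals), np.var(orf_vals))
```

Both estimators are unbiased; the coupling only changes the joint distribution of the frequencies, which is exactly the degree of freedom the paper studies.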
arXiv Detail & Related papers (2024-05-26T12:25:09Z)
- On Optimal Sampling for Learning SDF Using MLPs Equipped with Positional Encoding [79.67071790034609]
We devise a tool to determine the appropriate sampling rate for learning an accurate neural implicit field without undesirable side effects.
It is observed that a PE-equipped MLP has an intrinsic frequency much higher than the highest frequency component in the PE layer.
We empirically show that, in the setting of SDF fitting, this recommended sampling rate is sufficient to secure accurate fitting results.
arXiv Detail & Related papers (2024-01-02T10:51:52Z)
- Neural Fields with Thermal Activations for Arbitrary-Scale Super-Resolution [56.089473862929886]
We present a novel way to design neural fields such that points can be queried with an adaptive Gaussian PSF.
With its theoretically guaranteed anti-aliasing, our method sets a new state of the art for arbitrary-scale single image super-resolution.
arXiv Detail & Related papers (2023-11-29T14:01:28Z)
- Trigonometric Quadrature Fourier Features for Scalable Gaussian Process Regression [3.577968559443225]
Quadrature Fourier Features (QFF) have gained popularity in recent years due to their improved approximation accuracy and better calibrated uncertainty estimates.
A key limitation of QFF is that its performance can suffer from well-known pathologies related to highly oscillatory quadrature.
We address this critical issue via a new Trigonometric Quadrature Fourier Feature (TQFF) method, which uses a novel non-Gaussian quadrature rule.
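For intuition, a plain one-dimensional quadrature Fourier feature built from the Gauss-Hermite rule illustrates both the high accuracy and the oscillatory failure mode that TQFF targets. This is a generic QFF sketch, not the paper's trigonometric rule.

```python
import numpy as np

# Gauss-Hermite rule: integral of exp(-t^2) f(t) dt ~ sum_j w_j f(t_j).
# The change of variables w = sqrt(2) t turns it into an expectation over
# N(0, 1), so exp(-delta^2 / 2) = E[cos(w delta)] becomes a short
# deterministic sum -- the quadrature Fourier feature idea.
nodes, weights = np.polynomial.hermite.hermgauss(10)
freqs = np.sqrt(2.0) * nodes
coeffs = weights / np.sqrt(np.pi)

def qff_kernel(delta):
    # Deterministic quadrature estimate of the Gaussian kernel at lag delta.
    return float(coeffs @ np.cos(freqs * delta))

for delta in (0.7, 8.0):
    print(delta, np.exp(-delta ** 2 / 2), qff_kernel(delta))
```

With 10 nodes the rule is essentially exact at small lags, but at `delta = 8` the integrand `cos(w * delta)` oscillates faster than the rule can resolve and the estimate is badly wrong: exactly the highly oscillatory pathology mentioned above.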
arXiv Detail & Related papers (2023-10-23T03:53:09Z)
- Towards Accurate Post-training Quantization for Diffusion Models [73.19871905102545]
We propose an accurate data-free post-training quantization framework for diffusion models (ADP-DM) for efficient image generation.
Our method outperforms state-of-the-art post-training quantization of diffusion models by a sizable margin at similar computational cost.
arXiv Detail & Related papers (2023-05-30T04:00:35Z)
- Stochastic optimal transport in Banach Spaces for regularized estimation of multivariate quantiles [0.0]
We introduce a new algorithm for solving entropic optimal transport (EOT) between two absolutely continuous probability measures $\mu$ and $\nu$.
We study the almost sure convergence of our algorithm that takes its values in an infinite-dimensional Banach space.
arXiv Detail & Related papers (2023-02-02T10:02:01Z)
- Efficient Fourier single-pixel imaging with Gaussian random sampling [1.2355696607086075]
We propose a new sampling strategy for Fourier single-pixel imaging (FSI).
It allows FSI to reconstruct a clear and sharp image with a reduced number of measurements.
We experimentally demonstrate that compressive FSI combined with the proposed sampling strategy can reconstruct a sharp and clear 256-by-256-pixel image at a sampling ratio of 10%.
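A minimal way to see why frequency-weighted random sampling helps: sample 10% of the Fourier plane either uniformly or with a Gaussian preference for low frequencies, then reconstruct by zero-filling the unsampled coefficients. This is a far simpler recovery than the compressive reconstruction used in the paper, and all sizes and widths below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64
yy, xx = np.mgrid[0:n, 0:n]
img = np.exp(-((xx - 24) ** 2 + (yy - 40) ** 2) / 60.0)  # smooth test scene

F = np.fft.fft2(img)
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
radius2 = fx ** 2 + fy ** 2           # squared spatial frequency per coefficient

def reconstruct(mask):
    # Keep only the sampled Fourier coefficients, zero-fill the rest,
    # and invert -- the simplest (non-compressive) recovery.
    return np.real(np.fft.ifft2(F * mask))

k = int(0.10 * n * n)                 # 10% sampling ratio
# Gaussian random sampling: draw coefficients with low-frequency preference.
p = np.exp(-radius2 / (2 * 0.08 ** 2)).ravel()
p /= p.sum()
idx_g = rng.choice(n * n, size=k, replace=False, p=p)
idx_u = rng.choice(n * n, size=k, replace=False)  # uniform baseline

mask_g = np.zeros(n * n)
mask_g[idx_g] = 1.0
mask_u = np.zeros(n * n)
mask_u[idx_u] = 1.0
err_g = np.mean((reconstruct(mask_g.reshape(n, n)) - img) ** 2)
err_u = np.mean((reconstruct(mask_u.reshape(n, n)) - img) ** 2)
print(err_g, err_u)
```

For a smooth scene, whose spectral energy is concentrated at low frequencies, the Gaussian-weighted mask captures far more signal energy than the uniform one at the same measurement budget.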
arXiv Detail & Related papers (2021-06-29T01:23:33Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Variational Bayesian Quantization [31.999462074510305]
We propose a novel algorithm for quantizing continuous latent representations in trained models.
Unlike current end-to-end neural compression methods that tailor the model to a fixed quantization scheme, our algorithm separates model design and training from quantization.
Our algorithm can be seen as a novel extension of arithmetic coding to the continuous domain.
arXiv Detail & Related papers (2020-02-18T00:15:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and accepts no responsibility for any consequences of its use.