Variable Skipping for Autoregressive Range Density Estimation
- URL: http://arxiv.org/abs/2007.05572v1
- Date: Fri, 10 Jul 2020 19:01:40 GMT
- Title: Variable Skipping for Autoregressive Range Density Estimation
- Authors: Eric Liang, Zongheng Yang, Ion Stoica, Pieter Abbeel, Yan Duan, Xi Chen
- Abstract summary: We show a technique, variable skipping, for accelerating range density estimation over deep autoregressive models.
We show that variable skipping provides 10-100$\times$ efficiency improvements when targeting challenging high-quantile error metrics.
- Score: 84.60428050170687
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep autoregressive models compute point likelihood estimates of individual
data points. However, many applications (e.g., database cardinality estimation)
require estimating range densities, a capability that is under-explored by
current neural density estimation literature. In these applications, fast and
accurate range density estimates over high-dimensional data directly impact
user-perceived performance. In this paper, we explore a technique, variable
skipping, for accelerating range density estimation over deep autoregressive
models. This technique exploits the sparse structure of range density queries
to avoid sampling unnecessary variables during approximate inference. We show
that variable skipping provides 10-100$\times$ efficiency improvements when
targeting challenging high-quantile error metrics, enables complex applications
such as text pattern matching, and can be realized via a simple data
augmentation procedure without changing the usual maximum likelihood objective.
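A minimal sketch of the two ideas described in the abstract: a training-time augmentation that randomly masks input variables (leaving the usual maximum-likelihood objective unchanged), and progressive sampling that skips unconstrained variables at inference. The `MASK` sentinel, function names, and dictionary-based model interface below are illustrative assumptions, not the paper's actual API.

```python
import random

MASK = -1  # hypothetical sentinel standing in for a learned MASK embedding


def augment_with_skips(row, p_mask=0.5, rng=random):
    """Training-time augmentation: each input variable is independently
    replaced by MASK with probability p_mask.  Targets are untouched, so
    the standard maximum-likelihood objective is unchanged -- the model
    simply learns P(x_i | observed subset) for random observed subsets."""
    return [MASK if rng.random() < p_mask else v for v in row]


def estimate_range_density(model, constraints, n_cols, n_samples=100, rng=random):
    """Progressive sampling with variable skipping: columns that carry no
    range constraint are fed MASK instead of being sampled, so only the
    constrained columns cost a sampling step.  `model(prefix, i)` is an
    assumed interface returning a dict {value: P(x_i = value | prefix)}."""
    total = 0.0
    for _ in range(n_samples):
        prefix, weight = [], 1.0
        for i in range(n_cols):
            if i not in constraints:
                prefix.append(MASK)  # unconstrained variable: skip sampling
                continue
            probs = model(prefix, i)
            lo, hi = constraints[i]
            in_range = {v: p for v, p in probs.items() if lo <= v <= hi}
            mass = sum(in_range.values())
            if mass == 0.0:
                weight = 0.0
                break
            weight *= mass  # accumulate probability mass inside the range
            # sample the next value from the renormalized in-range distribution
            r, acc = rng.random() * mass, 0.0
            for v, p in in_range.items():
                acc += p
                if r <= acc:
                    prefix.append(v)
                    break
        total += weight
    return total / n_samples
```

With a toy uniform conditional over four values and a single constraint `0 <= x_0 <= 1`, every sample contributes weight 0.5, so the estimate is exactly 0.5; in a real deployment `model` would be a deep autoregressive network trained on the masked batches.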
Related papers
- Conformalized High-Density Quantile Regression via Dynamic Prototypes-based Probability Density Estimation [2.526146573337397]
We introduce a conformalized high-density quantile regression approach with a dynamically adaptive set of prototypes.
Our method optimizes the set of prototypes by adaptively adding, deleting, and relocating quantization bins.
Experiments across diverse datasets and dimensionalities confirm that our method consistently achieves high-quality prediction regions.
arXiv Detail & Related papers (2024-11-02T14:36:12Z)
- Density-Regression: Efficient and Distance-Aware Deep Regressor for Uncertainty Estimation under Distribution Shifts [11.048463491646993]
Density-Regression is a method that leverages the density function in uncertainty estimation and achieves fast inference by a single forward pass.
We show that Density-Regression has competitive uncertainty estimation performance under distribution shifts with modern deep regressors.
arXiv Detail & Related papers (2024-03-07T23:20:34Z)
- Smooth densities and generative modeling with unsupervised random forests [1.433758865948252]
An important application for density estimators is synthetic data generation.
We propose a new method based on unsupervised random forests for estimating smooth densities in arbitrary dimensions without parametric constraints.
We prove the consistency of our approach and demonstrate its advantages over existing tree-based density estimators.
arXiv Detail & Related papers (2022-05-19T09:50:25Z)
- X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim at improving data efficiency for both classification and regression setups in deep learning.
To take the power of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning [78.83598532168256]
Marginal-likelihood based model-selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- Evaluating Prediction-Time Batch Normalization for Robustness under Covariate Shift [81.74795324629712]
We study prediction-time batch normalization, which significantly improves model accuracy and calibration under covariate shift.
We show that prediction-time batch normalization provides complementary benefits to existing state-of-the-art approaches for improving robustness.
The method has mixed results when used alongside pre-training, and does not seem to perform as well under more natural types of dataset shift.
arXiv Detail & Related papers (2020-06-19T05:08:43Z)
- Anomaly Detection in Trajectory Data with Normalizing Flows [0.0]
We propose an approach based on normalizing flows that enables complex density estimation from data with neural networks.
Our proposal computes exact model likelihood values, an important feature of normalizing flows, for each segment of the trajectory.
We evaluate our methodology, named aggregated anomaly detection with normalizing flows (GRADINGS), using real world trajectory data and compare it with more traditional anomaly detection techniques.
arXiv Detail & Related papers (2020-04-13T14:16:40Z)
- TraDE: Transformers for Density Estimation [101.20137732920718]
TraDE is a self-attention-based architecture for auto-regressive density estimation.
We present a suite of tasks such as regression using generated samples, out-of-distribution detection, and robustness to noise in the training data.
arXiv Detail & Related papers (2020-04-06T07:32:51Z)
- SUMO: Unbiased Estimation of Log Marginal Probability for Latent Variable Models [80.22609163316459]
We introduce an unbiased estimator of the log marginal likelihood and its gradients for latent variable models based on randomized truncation of infinite series.
We show that models trained using our estimator give better test-set likelihoods than a standard importance-sampling based approach for the same average computational cost.
arXiv Detail & Related papers (2020-04-01T11:49:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.