Distributional Gradient Boosting Machines
- URL: http://arxiv.org/abs/2204.00778v1
- Date: Sat, 2 Apr 2022 06:32:19 GMT
- Title: Distributional Gradient Boosting Machines
- Authors: Alexander März, Thomas Kneib
- Abstract summary: Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
- Score: 77.34726150561087
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a unified probabilistic gradient boosting framework for regression
tasks that models and predicts the entire conditional distribution of a
univariate response variable as a function of covariates. Our likelihood-based
approach allows us to either model all conditional moments of a parametric
distribution, or to approximate the conditional cumulative distribution
function via Normalizing Flows. As its underlying computational backbones, our
framework builds on XGBoost and LightGBM. Modelling and predicting the entire
conditional distribution greatly enhances existing tree-based gradient boosting
implementations, as it allows the creation of probabilistic forecasts from which
prediction intervals and quantiles of interest can be derived. Empirical
results show that our framework achieves state-of-the-art forecast accuracy.
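The core idea of the abstract — boosting additive tree ensembles against a distributional likelihood rather than a point-prediction loss, then reading intervals off the fitted distribution — can be illustrated with a toy sketch. The following is a minimal, hypothetical stand-in (plain sklearn trees, a heteroscedastic Gaussian, made-up data), not the paper's XGBoost/LightGBM implementation:

```python
# Minimal, illustrative sketch of distributional gradient boosting for a
# heteroscedastic Gaussian response: two additive tree ensembles model the
# conditional mean mu(x) and log-scale s(x) = log sigma(x) by descending the
# negative log-likelihood. Toy stand-in, NOT the paper's implementation.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + rng.normal(size=2000) * (0.1 + 0.2 * (X[:, 0] > 0))

mu = np.full(len(y), y.mean())   # initial conditional mean
s = np.zeros(len(y))             # initial log-scale, sigma = exp(s)
mu_trees, s_trees, lr = [], [], 0.1

for _ in range(100):
    resid = y - mu
    # Mean update: the Newton step for the Gaussian mean is the raw residual
    t_mu = DecisionTreeRegressor(max_depth=2).fit(X, resid)
    # Scale update: negative gradient of the NLL w.r.t. log-sigma
    t_s = DecisionTreeRegressor(max_depth=2).fit(X, resid**2 / np.exp(2 * s) - 1.0)
    mu += lr * t_mu.predict(X)
    s += lr * t_s.predict(X)
    mu_trees.append(t_mu)
    s_trees.append(t_s)

def predict_dist(Xq, y_init=y.mean()):
    """Predicted conditional mean and standard deviation at query points."""
    m = y_init + lr * sum(t.predict(Xq) for t in mu_trees)
    sd = np.exp(lr * sum(t.predict(Xq) for t in s_trees))
    return m, sd

m, sd = predict_dist(X)
lo, hi = m - 1.645 * sd, m + 1.645 * sd       # 90% Gaussian interval
coverage = float(np.mean((y >= lo) & (y <= hi)))
```

Prediction intervals and quantiles follow directly from the fitted distribution, which is the property the abstract highlights; in the actual framework the per-parameter ensembles are fitted by XGBoost or LightGBM rather than bare decision trees.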
Related papers
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
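The classical split conformal procedure that this line of work extends can be sketched in a few lines. The code below is a generic illustration on hypothetical data, not the paper's method:

```python
# Minimal sketch of the standard split conformal baseline for regression
# prediction intervals -- illustrative only, not the paper's proposed method.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
beta = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(1200, 3))
y = X @ beta + rng.normal(scale=0.5, size=1200)

X_tr, y_tr = X[:500], y[:500]             # proper training set
X_cal, y_cal = X[500:1000], y[500:1000]   # calibration set
X_te, y_te = X[1000:], y[1000:]           # fresh test points

model = LinearRegression().fit(X_tr, y_tr)

# Conformity scores on the calibration set: absolute residuals
alpha = 0.1
scores = np.abs(y_cal - model.predict(X_cal))
n = len(scores)
# Finite-sample-corrected (1 - alpha) quantile of the scores
q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n), method="higher")

# Prediction interval with marginal coverage guarantee: [yhat - q, yhat + q]
yhat = model.predict(X_te)
covered = float(np.mean((y_te >= yhat - q) & (y_te <= yhat + q)))
```

This baseline guarantees only marginal coverage; the cited paper's contribution is to approximate conditional coverage by bringing in an estimate of the conditional distribution.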
arXiv Detail & Related papers (2024-07-01T20:44:48Z) - Probabilistic Forecasting with Stochastic Interpolants and Föllmer Processes [18.344934424278048]
We propose a framework for probabilistic forecasting of dynamical systems based on generative modeling.
We show that the drift and the diffusion coefficients of this SDE can be adjusted after training, and that a specific choice that minimizes the impact of the estimation error gives a Föllmer process.
arXiv Detail & Related papers (2024-03-20T16:33:06Z) - When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z) - Predicting conditional probability distributions of redshifts of Active
Galactic Nuclei using Hierarchical Correlation Reconstruction [0.8702432681310399]
This article applies the Hierarchical Correlation Reconstruction (HCR) approach to inexpensively predict conditional probability distributions.
We obtain interpretable models, with coefficients describing the contributions of features to conditional moments.
The article extends the original approach, in particular by using Canonical Correlation Analysis (CCA) for feature optimization and L1 ("lasso") regularization.
arXiv Detail & Related papers (2022-06-13T14:28:53Z) - Transforming Autoregression: Interpretable and Expressive Time Series
Forecast [0.0]
We propose Autoregressive Transformation Models (ATMs), a model class inspired by various research directions.
ATMs unite expressive distributional forecasts using a semi-parametric distribution assumption with an interpretable model specification.
We demonstrate the properties of ATMs both theoretically and through empirical evaluation on several simulated and real-world forecasting datasets.
arXiv Detail & Related papers (2021-10-15T17:58:49Z) - Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z) - Multivariate Boosted Trees and Applications to Forecasting and Control [0.0]
Gradient boosted trees are non-parametric regressors that exploit sequential model fitting and gradient descent to minimize a specific loss function.
In this paper, we present a computationally efficient algorithm for fitting multivariate boosted trees.
arXiv Detail & Related papers (2020-03-08T19:26:59Z) - CatBoostLSS -- An extension of CatBoost to probabilistic forecasting [91.3755431537592]
We propose a new framework that predicts the entire conditional distribution of a univariate response variable.
CatBoostLSS models all moments of a parametric distribution instead of the conditional mean only.
We present both a simulation study and real-world examples that demonstrate the benefits of our approach.
arXiv Detail & Related papers (2020-01-04T15:42:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.