Bayesian Additive Regression Trees with Model Trees
- URL: http://arxiv.org/abs/2006.07493v5
- Date: Wed, 10 Mar 2021 16:20:03 GMT
- Title: Bayesian Additive Regression Trees with Model Trees
- Authors: Estevão B. Prado, Rafael A. Moral and Andrew C. Parnell
- Abstract summary: We introduce an extension of BART, called Model Trees BART (MOTR-BART)
MOTR-BART considers piecewise linear functions at node levels instead of piecewise constants.
In our approach, local linearities are captured more efficiently and fewer trees are required to achieve equal or better performance than BART.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian Additive Regression Trees (BART) is a tree-based machine learning
method that has been successfully applied to regression and classification
problems. BART assumes regularisation priors on a set of trees that work as
weak learners and is very flexible for predicting in the presence of
non-linearity and high-order interactions. In this paper, we introduce an
extension of BART, called Model Trees BART (MOTR-BART), that considers
piecewise linear functions at node levels instead of piecewise constants. In
MOTR-BART, rather than having a unique value at node level for the prediction,
a linear predictor is estimated considering the covariates that have been used
as the split variables in the corresponding tree. In our approach, local
linearities are captured more efficiently and fewer trees are required to
achieve equal or better performance than BART. Via simulation studies and real
data applications, we compare MOTR-BART to its main competitors. R code for
MOTR-BART implementation is available at https://github.com/ebprado/MOTR-BART.
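The abstract's key mechanism — replacing the constant leaf value with a linear predictor over the covariates used as split variables in that tree — can be illustrated with a minimal numpy sketch. This is not the paper's R implementation; the function name, the ridge penalty standing in for the Gaussian prior on coefficients, and the hyperparameter value are all illustrative assumptions.

```python
import numpy as np

def motr_leaf_predict(X_leaf, y_leaf, split_vars, X_new):
    """Fit a regularised linear predictor inside one terminal node, using
    only the covariates that appeared as split variables on the path to
    that node (plus an intercept), then predict for new rows.
    Hypothetical sketch of the MOTR-BART leaf model, not the actual code."""
    # Design matrices restricted to the split variables, with intercept.
    A = np.column_stack([np.ones(len(X_leaf)), X_leaf[:, split_vars]])
    B = np.column_stack([np.ones(len(X_new)), X_new[:, split_vars]])
    # Small ridge penalty stands in for the Gaussian prior on coefficients.
    lam = 1e-3
    beta = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y_leaf)
    return B @ beta

# Toy check: leaf data that is exactly linear in covariate 0.
rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 3))
y = 2.0 + 3.0 * X[:, 0]
pred = motr_leaf_predict(X, y, split_vars=[0], X_new=X)
```

Because the leaf fits a line rather than a constant, a single node can capture a local linear trend that standard BART would need several constant-leaf splits to approximate.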
Related papers
- On the Gaussian process limit of Bayesian Additive Regression Trees [0.0]
Bayesian Additive Regression Trees (BART) is a nonparametric Bayesian regression technique of rising fame.
In the limit of infinite trees, it becomes equivalent to Gaussian process (GP) regression.
This study opens new ways to understand and develop BART and GP regression.
arXiv Detail & Related papers (2024-10-26T23:18:33Z) - ASBART:Accelerated Soft Bayes Additive Regression Trees [8.476756500467689]
Soft BART improves both practically and theoretically on existing Bayesian sum-of-trees models.
Compared to BART, it takes roughly 20 times longer to complete the computation with the default settings.
We propose a variant of Soft BART named Accelerated Soft BART (ASBART).
arXiv Detail & Related papers (2023-10-21T11:27:42Z) - flexBART: Flexible Bayesian regression trees with categorical predictors [0.6577148087211809]
Most implementations of Bayesian additive regression trees (BART) one-hot encode categorical predictors, replacing each one with several binary indicators.
We re-implement BART with regression trees that can assign multiple levels to both branches of a decision tree node.
Our re-implementation, which is available in the flexBART package, often yields improved out-of-sample predictive performance and scales better to larger datasets.
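The decision rule flexBART describes — assigning whole subsets of categorical levels to each branch rather than one-hot encoding every level — can be sketched in a few lines. The function and level names below are illustrative, not from the flexBART package.

```python
# A decision rule that routes a *subset* of categorical levels left,
# instead of one-hot encoding each level as its own binary column.
def route(level, left_levels):
    """Return 'L' if the observation's level belongs to the left subset."""
    return 'L' if level in left_levels else 'R'

levels = {"NY", "CA", "TX", "FL"}
left = {"NY", "CA"}  # multiple levels assigned to the left branch at once
print([route(s, left) for s in ["NY", "TX", "CA", "FL"]])
# → ['L', 'R', 'L', 'R']
```

With one-hot encoding, grouping NY and CA together would require a chain of binary splits; a subset rule does it in a single node.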
arXiv Detail & Related papers (2022-11-08T18:52:37Z) - Lookback for Learning to Branch [77.32867454769936]
Bipartite Graph Neural Networks (GNNs) have been shown to be an important component of deep learning based Mixed-Integer Linear Program (MILP) solvers.
Recent works have demonstrated the effectiveness of such GNNs in replacing the branching (variable selection) in branch-and-bound (B&B) solvers.
arXiv Detail & Related papers (2022-06-30T02:33:32Z) - Social Interpretable Tree for Pedestrian Trajectory Prediction [75.81745697967608]
We propose a tree-based method, termed as Social Interpretable Tree (SIT), to address this multi-modal prediction task.
A path in the tree from the root to leaf represents an individual possible future trajectory.
Despite relying on a hand-crafted tree, the experimental results on the ETH-UCY and Stanford Drone datasets demonstrate that our method is capable of matching or exceeding the performance of state-of-the-art methods.
arXiv Detail & Related papers (2022-05-26T12:18:44Z) - GP-BART: a novel Bayesian additive regression trees approach using Gaussian processes [1.03590082373586]
The GP-BART model is an extension of BART which addresses the limitation by assuming GP priors for the predictions of each terminal node among all trees.
The model's effectiveness is demonstrated through applications to simulated and real-world data, surpassing the performance of traditional modeling approaches in various scenarios.
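GP-BART's core idea — a GP prior on each terminal node's predictions instead of a constant — can be illustrated with a numpy-only GP posterior mean fit to the observations in one leaf. The kernel choice, function name, and hyperparameters below are assumptions for illustration, not the paper's actual settings.

```python
import numpy as np

def gp_leaf_mean(X_leaf, y_leaf, X_new, ell=0.5, sigma2=0.1):
    """Posterior mean of a zero-mean GP with an RBF kernel, fit to the
    observations falling in one terminal node. Hypothetical sketch with
    illustrative hyperparameters ell (length-scale) and sigma2 (noise)."""
    def k(A, B):
        # Squared Euclidean distances between all row pairs of A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)
    K = k(X_leaf, X_leaf) + sigma2 * np.eye(len(X_leaf))
    return k(X_new, X_leaf) @ np.linalg.solve(K, y_leaf)

# Toy check: a smooth signal is recovered closely within the leaf.
rng = np.random.default_rng(1)
X = rng.uniform(size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0])
pred = gp_leaf_mean(X, y, X)
```

A GP leaf produces smooth within-node predictions, whereas a constant leaf (standard BART) yields a single value per node.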
arXiv Detail & Related papers (2022-04-05T11:18:44Z) - Accounting for shared covariates in semi-parametric Bayesian additive regression trees [0.0]
We propose some extensions to semi-parametric models based on Bayesian additive regression trees (BART)
The main novelty in our approach lies in the way we change the tree-generation moves in BART to deal with this bias.
We show competitive performance when compared to regression models, alternative formulations of semi-parametric BART, and other tree-based methods.
arXiv Detail & Related papers (2021-08-17T13:58:44Z) - Momentum Pseudo-Labeling for Semi-Supervised Speech Recognition [55.362258027878966]
We present momentum pseudo-labeling (MPL) as a simple yet effective strategy for semi-supervised speech recognition.
MPL consists of a pair of online and offline models that interact and learn from each other, inspired by the mean teacher method.
The experimental results demonstrate that MPL effectively improves over the base model and is scalable to different semi-supervised scenarios.
arXiv Detail & Related papers (2021-06-16T16:24:55Z) - Improved Branch and Bound for Neural Network Verification via Lagrangian Decomposition [161.09660864941603]
We improve the scalability of Branch and Bound (BaB) algorithms for formally proving input-output properties of neural networks.
We present a novel activation-based branching strategy and a BaB framework, named Branch and Dual Network Bound (BaDNB)
BaDNB outperforms previous complete verification systems by a large margin, cutting average verification times by factors up to 50 on adversarial properties.
arXiv Detail & Related papers (2021-04-14T09:22:42Z) - Probabilistic Case-based Reasoning for Open-World Knowledge Graph Completion [59.549664231655726]
A case-based reasoning (CBR) system solves a new problem by retrieving cases that are similar to the given problem.
In this paper, we demonstrate that such a system is achievable for reasoning in knowledge-bases (KBs)
Our approach predicts attributes for an entity by gathering reasoning paths from similar entities in the KB.
arXiv Detail & Related papers (2020-10-07T17:48:12Z) - Forest R-CNN: Large-Vocabulary Long-Tailed Object Detection and Instance Segmentation [75.93960390191262]
We exploit prior knowledge of the relations among object categories to cluster fine-grained classes into coarser parent classes.
We propose a simple yet effective resampling method, NMS Resampling, to re-balance the data distribution.
Our method, termed as Forest R-CNN, can serve as a plug-and-play module being applied to most object recognition models.
arXiv Detail & Related papers (2020-08-13T03:52:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.