Divide, Conquer, Combine Bayesian Decision Tree Sampling
- URL: http://arxiv.org/abs/2403.18147v1
- Date: Tue, 26 Mar 2024 23:14:15 GMT
- Title: Divide, Conquer, Combine Bayesian Decision Tree Sampling
- Authors: Jodie A. Cochrane, Adrian Wills, Sarah J. Johnson
- Abstract summary: Decision trees are commonly used predictive models due to their flexibility and interpretability.
This paper is directed at quantifying the uncertainty of decision tree predictions by employing a Bayesian inference approach.
- Score: 1.1879716317856945
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Decision trees are commonly used predictive models due to their flexibility and interpretability. This paper is directed at quantifying the uncertainty of decision tree predictions by employing a Bayesian inference approach. This is challenging because these approaches need to explore both the tree structure space and the space of decision parameters associated with each tree structure. This has been handled by using Markov Chain Monte Carlo (MCMC) methods, where a Markov Chain is constructed to provide samples from the desired Bayesian estimate. Importantly, the structure and the decision parameters are tightly coupled; small changes in the tree structure can demand vastly different decision parameters to provide accurate predictions. A challenge for existing MCMC approaches is proposing joint changes in both the tree structure and the decision parameters that result in efficient sampling. This paper takes a different approach, where each distinct tree structure is associated with a unique set of decision parameters. The proposed approach, entitled DCC-Tree, is inspired by the work in Zhou et al. [23] for probabilistic programs and Cochrane et al. [4] for Hamiltonian Monte Carlo (HMC) based sampling for decision trees. Results show that DCC-Tree performs comparably to other HMC-based methods and better than existing Bayesian tree methods while improving on consistency and reducing the per-proposal complexity.
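For intuition, below is a minimal, self-contained sketch of the divide-conquer-combine idea described in the abstract; it is an illustration, not the paper's implementation. Two toy "tree structures" (a constant model and a depth-1 stump) each keep their own parameter set, a random-walk Metropolis sampler stands in for the paper's HMC local sampler, and a naive prior-sampling importance estimate stands in for its per-structure marginal-likelihood estimator. The toy data, function names, and estimators are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: a step function plus noise.
x = rng.uniform(0.0, 1.0, 100)
y = np.where(x < 0.4, 1.0, -1.0) + 0.1 * rng.normal(size=100)

def log_lik(depth, params, sigma=0.1):
    """Log likelihood for two toy 'structures': depth 0 is a constant model
    with one mean; depth 1 is a stump with threshold t and two leaf means."""
    if depth == 0:
        resid = y - params[0]
    else:
        t, m1, m2 = params
        if not (0.0 < t < 1.0):  # threshold must lie inside the data range
            return -np.inf
        resid = y - np.where(x < t, m1, m2)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

def log_joint(depth, params):
    # Standard-normal prior on all decision parameters.
    return log_lik(depth, params) - 0.5 * np.sum(np.asarray(params) ** 2)

def local_sampler(depth, dim, n_steps=2000, step=0.05):
    """'Conquer': sample this structure's own decision parameters.
    DCC-Tree uses HMC here; random-walk Metropolis keeps the sketch
    dependency-free."""
    theta = rng.normal(size=dim) * 0.1 + 0.5
    lp = log_joint(depth, theta)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=dim)
        lp_prop = log_joint(depth, prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

def log_marginal(depth, dim, n=20000):
    """'Combine' weight: naive importance-sampling estimate of the structure's
    marginal likelihood, drawing parameters from the prior (high variance;
    the paper uses a more careful per-structure estimator)."""
    lls = np.array([log_lik(depth, rng.normal(size=dim)) for _ in range(n)])
    return np.logaddexp.reduce(lls) - np.log(n)

# 'Divide': each distinct tree structure keeps its own parameter set.
structures = {0: 1, 1: 3}  # structure id -> parameter dimension
draws = {d: local_sampler(d, dim) for d, dim in structures.items()}
log_z = {d: log_marginal(d, dim) for d, dim in structures.items()}

# Posterior mass over structures, from the marginal-likelihood estimates.
z = np.array(list(log_z.values()))
weights = np.exp(z - z.max())
weights /= weights.sum()
print({d: round(w, 3) for d, w in zip(log_z, weights)})

# Posterior-mean parameters of the favoured structure (second half = burn-in discard).
best = max(log_z, key=log_z.get)
print("structure", best, "posterior-mean params:",
      np.round(draws[best][len(draws[best]) // 2:].mean(axis=0), 2))
```

On this toy data the stump structure should receive essentially all of the posterior mass, mirroring how DCC-Tree combines the per-structure local samplers through their estimated evidence rather than proposing joint structure-and-parameter moves.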
Related papers
- Learning accurate and interpretable decision trees [27.203303726977616]
We develop approaches to design decision tree learning algorithms given repeated access to data from the same domain.
We study the sample complexity of tuning prior parameters in Bayesian decision tree learning, and extend our results to decision tree regression.
We also study the interpretability of the learned decision trees and introduce a data-driven approach for optimizing the explainability versus accuracy trade-off using decision trees.
arXiv Detail & Related papers (2024-05-24T20:10:10Z)
- An Algorithmic Framework for Constructing Multiple Decision Trees by Evaluating Their Combination Performance Throughout the Construction Process [1.8749716317856945]
Predictions using a combination of decision trees are known to be effective in machine learning.
We propose a new algorithmic framework that constructs decision trees simultaneously and evaluates their combination performance.
arXiv Detail & Related papers (2024-02-09T14:58:07Z)
- Boosting-Based Sequential Meta-Tree Ensemble Construction for Improved Decision Trees [1.8749305679160366]
A decision tree is one of the most popular approaches in machine learning.
The meta-tree was recently proposed to address the overfitting caused by overly deep trees.
The meta-tree guarantees statistical optimality based on Bayes decision theory.
arXiv Detail & Related papers (2024-02-09T13:08:21Z)
- Learning a Decision Tree Algorithm with Transformers [75.96920867382859]
We introduce MetaTree, a transformer-based model trained via meta-learning to directly produce strong decision trees.
We fit both greedy decision trees and globally optimized decision trees on a large number of datasets, and train MetaTree to produce only the trees that achieve strong generalization performance.
arXiv Detail & Related papers (2024-02-06T07:40:53Z)
- RJHMC-Tree for Exploration of the Bayesian Decision Tree Posterior [1.3351610617039973]
This paper is directed towards learning decision trees from data using a Bayesian approach.
It investigates using a Hamiltonian Monte Carlo (HMC) approach to explore the posterior of Bayesian decision trees more efficiently.
arXiv Detail & Related papers (2023-12-04T02:23:32Z)
- Bayesian Decision Trees Inspired from Evolutionary Algorithms [64.80360020499555]
We propose replacing Markov Chain Monte Carlo (MCMC) with an inherently parallel algorithm, Sequential Monte Carlo (SMC).
Experiments show that SMC combined with Evolutionary Algorithms (EA) can produce more accurate results than MCMC in 100 times fewer iterations.
arXiv Detail & Related papers (2023-05-30T06:17:35Z)
- Bayesian Decision Trees via Tractable Priors and Probabilistic Context-Free Grammars [7.259767735431625]
We propose a new criterion for training Bayesian Decision Trees.
BCART-PCFG can efficiently sample decision trees from a posterior distribution across trees given the data.
We find that trees sampled via BCART-PCFG perform comparably to or better than greedily constructed decision trees.
arXiv Detail & Related papers (2023-02-15T00:17:41Z)
- Contextual Decision Trees [62.997667081978825]
We propose a multi-armed contextual bandit recommendation framework for feature-based selection of a single shallow tree of the learned ensemble.
The trained system, which works on top of the Random Forest, dynamically identifies a base predictor that is responsible for providing the final output.
arXiv Detail & Related papers (2022-07-13T17:05:08Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Convex Polytope Trees [57.56078843831244]
Convex polytope trees (CPT) are proposed to expand the family of decision trees with an interpretable generalization of their decision boundary.
We develop a greedy method to efficiently construct CPT and scalable end-to-end training algorithms for the tree parameters when the tree structure is given.
arXiv Detail & Related papers (2020-10-21T19:38:57Z)
- Generalized and Scalable Optimal Sparse Decision Trees [56.35541305670828]
We present techniques that produce optimal decision trees over a variety of objectives.
We also introduce a scalable algorithm that produces provably optimal results in the presence of continuous variables.
arXiv Detail & Related papers (2020-06-15T19:00:11Z)