Boosting-Based Sequential Meta-Tree Ensemble Construction for Improved
Decision Trees
- URL: http://arxiv.org/abs/2402.06386v1
- Date: Fri, 9 Feb 2024 13:08:21 GMT
- Title: Boosting-Based Sequential Meta-Tree Ensemble Construction for Improved
Decision Trees
- Authors: Ryota Maniwa, Naoki Ichijo, Yuta Nakahara, and Toshiyasu Matsushima
- Abstract summary: A decision tree is one of the most popular approaches in machine learning.
The meta-tree was recently proposed to solve the problem of overfitting caused by overly deepened trees.
It guarantees statistical optimality based on Bayes decision theory.
- Score: 1.8749305679160366
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A decision tree is one of the most popular approaches in machine
learning. However, it suffers from overfitting caused by overly deepened
trees. The meta-tree was recently proposed to solve this problem; moreover,
it guarantees statistical optimality based on Bayes decision theory.
Therefore, the meta-tree is expected to perform better than the decision
tree. In contrast to a single decision tree, ensembles of decision trees,
which are typically constructed by boosting algorithms, are known to be more
effective in improving predictive performance. Thus, ensembles of meta-trees
are expected to improve predictive performance more than a single meta-tree,
yet no previous study has constructed multiple meta-trees by boosting.
Therefore, in this study, we propose a method to construct multiple
meta-trees using a boosting approach. Through experiments with synthetic and
benchmark datasets, we compare the performance of the proposed methods with
that of conventional methods using ensembles of decision trees. Furthermore,
while ensembles of decision trees can overfit just as a single decision tree
can, our experiments confirm that ensembles of meta-trees prevent
overfitting due to tree depth.
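The sequential boosting scheme the abstract refers to can be illustrated with a generic AdaBoost-style loop. The sketch below is not the paper's method: ordinary depth-1 decision stumps stand in for meta-trees, and the 1-D dataset is invented for illustration; only the reweight-and-combine mechanism, in which each weak learner is fit to reweighted data and added to a weighted ensemble, is the point.

```python
# Minimal AdaBoost.M1 sketch on 1-D data with labels in {-1, +1}.
# Illustrative only: the paper boosts *meta-trees*; here ordinary
# decision stumps stand in for the weak learners.
import math

def fit_stump(xs, ys, w):
    """Return (threshold, polarity, error) minimizing weighted 0-1 error."""
    best = (None, 1, float("inf"))
    for t in sorted(set(xs)):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if (pol if xi >= t else -pol) != yi)
            if err < best[2]:
                best = (t, pol, err)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n            # uniform initial weights
    ensemble = []                # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        t, pol, err = fit_stump(xs, ys, w)
        err = max(err, 1e-12)
        if err >= 0.5:           # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified points gain weight for the next round.
        w = [wi * math.exp(-alpha * yi * (pol if xi >= t else -pol))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x >= t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, 1, 1, -1, -1]      # not separable by any single stump
model = adaboost(xs, ys)
print([predict(model, x) for x in xs])  # → [-1, -1, 1, 1, -1, -1]
```

No single stump can fit this interval-shaped labeling, but the weighted vote of the boosted stumps recovers it, which is the sense in which sequential ensembles improve on a single tree.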
Related papers
- Divide, Conquer, Combine Bayesian Decision Tree Sampling [1.1879716317856945]
Decision trees are commonly used predictive models due to their flexibility and interpretability.
This paper is directed at quantifying the uncertainty of decision tree predictions by employing a Bayesian inference approach.
arXiv Detail & Related papers (2024-03-26T23:14:15Z)
- An Algorithmic Framework for Constructing Multiple Decision Trees by Evaluating Their Combination Performance Throughout the Construction Process [1.8749305679160366]
Predictions using a combination of decision trees are known to be effective in machine learning.
We propose a new algorithmic framework that constructs decision trees simultaneously and evaluates their combination performance.
arXiv Detail & Related papers (2024-02-09T14:58:07Z)
- Learning a Decision Tree Algorithm with Transformers [75.96920867382859]
We introduce MetaTree, a transformer-based model trained via meta-learning to directly produce strong decision trees.
We fit both greedy decision trees and globally optimized decision trees on a large number of datasets, and train MetaTree to produce only the trees that achieve strong generalization performance.
arXiv Detail & Related papers (2024-02-06T07:40:53Z)
- MAPTree: Beating "Optimal" Decision Trees with Bayesian Decision Trees [2.421336072915701]
We present a Bayesian approach to decision tree induction via maximum a posteriori inference of a posterior distribution over trees.
We propose an AND/OR search algorithm, dubbed MAPTree, which is able to recover the maximum a posteriori tree.
arXiv Detail & Related papers (2023-09-26T23:43:37Z)
- Permutation Decision Trees [3.089408984959925]
Effort-To-Compress (ETC), a complexity measure, is used for the first time as an alternative impurity measure.
We conduct a performance comparison between Permutation Decision Trees and classical decision trees across various real-world datasets.
arXiv Detail & Related papers (2023-06-05T06:31:14Z)
- Social Interpretable Tree for Pedestrian Trajectory Prediction [75.81745697967608]
We propose a tree-based method, termed as Social Interpretable Tree (SIT), to address this multi-modal prediction task.
A path in the tree from the root to leaf represents an individual possible future trajectory.
Despite the hand-crafted tree, the experimental results on ETH-UCY and Stanford Drone datasets demonstrate that our method is capable of matching or exceeding the performance of state-of-the-art methods.
arXiv Detail & Related papers (2022-05-26T12:18:44Z)
- Convex Polytope Trees [57.56078843831244]
Convex polytope trees (CPT) are proposed to expand the family of decision trees by an interpretable generalization of their decision boundary.
We develop a greedy method to efficiently construct CPT and scalable end-to-end training algorithms for the tree parameters when the tree structure is given.
arXiv Detail & Related papers (2020-10-21T19:38:57Z)
- MurTree: Optimal Classification Trees via Dynamic Programming and Search [61.817059565926336]
We present a novel algorithm for learning optimal classification trees based on dynamic programming and search.
Our approach uses only a fraction of the time required by the state-of-the-art and can handle datasets with tens of thousands of instances.
arXiv Detail & Related papers (2020-07-24T17:06:55Z)
- Generalized and Scalable Optimal Sparse Decision Trees [56.35541305670828]
We present techniques that produce optimal decision trees over a variety of objectives.
We also introduce a scalable algorithm that produces provably optimal results in the presence of continuous variables.
arXiv Detail & Related papers (2020-06-15T19:00:11Z)
- ENTMOOT: A Framework for Optimization over Ensemble Tree Models [57.98561336670884]
ENTMOOT is a framework for integrating tree models into larger optimization problems.
We show how ENTMOOT allows a simple integration of tree models into decision-making and black-box optimization.
arXiv Detail & Related papers (2020-03-10T14:34:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.