Tree in Tree: from Decision Trees to Decision Graphs
- URL: http://arxiv.org/abs/2110.00392v2
- Date: Tue, 5 Oct 2021 11:12:41 GMT
- Title: Tree in Tree: from Decision Trees to Decision Graphs
- Authors: Bingzhao Zhu, Mahsa Shoaran
- Abstract summary: Tree in Tree decision graph (TnT) is a framework that extends the conventional decision tree to a more generic and powerful directed acyclic graph.
Our proposed model is a novel, more efficient, and accurate alternative to the widely-used decision trees.
- Score: 2.2336243882030025
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Decision trees have been widely used as classifiers in many machine learning
applications thanks to their lightweight and interpretable decision process.
This paper introduces Tree in Tree decision graph (TnT), a framework that
extends the conventional decision tree to a more generic and powerful directed
acyclic graph. TnT constructs decision graphs by recursively growing decision
trees inside the internal or leaf nodes instead of greedy training. The time
complexity of TnT is linear in the number of nodes in the graph, and it can
construct decision graphs on large datasets. Compared to decision trees, we
show that TnT achieves better classification performance with reduced model
size, both as a stand-alone classifier and as a base estimator in
bagging/AdaBoost ensembles. Our proposed model is a novel, more efficient, and
accurate alternative to the widely-used decision trees.
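The abstract describes a decision graph as a directed acyclic graph in which, unlike a tree, routing nodes can be shared by multiple parents. A minimal sketch of that idea (illustrative only, not the authors' TnT implementation; all names and thresholds are hypothetical):

```python
# Hypothetical sketch of a decision graph (DAG), not the authors' code.
# Internal nodes route on a feature threshold; children may be shared
# across parents, which a plain decision tree cannot do.

class Node:
    """Internal split node; left/right children may be shared subgraphs."""
    def __init__(self, feature, threshold, left, right):
        self.feature = feature
        self.threshold = threshold
        self.left = left        # taken when x[feature] <= threshold
        self.right = right

class Leaf:
    def __init__(self, label):
        self.label = label

def predict(node, x):
    """Follow routing decisions from the root down to a leaf label."""
    while isinstance(node, Node):
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.label

# Node reuse: two different parents point at the *same* subtree, so the
# structure is a DAG. A tree would need two duplicated copies instead.
shared = Node(feature=1, threshold=0.5, left=Leaf(0), right=Leaf(1))
root = Node(feature=0, threshold=0.0,
            left=shared,
            right=Node(feature=2, threshold=1.0, left=shared, right=Leaf(1)))

print(predict(root, [-1.0, 0.2, 0.0]))  # left branch, then shared -> 0
print(predict(root, [1.0, 0.9, 2.0]))   # right branch, x[2] > 1 -> 1
```

The "tree in tree" step in the paper would then amount to repeatedly replacing an internal or leaf node with a newly grown subtree; the sketch above only shows the resulting DAG structure being evaluated.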
Related papers
- Learning a Decision Tree Algorithm with Transformers [75.96920867382859]
We introduce MetaTree, a transformer-based model trained via meta-learning to directly produce strong decision trees.
We fit both greedy decision trees and globally optimized decision trees on a large number of datasets, and train MetaTree to produce only the trees that achieve strong generalization performance.
arXiv Detail & Related papers (2024-02-06T07:40:53Z)
- Linear TreeShap [16.246232737115218]
Decision trees are well-known due to their ease of interpretability.
To improve accuracy, we need to grow deep trees or ensembles of trees.
This paper presents a more efficient and straightforward algorithm: Linear TreeShap.
arXiv Detail & Related papers (2022-09-16T23:17:15Z)
- Social Interpretable Tree for Pedestrian Trajectory Prediction [75.81745697967608]
We propose a tree-based method, termed Social Interpretable Tree (SIT), to address this multi-modal prediction task.
A path in the tree from the root to a leaf represents an individual possible future trajectory.
Despite using a hand-crafted tree, the experimental results on the ETH-UCY and Stanford Drone datasets demonstrate that our method matches or exceeds the performance of state-of-the-art methods.
arXiv Detail & Related papers (2022-05-26T12:18:44Z)
- SONG: Self-Organizing Neural Graphs [10.253870280561609]
Decision trees are easy to interpret: they are based on binary decisions, they can make decisions quickly, and they provide a hierarchy of classes.
One well-known drawback of decision trees, compared to decision graphs, is that decision trees cannot reuse decision nodes.
In this paper, we provide a general paradigm based on Markov processes, which allows for efficient training of a special type of decision graph, which we call Self-Organizing Neural Graphs (SONG).
arXiv Detail & Related papers (2021-07-28T07:53:53Z)
- TD-GEN: Graph Generation With Tree Decomposition [31.751200416677225]
TD-GEN is a graph generation framework based on tree decomposition.
Tree nodes are supernodes, each representing a cluster of nodes in the graph.
arXiv Detail & Related papers (2021-06-20T08:57:43Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Growing Deep Forests Efficiently with Soft Routing and Learned Connectivity [79.83903179393164]
This paper further extends the deep forest idea in several important aspects.
We employ a probabilistic tree whose nodes make probabilistic routing decisions, a.k.a. soft routing, rather than hard binary decisions.
Experiments on the MNIST dataset demonstrate that our empowered deep forests achieve performance better than or comparable to [1], [3].
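Soft routing, as described in the blurb above, can be sketched as follows (an illustrative toy, not the paper's implementation; the weights and tree layout are hypothetical): each internal node routes left with probability sigmoid(w·x + b), and a sample's prediction averages the leaf values weighted by the probability of reaching each leaf.

```python
# Illustrative sketch of soft (probabilistic) routing in a decision tree.
# Not the paper's code; node parameters here are arbitrary examples.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soft_predict(node, x, reach_prob=1.0):
    """Expected leaf value under probabilistic routing.

    Every leaf contributes its value weighted by the probability of the
    sample reaching it, so the output is differentiable in the weights.
    """
    if "value" in node:  # leaf node
        return reach_prob * node["value"]
    p_left = sigmoid(sum(w * xi for w, xi in zip(node["w"], x)) + node["b"])
    return (soft_predict(node["left"], x, reach_prob * p_left)
            + soft_predict(node["right"], x, reach_prob * (1.0 - p_left)))

# A depth-1 tree with hypothetical parameters, for illustration only.
tree = {"w": [1.0, -1.0], "b": 0.0,
        "left": {"value": 0.0}, "right": {"value": 1.0}}

print(soft_predict(tree, [2.0, 0.0]))  # routes mostly left, so close to 0
```

Because the routing probabilities are smooth functions of the node weights, such trees can be trained end-to-end with gradient descent, unlike hard-threshold trees.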
arXiv Detail & Related papers (2020-12-29T18:05:05Z)
- Convex Polytope Trees [57.56078843831244]
Convex polytope trees (CPT) are proposed to expand the family of decision trees via an interpretable generalization of their decision boundary.
We develop a greedy method to efficiently construct CPT and scalable end-to-end training algorithms for the tree parameters when the tree structure is given.
arXiv Detail & Related papers (2020-10-21T19:38:57Z)
- Succinct Explanations With Cascading Decision Trees [5.877164140116815]
We propose a new decision tree model that we call Cascading Decision Trees.
Our key insight is to separate the notion of a decision path and an explanation path.
Applying cascading decision trees to new samples results in significantly shorter, more succinct explanations.
arXiv Detail & Related papers (2020-10-13T18:48:39Z)
- MurTree: Optimal Classification Trees via Dynamic Programming and Search [61.817059565926336]
We present a novel algorithm for learning optimal classification trees based on dynamic programming and search.
Our approach uses only a fraction of the time required by the state-of-the-art and can handle datasets with tens of thousands of instances.
arXiv Detail & Related papers (2020-07-24T17:06:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.