Analyzing the tree-layer structure of Deep Forests
- URL: http://arxiv.org/abs/2010.15690v3
- Date: Thu, 14 Oct 2021 08:25:49 GMT
- Title: Analyzing the tree-layer structure of Deep Forests
- Authors: Ludovic Arnould (LPSM, UMR 8001), Claire Boyer (LPSM, UMR 8001),
Erwan Scornet (CMAP)
- Abstract summary: In this paper, our aim is not to benchmark DF performances but to investigate instead their underlying mechanisms.
We exhibit a theoretical framework in which a shallow tree network is shown to enhance the performance of classical decision trees.
These theoretical results show the interest of tree-network architectures for well-structured data provided that the first layer, acting as a data encoder, is rich enough.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Random forests on the one hand, and neural networks on the other hand, have
met great success in the machine learning community for their predictive
performance. Combinations of both have been proposed in the literature, notably
leading to the so-called deep forests (DF) (Zhou & Feng, 2019). In this paper,
our aim is not to benchmark DF performances but to investigate instead their
underlying mechanisms. Additionally, the DF architecture can generally be
simplified into simpler, more computationally efficient shallow forest
networks. Despite some instability, the latter may outperform standard
predictive tree-based methods. We exhibit a theoretical framework in which a
shallow tree network is shown to enhance the performance of classical decision
trees. In such a setting, we provide tight theoretical lower and upper bounds
on its excess risk. These theoretical results show the interest of tree-network
architectures for well-structured data provided that the first layer, acting as
a data encoder, is rich enough.
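The two-layer idea from the abstract can be sketched in code: a first-layer forest acts as a data encoder by appending its class-probability estimates to the raw features, and a single decision tree is then trained on the enriched representation. This is an illustrative sketch, not the paper's exact architecture or hyperparameters; the dataset, model sizes, and the use of out-of-bag probabilities to limit leakage are all assumptions made here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Layer 1: a forest "encoder" producing class-probability features.
encoder = RandomForestClassifier(n_estimators=50, oob_score=True,
                                 random_state=0).fit(X_tr, y_tr)
# Out-of-bag probabilities on the training set avoid re-using in-bag fits.
Z_tr = np.hstack([X_tr, encoder.oob_decision_function_])
Z_te = np.hstack([X_te, encoder.predict_proba(X_te)])

# Layer 2: a single decision tree on the enriched representation.
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(Z_tr, y_tr)
print(tree.score(Z_te, y_te))
```

The second layer sees both the raw coordinates and the first layer's probability estimates, which is what lets a shallow tree improve on a tree trained on the raw features alone when the encoder is rich enough.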
Related papers
- ARTree: A Deep Autoregressive Model for Phylogenetic Inference [6.935130578959931]
We propose a deep autoregressive model for phylogenetic inference based on graph neural networks (GNNs)
We demonstrate the effectiveness and efficiency of our method on a benchmark of challenging real data tree topology density estimation and variational phylogenetic inference problems.
arXiv Detail & Related papers (2023-10-14T10:26:03Z) - Constructing Phylogenetic Networks via Cherry Picking and Machine Learning [0.1045050906735615]
Existing methods are computationally expensive and can either handle only small numbers of phylogenetic trees or are limited to severely restricted classes of networks.
We apply the recently-introduced theoretical framework of cherry picking to design a class of efficient heuristics that are guaranteed to produce a network containing each of the input trees.
We also propose simple and fast randomised heuristics that prove to be very effective when run multiple times.
arXiv Detail & Related papers (2023-03-31T15:04:42Z) - Pushing the Efficiency Limit Using Structured Sparse Convolutions [82.31130122200578]
We propose Structured Sparse Convolution (SSC), which leverages the inherent structure in images to reduce the parameters in the convolutional filter.
We show that SSC is a generalization of commonly used layers (depthwise, groupwise and pointwise convolution) in efficient architectures.
Architectures based on SSC achieve state-of-the-art performance compared to baselines on CIFAR-10, CIFAR-100, Tiny-ImageNet, and ImageNet classification benchmarks.
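The claim that structured sparsity generalizes common efficient layers can be illustrated with a minimal numpy sketch (this is an assumption-laden toy, not the paper's actual SSC formulation): a pointwise (1x1) convolution whose weight matrix is masked down to its diagonal is exactly a depthwise per-channel scaling.

```python
import numpy as np

rng = np.random.default_rng(0)
C = 4                                  # channels
x = rng.normal(size=(C, 8, 8))         # feature map (channels, H, W)

w_full = rng.normal(size=(C, C))       # full pointwise conv weights (out, in)
mask = np.eye(C)                       # structured sparsity: diagonal only
w_masked = w_full * mask               # masked full conv

# Apply the masked pointwise conv: y[o] = sum_c w_masked[o, c] * x[c].
y_masked = np.einsum('oc,chw->ohw', w_masked, x)
# A depthwise per-channel scaling gives the same result.
y_depthwise = w_full.diagonal()[:, None, None] * x
assert np.allclose(y_masked, y_depthwise)
```

Other sparsity patterns on the same full weight tensor (block-diagonal masks) recover groupwise convolution in the same way.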
arXiv Detail & Related papers (2022-10-23T18:37:22Z) - Rank Diminishing in Deep Neural Networks [71.03777954670323]
Rank of neural networks measures information flowing across layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains vague and unclear.
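One elementary ingredient behind rank diminishing, illustrated here as a toy (this is not the paper's analysis): the rank of a composition of linear layers is capped by the narrowest intermediate width, so the end-to-end map can only lose rank as depth grows. All shapes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three linear layers, y = W x, with an 8-dimensional bottleneck in the middle.
W1 = rng.normal(size=(32, 64))   # layer 1: 64 -> 32
W2 = rng.normal(size=(8, 32))    # layer 2: 32 -> 8  (bottleneck)
W3 = rng.normal(size=(64, 8))    # layer 3: 8 -> 64

end_to_end = W3 @ W2 @ W1        # overall linear map, shape (64, 64)
# Rank cannot exceed the narrowest width encountered along the way.
assert np.linalg.matrix_rank(end_to_end) <= 8
```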
arXiv Detail & Related papers (2022-06-13T12:03:32Z) - Making CNNs Interpretable by Building Dynamic Sequential Decision Forests with Top-down Hierarchy Learning [62.82046926149371]
We propose a generic model transfer scheme to make Convolutional Neural Networks (CNNs) interpretable.
We achieve this by building a differentiable decision forest on top of CNNs.
We name the transferred model deep Dynamic Sequential Decision Forest (dDSDF)
arXiv Detail & Related papers (2021-06-05T07:41:18Z) - Learning Structures for Deep Neural Networks [99.8331363309895]
We propose to adopt the efficient coding principle, rooted in information theory and developed in computational neuroscience.
We show that sparse coding can effectively maximize the entropy of the output signals.
Our experiments on a public image classification dataset demonstrate that using the structure learned from scratch by our proposed algorithm, one can achieve a classification accuracy comparable to the best expert-designed structure.
arXiv Detail & Related papers (2021-05-27T12:27:24Z) - Growing Deep Forests Efficiently with Soft Routing and Learned Connectivity [79.83903179393164]
This paper further extends the deep forest idea in several important aspects.
We employ a probabilistic tree whose nodes make probabilistic routing decisions, a.k.a., soft routing, rather than hard binary decisions.
Experiments on the MNIST dataset demonstrate that our empowered deep forests can achieve performance better than or comparable to [1], [3].
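Soft routing replaces a node's hard left/right decision with a probability, so an input is distributed over all leaves. A minimal sketch, assuming a full binary tree of depth 2 with sigmoid gates (the parameterization and values here are illustrative, not the paper's model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_route(x, W, b):
    """Probability of an input x reaching each of the 4 leaves."""
    # g[i] = probability of going LEFT at internal node i
    # (node 0 = root, node 1 = root's left child, node 2 = right child).
    g = sigmoid(W @ x + b)
    return np.array([
        g[0] * g[1],              # left-left leaf
        g[0] * (1 - g[1]),        # left-right leaf
        (1 - g[0]) * g[2],        # right-left leaf
        (1 - g[0]) * (1 - g[2]),  # right-right leaf
    ])

rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 5)), rng.normal(size=3)
p = soft_route(rng.normal(size=5), W, b)
assert np.isclose(p.sum(), 1.0) and (p >= 0).all()
```

Because every leaf probability is a smooth function of W and b, the whole tree is differentiable and can be trained by gradient descent, which is what the hard binary routing of a standard decision tree does not allow.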
arXiv Detail & Related papers (2020-12-29T18:05:05Z) - Deep tree-ensembles for multi-output prediction [0.0]
We propose a novel deep tree-ensemble (DTE) model, where every layer enriches the original feature set with a representation learning component based on tree-embeddings.
We specifically focus on two structured output prediction tasks, namely multi-label classification and multi-target regression.
arXiv Detail & Related papers (2020-11-03T16:25:54Z) - MurTree: Optimal Classification Trees via Dynamic Programming and Search [61.817059565926336]
We present a novel algorithm for learning optimal classification trees based on dynamic programming and search.
Our approach uses only a fraction of the time required by the state-of-the-art and can handle datasets with tens of thousands of instances.
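The base subproblem in dynamic-programming formulations of optimal trees is finding the single best split exactly. As a hedged sketch (MurTree's actual algorithm is far more elaborate; the exhaustive search below only illustrates the optimal depth-1 subproblem for binary labels):

```python
import numpy as np

def best_stump(X, y):
    """Exhaustively find the (feature, threshold) split of a depth-1 tree
    minimising misclassifications, with majority-vote leaves. y is 0/1."""
    n, d = X.shape
    # Baseline: no split, predict the majority class overall.
    best_err, best_j, best_t = min(int(y.sum()), n - int(y.sum())), None, None
    for j in range(d):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            n_l, pos_l = int(left.sum()), int(y[left].sum())
            n_r, pos_r = n - n_l, int(y.sum()) - pos_l
            # Optimal leaf labels are the majority class on each side.
            err = min(pos_l, n_l - pos_l) + min(pos_r, n_r - pos_r)
            if err < best_err:
                best_err, best_j, best_t = err, j, float(t)
    return best_err, best_j, best_t

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
err, j, t = best_stump(X, y)   # the split x[0] <= 1.0 separates perfectly
assert (err, j, t) == (0, 0, 1.0)
```

Optimal-tree methods recurse on such subproblems over subsets of the data, and their practicality hinges on pruning and caching this search, which is where dynamic programming enters.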
arXiv Detail & Related papers (2020-07-24T17:06:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.