Modelling heterogeneous treatment effects by quantile local polynomial decision tree and forest
- URL: http://arxiv.org/abs/2111.15320v1
- Date: Tue, 30 Nov 2021 12:02:16 GMT
- Title: Modelling heterogeneous treatment effects by quantile local polynomial decision tree and forest
- Authors: Lai Xinglin
- Abstract summary: This paper builds on Breiman's (2001) random forest tree (RFT) and Wager et al.'s (2018) causal tree to parameterize the nonparametric problem.
We propose a decision tree using quantile classification according to fixed rules combined with polynomial estimation of local samples, which we call the quantile local linear causal tree (QLPRT) and forest (QLPRF).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To further develop statistical inference for heterogeneous
treatment effects, this paper builds on Breiman's (2001) random forest tree
(RFT) and Wager et al.'s (2018) causal tree, parameterizing the nonparametric
problem through the excellent statistical properties of classical OLS and a
division into local linear intervals based on covariate quantile points, while
preserving the advantages of random forest trees: constructible confidence
intervals and asymptotic normality [Athey and Imbens (2016), Efron (2014),
Wager et al. (2014)]. We propose a decision tree using quantile classification
according to fixed rules combined with polynomial estimation of local samples,
which we call the quantile local linear causal tree (QLPRT) and forest (QLPRF).
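The core construction in the abstract, splitting a covariate at fixed quantile points and running classical OLS within each local cell, can be sketched as follows. This is an illustrative toy version of the idea only, not the authors' estimator: the function name, the depth-one quantile grid, and the linear (rather than higher-order polynomial) local fit are all assumptions.

```python
import numpy as np

def quantile_local_ols_effects(x, t, y, n_cells=4):
    """Split covariate x at fixed quantile points, then estimate a
    treatment effect in each cell by OLS of y on (1, t, x).

    x : (n,) covariate, t : (n,) 0/1 treatment, y : (n,) outcome.
    Returns a list of (lo, hi, tau_hat) tuples, one per quantile cell.
    Illustrative sketch of the quantile-split + local OLS idea only.
    """
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_cells + 1))
    edges[-1] += 1e-12  # make the last half-open cell include max(x)
    cells = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x < hi)
        # Local design matrix: intercept, treatment indicator, covariate.
        X = np.column_stack([np.ones(m.sum()), t[m], x[m]])
        beta, *_ = np.linalg.lstsq(X, y[m], rcond=None)
        cells.append((lo, hi, beta[1]))  # coefficient on t = local effect
    return cells
```

Because each cell is narrow, a linear fit within the cell approximates the conditional response well, and the coefficient on the treatment indicator is the local treatment effect estimate for that quantile interval.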
Related papers
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
- Distilling interpretable causal trees from causal forests [0.0]
A high-dimensional distribution of conditional average treatment effects may give accurate, individual-level estimates.
This paper proposes the Distilled Causal Tree, a method for distilling a single, interpretable causal tree from a causal forest.
arXiv Detail & Related papers (2024-08-02T05:48:15Z)
- Statistical Advantages of Oblique Randomized Decision Trees and Forests [0.0]
Generalization error and convergence rates are obtained for the flexible dimension reduction model class of ridge functions.
A lower bound on the risk of axis-aligned Mondrian trees is obtained proving that these estimators are suboptimal for these linear dimension reduction models.
arXiv Detail & Related papers (2024-07-02T17:35:22Z)
- Ensembles of Probabilistic Regression Trees [46.53457774230618]
Tree-based ensemble methods have been successfully used for regression problems in many applications and research studies.
We study ensemble versions of probabilistic regression trees that provide smooth approximations of the objective function by assigning each observation to each region with respect to a probability distribution.
arXiv Detail & Related papers (2024-06-20T06:51:51Z)
- PhyloGFN: Phylogenetic inference with generative flow networks [57.104166650526416]
We introduce the framework of generative flow networks (GFlowNets) to tackle two core problems in phylogenetics: parsimony-based and Bayesian phylogenetic inference.
Because GFlowNets are well-suited for sampling complex structures, they are a natural choice for exploring and sampling from the multimodal posterior distribution over tree topologies.
We demonstrate that our amortized posterior sampler, PhyloGFN, produces diverse and high-quality evolutionary hypotheses on real benchmark datasets.
arXiv Detail & Related papers (2023-10-12T23:46:08Z)
- Accelerating Generalized Random Forests with Fixed-Point Trees [2.810283834703862]
Estimators are constructed by leveraging random forests as an adaptive kernel weighting algorithm.
We propose a new tree-growing rule for generalized random forests induced from a fixed-point iteration type of approximation.
arXiv Detail & Related papers (2023-06-20T21:45:35Z)
- On the Pointwise Behavior of Recursive Partitioning and Its Implications for Heterogeneous Causal Effect Estimation [8.394633341978007]
Decision tree learning is increasingly being used for pointwise inference.
We show that adaptive decision trees can fail to achieve polynomial rates of convergence in the uniform norm with non-vanishing probability.
We show that random forests can remedy the situation, turning poorly performing trees into nearly optimal procedures.
arXiv Detail & Related papers (2022-11-19T21:28:30Z)
- Robustifying Algorithms of Learning Latent Trees with Vector Variables [92.18777020401484]
We present the sample complexities of Recursive Grouping (RG) and Chow-Liu Recursive Grouping (CLRG).
We robustify RG, CLRG, Neighbor Joining (NJ) and Spectral NJ (SNJ) by using the truncated inner product.
We derive the first known instance-dependent impossibility result for structure learning of latent trees.
arXiv Detail & Related papers (2021-06-02T01:37:52Z)
- Large Scale Prediction with Decision Trees [9.917147243076645]
This paper shows that decision trees constructed with Classification and Regression Trees (CART) and C4.5 methodology are consistent for regression and classification tasks.
A key step in the analysis is the establishment of an oracle inequality, which allows for a precise characterization of the goodness-of-fit and complexity tradeoff for a mis-specified model.
arXiv Detail & Related papers (2021-04-28T16:59:03Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Convex Polytope Trees [57.56078843831244]
Convex polytope trees (CPT) are proposed to expand the family of decision trees by an interpretable generalization of their decision boundary.
We develop a greedy method to efficiently construct CPT and scalable end-to-end training algorithms for the tree parameters when the tree structure is given.
arXiv Detail & Related papers (2020-10-21T19:38:57Z)
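The convex-polytope decision rule from the last entry can be illustrated by a minimal node-routing function: instead of a single axis-aligned threshold, a sample goes left only if it satisfies every half-space constraint defining the polytope. This is a hypothetical sketch of the concept, not the paper's implementation; `W` and `b` are assumed node parameters.

```python
import numpy as np

def polytope_route(x, W, b):
    """Route a sample at a CPT-style internal node.

    Go left iff x lies in the convex polytope {z : W @ z + b >= 0},
    i.e. it satisfies all k half-space constraints; otherwise go right.
    W : (k, d) constraint normals, b : (k,) offsets, x : (d,) sample.
    Illustrative sketch only.
    """
    return "left" if np.all(W @ x + b >= 0) else "right"
```

An axis-aligned split is the special case k = 1 with a one-hot row in `W`, which is what makes the polytope rule a strict generalization of ordinary decision-tree splits.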
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences.