DynFrs: An Efficient Framework for Machine Unlearning in Random Forest
- URL: http://arxiv.org/abs/2410.01588v1
- Date: Wed, 2 Oct 2024 14:20:30 GMT
- Title: DynFrs: An Efficient Framework for Machine Unlearning in Random Forest
- Authors: Shurong Wang, Zhuoyang Shen, Xinbao Qiao, Tongning Zhang, Meng Zhang
- Abstract summary: DynFrs is a framework designed to enable efficient machine unlearning in Random Forests.
In experiments, applying DynFrs to Extremely Randomized Trees yields substantial improvements.
- Score: 2.315324942451179
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Random Forests are widely recognized for their efficacy in classification and regression tasks, standing out in domains such as medical diagnosis, finance, and personalized recommendations. These domains, however, are inherently sensitive to privacy concerns, as personal and confidential data are involved. With the increasing demand for the right to be forgotten, particularly under regulations such as GDPR and CCPA, the ability to perform machine unlearning has become crucial for Random Forests. However, this topic has received insufficient attention, and existing approaches are difficult to apply in real-world scenarios. Addressing this gap, we propose the DynFrs framework, designed to enable efficient machine unlearning in Random Forests while preserving predictive accuracy. DynFrs leverages the subsampling method Occ(q) and a lazy tag strategy Lzy, and is adaptable to any Random Forest variant. In essence, Occ(q) ensures that each sample in the training set occurs in only a proportion of the trees, so that the impact of deleting a sample is limited, and Lzy delays the reconstruction of a tree node until necessary, thereby avoiding unnecessary modifications to tree structures. In experiments, applying DynFrs to Extremely Randomized Trees yields substantial improvements, achieving orders-of-magnitude faster unlearning and better predictive accuracy than existing machine unlearning methods for Random Forests.
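To make the abstract's two mechanisms concrete, here is a minimal Python sketch of how Occ(q)-style subsampling and Lzy-style lazy tagging could fit together. It assumes heavily simplified data structures (the paper tags individual tree nodes, whereas this sketch tags whole trees, and real node rebuilding is far more involved); all names and parameters are hypothetical, not the authors' implementation.

```python
import random
from collections import defaultdict

def occ_q_assignment(n_samples, n_trees, q, seed=0):
    """Occ(q) sketch: each sample is placed in only ~q * n_trees trees,
    so unlearning one sample touches a bounded fraction of the forest."""
    rng = random.Random(seed)
    k = max(1, round(q * n_trees))
    trees_of = {i: rng.sample(range(n_trees), k) for i in range(n_samples)}
    samples_of = defaultdict(set)
    for i, trees in trees_of.items():
        for t in trees:
            samples_of[t].add(i)
    return trees_of, samples_of

class LazyTree:
    """Lzy sketch: a deletion only tags the tree as stale; the (stand-in)
    rebuild is deferred until the next query that actually reaches it."""
    def __init__(self, sample_ids):
        self.sample_ids = set(sample_ids)
        self.stale = True        # nothing built yet
        self.structure = None

    def delete(self, sample_id):
        self.sample_ids.discard(sample_id)
        self.stale = True        # lazy: tag instead of rebuilding now

    def predict(self):
        if self.stale:           # rebuild only when a query arrives
            self.structure = sorted(self.sample_ids)  # stand-in for a real fit
            self.stale = False
        return self.structure

# Unlearning sample 7 touches only the ~q * n_trees trees containing it,
# and none of them is rebuilt until it is next queried:
trees_of, samples_of = occ_q_assignment(n_samples=100, n_trees=20, q=0.2)
forest = {t: LazyTree(ids) for t, ids in samples_of.items()}
for t in trees_of[7]:
    forest[t].delete(7)
```

The combination is what bounds the unlearning cost: Occ(q) caps how many trees a deletion can touch, and Lzy ensures that even those trees pay for reconstruction only if and when they are queried.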
Related papers
- A New Random Forest Ensemble of Intuitionistic Fuzzy Decision Trees [5.831659043074847]
We propose a new random forest ensemble of intuitionistic fuzzy decision trees (IFDT).
The proposed method enjoys the power of the randomness from bootstrapped sampling and feature selection.
This study is the first to propose a random forest ensemble based on the intuitionistic fuzzy theory.
arXiv Detail & Related papers (2024-03-12T06:52:24Z) - Adaptive Split Balancing for Optimal Random Forest [8.916614661563893]
We propose a new random forest algorithm that constructs the trees using a novel adaptive split-balancing method.
Our method achieves optimality in simple, smooth scenarios while adaptively learning the tree structure from the data.
arXiv Detail & Related papers (2024-02-17T09:10:40Z) - Why do Random Forests Work? Understanding Tree Ensembles as Self-Regularizing Adaptive Smoothers [68.76846801719095]
We argue that the current high-level dichotomy into bias- and variance-reduction prevalent in statistics is insufficient to understand tree ensembles.
We show that forests can improve upon trees by three distinct mechanisms that are usually implicitly entangled.
arXiv Detail & Related papers (2024-02-02T15:36:43Z) - Tree-Planner: Efficient Close-loop Task Planning with Large Language Models [63.06270302774049]
Tree-Planner reframes task planning with Large Language Models into three distinct phases.
Tree-Planner achieves state-of-the-art performance while maintaining high efficiency.
arXiv Detail & Related papers (2023-10-12T17:59:50Z) - ForensicsForest Family: A Series of Multi-scale Hierarchical Cascade Forests for Detecting GAN-generated Faces [53.739014757621376]
We describe a simple and effective forest-based method set called ForensicsForest Family to detect GAN-generated faces.
ForensicsForest is a newly proposed Multi-scale Hierarchical Cascade Forest.
Hybrid ForensicsForest integrates CNN layers into the models.
Divide-and-Conquer ForensicsForest can construct a forest model using only a portion of the training samples.
arXiv Detail & Related papers (2023-08-02T06:41:19Z) - The Cascaded Forward Algorithm for Neural Network Training [61.06444586991505]
We propose a new learning framework for neural networks, namely the Cascaded Forward (CaFo) algorithm, which does not rely on BP optimization as FF does.
Unlike FF, our framework directly outputs label distributions at each cascaded block, which does not require generation of additional negative samples.
In our framework each block can be trained independently, so it can be easily deployed into parallel acceleration systems.
arXiv Detail & Related papers (2023-03-17T02:01:11Z) - Growing Deep Forests Efficiently with Soft Routing and Learned Connectivity [79.83903179393164]
This paper further extends the deep forest idea in several important aspects.
We employ a probabilistic tree whose nodes make probabilistic routing decisions, a.k.a. soft routing, rather than hard binary decisions.
Experiments on the MNIST dataset demonstrate that our empowered deep forests can achieve performance better than or comparable to [1], [3].
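As a rough illustration of the soft-routing idea above (not the paper's actual architecture), the sketch below routes an input through a tiny hand-built probabilistic tree; the sigmoid routing function and every parameter value are assumptions for demonstration only.

```python
import numpy as np

def soft_route(x, node):
    """Soft routing sketch: instead of a hard left/right split, each
    internal node sends x left with probability sigmoid(w @ x + b), and
    the output is the probability-weighted mix of the leaf predictions."""
    if "value" in node:  # leaf node
        return node["value"]
    p_left = 1.0 / (1.0 + np.exp(-(node["w"] @ x + node["b"])))
    return (p_left * soft_route(x, node["left"])
            + (1.0 - p_left) * soft_route(x, node["right"]))

# Tiny hand-built tree (all parameters hypothetical):
tree = {"w": np.array([1.0, -1.0]), "b": 0.0,
        "left":  {"value": np.array([0.9, 0.1])},
        "right": {"value": np.array([0.2, 0.8])}}
print(soft_route(np.array([0.5, 0.3]), tree))  # blended class distribution
```

Because every routing probability is differentiable, such a tree can be trained end to end with gradient descent, which hard-split forests cannot.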
arXiv Detail & Related papers (2020-12-29T18:05:05Z) - Residual Likelihood Forests [19.97069303172077]
This paper presents a novel ensemble learning approach called Residual Likelihood Forests (RLF).
Our weak learners produce conditional likelihoods that are sequentially optimized using a global loss in the context of previous learners.
When compared against several ensemble approaches, including Random Forests and Gradient Boosted Trees, RLFs offer a significant improvement in performance.
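A loose, boosting-style reading of that description (our assumption, not the paper's exact formulation): each weak learner contributes per-class log-likelihood terms, and the ensemble sums and normalizes them, so each new learner can be fit to reduce a global loss given its predecessors.

```python
import numpy as np

def rlf_style_predict(stages, x):
    """Sequential-likelihood sketch: sum per-class log-likelihood
    contributions from each stage, then normalize with a softmax."""
    scores = sum(stage(x) for stage in stages)  # per-class log scores
    p = np.exp(scores - scores.max())           # stabilized softmax
    return p / p.sum()

# Two hypothetical stage functions mapping x to per-class log-likelihoods:
stages = [lambda x: np.array([0.2, -0.1]) * x.sum(),
          lambda x: np.array([-0.3, 0.4])]
print(rlf_style_predict(stages, np.array([1.0, 2.0])))
```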
arXiv Detail & Related papers (2020-11-04T00:59:41Z) - Improved Weighted Random Forest for Classification Problems [3.42658286826597]
The key to a well-performing ensemble model is the diversity of its base models.
We propose several algorithms that modify the weighting strategy of the regular random forest.
The proposed models introduce significant improvements over the regular random forest.
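One plausible weighting scheme consistent with that description (the specifics below are assumptions; the paper proposes several algorithms) is to scale each tree's vote by a per-tree weight such as its out-of-bag accuracy, rather than using the regular forest's uniform vote:

```python
import numpy as np

def weighted_vote(trees, weights, x, n_classes=2):
    """Weighted majority vote: each tree's class vote is scaled by its
    weight (e.g., out-of-bag accuracy) instead of counting equally."""
    votes = np.zeros(n_classes)
    for tree, w in zip(trees, weights):
        votes[tree(x)] += w
    return int(votes.argmax())

# Hypothetical stump "trees" and accuracy-derived weights:
trees = [lambda x: int(x[0] > 0.5), lambda x: int(x[1] > 0.2)]
weights = [0.9, 0.6]
print(weighted_vote(trees, weights, [0.7, 0.1]))  # class chosen by weighted vote
```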
arXiv Detail & Related papers (2020-09-01T16:08:45Z) - Bayesian Optimization with Machine Learning Algorithms Towards Anomaly Detection [66.05992706105224]
In this paper, an effective anomaly detection framework is proposed, utilizing the Bayesian Optimization technique.
The performance of the considered algorithms is evaluated using the ISCX 2012 dataset.
Experimental results show the effectiveness of the proposed framework in terms of accuracy, precision, recall, and false-alarm rate.
arXiv Detail & Related papers (2020-08-05T19:29:35Z) - Towards Robust Classification with Deep Generative Forests [13.096855747795303]
Decision Trees and Random Forests are among the most widely used machine learning models.
Being primarily discriminative models, they lack principled methods for manipulating the uncertainty of their predictions.
We exploit Generative Forests (GeFs) to extend Random Forests to generative models representing the full joint distribution over the feature space.
arXiv Detail & Related papers (2020-07-11T08:57:52Z)