Ensemble Learning with Statistical and Structural Models
- URL: http://arxiv.org/abs/2006.05308v1
- Date: Sun, 7 Jun 2020 13:36:50 GMT
- Title: Ensemble Learning with Statistical and Structural Models
- Authors: Jiaming Mao, Jingzhi Xu
- Abstract summary: We propose a set of novel methods for combining statistical and structural models for improved prediction and causal inference.
Our first proposed estimator is doubly robust in that it requires the correct specification of only one of the statistical or the structural model.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Statistical and structural modeling represent two distinct approaches to data
analysis. In this paper, we propose a set of novel methods for combining
statistical and structural models for improved prediction and causal inference.
Our first proposed estimator is doubly robust in that it requires the correct
specification of only the statistical or the structural
model. Our second proposed estimator is a weighted ensemble that has the
ability to outperform both models when they are both misspecified. Experiments
demonstrate the potential of our estimators in various settings, including
first-price auctions, dynamic models of entry and exit, and demand estimation
with instrumental variables.
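The abstract's second estimator, a weighted ensemble, can be illustrated with a minimal sketch: choose a convex weight between the statistical and structural predictions by minimizing validation error. This is not the paper's implementation; the function name and the grid-search weight selection are illustrative assumptions.

```python
import numpy as np

def weighted_ensemble(pred_stat, pred_struct, y_val, val_stat, val_struct):
    """Hypothetical sketch: combine a statistical and a structural model's
    predictions with a convex weight w chosen to minimize validation MSE."""
    weights = np.linspace(0.0, 1.0, 101)  # candidate weights w in [0, 1]
    errors = [np.mean((w * val_stat + (1 - w) * val_struct - y_val) ** 2)
              for w in weights]
    w_best = weights[int(np.argmin(errors))]
    # Final prediction: w * statistical + (1 - w) * structural
    return w_best * pred_stat + (1 - w_best) * pred_struct, w_best
```

When the two models err in opposite directions, an interior weight can beat both, which is the intuition behind the claim that the ensemble can outperform two misspecified models.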
Related papers
- Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Stability of clinical prediction models developed using statistical or machine learning methods [0.5482532589225552]
Clinical prediction models estimate an individual's risk of a particular health outcome, conditional on their values of multiple predictors.
Many models are developed using small datasets, which leads to instability in the model and its predictions (estimated risks).
We show that instability in a model's estimated risks is often considerable and manifests as miscalibration of predictions in new data.
arXiv Detail & Related papers (2022-11-02T11:55:28Z)
- On generative models as the basis for digital twins [0.0]
A framework is proposed for generative models as a basis for digital twins or mirrors of structures.
The proposal is based on the premise that deterministic models cannot account for the uncertainty present in most structural modelling applications.
arXiv Detail & Related papers (2022-03-08T20:34:56Z)
- Causality and Generalizability: Identifiability and Learning Methods [0.0]
This thesis contributes to the research areas concerning the estimation of causal effects, causal structure learning, and distributionally robust prediction methods.
We present novel and consistent linear and non-linear causal effects estimators in instrumental variable settings that employ data-dependent mean squared prediction error regularization.
We propose a general framework for distributional robustness with respect to intervention-induced distributions.
arXiv Detail & Related papers (2021-10-04T13:12:11Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- How Faithful is your Synthetic Data? Sample-level Metrics for Evaluating and Auditing Generative Models [95.8037674226622]
We introduce a 3-dimensional evaluation metric that characterizes the fidelity, diversity and generalization performance of any generative model in a domain-agnostic fashion.
Our metric unifies statistical divergence measures with precision-recall analysis, enabling sample- and distribution-level diagnoses of model fidelity and diversity.
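Precision-recall analysis for generative models, as referenced above, is often estimated with nearest-neighbor manifold approximations. The sketch below is a simplified, hypothetical version of that idea (not the paper's actual metric): precision asks what fraction of synthetic samples fall inside balls around real samples; recall asks the reverse.

```python
import numpy as np

def knn_radius(points, k):
    # Distance from each point to its k-th nearest neighbor.
    # Column 0 of the sorted distance matrix is the self-distance (0).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return np.sort(d, axis=1)[:, k]

def coverage_fraction(queries, refs, radii):
    # Fraction of query points inside at least one reference ball.
    d = np.linalg.norm(queries[:, None, :] - refs[None, :, :], axis=-1)
    return float(np.mean((d <= radii[None, :]).any(axis=1)))

def precision_recall(real, fake, k=3):
    """Simplified k-NN manifold estimate of generative precision and recall."""
    precision = coverage_fraction(fake, real, knn_radius(real, k))  # fidelity
    recall = coverage_fraction(real, fake, knn_radius(fake, k))     # diversity
    return precision, recall
```

With identical real and synthetic samples both scores are 1; a generator whose samples miss the real data entirely scores zero precision.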
arXiv Detail & Related papers (2021-02-17T18:25:30Z)
- Characterizing Fairness Over the Set of Good Models Under Selective Labels [69.64662540443162]
We develop a framework for characterizing predictive fairness properties over the set of models that deliver similar overall performance.
We provide tractable algorithms to compute the range of attainable group-level predictive disparities.
We extend our framework to address the empirically relevant challenge of selectively labelled data.
arXiv Detail & Related papers (2021-01-02T02:11:37Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be efficiently generated by using the layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
arXiv Detail & Related papers (2020-05-21T16:08:38Z)
- Structural Regularization [0.0]
We propose a novel method for modeling data by using structural models based on economic theory as regularizers for statistical models.
We show that our method can outperform both the (misspecified) structural model and statistical models fit without structural regularization.
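One simple way to realize "structural model as regularizer" is a penalized least-squares fit that shrinks the statistical fit toward the structural model's fitted values. This is a hypothetical sketch under that assumption, not the paper's estimator; `structural_ridge` and the penalty form are illustrative.

```python
import numpy as np

def structural_ridge(X, y, struct_pred, lam):
    """Linear fit penalized toward a structural model's fitted values.
    Minimizes ||y - Xb||^2 + lam * ||Xb - struct_pred||^2, which has the
    closed-form first-order condition (1 + lam) X'X b = X'(y + lam * struct_pred).
    """
    A = (1.0 + lam) * X.T @ X
    rhs = X.T @ (y + lam * struct_pred)
    return np.linalg.solve(A, rhs)
```

At `lam = 0` this is ordinary least squares; as `lam` grows, the coefficients are pulled toward whatever linear fit best matches the structural predictions, interpolating between the two models.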
arXiv Detail & Related papers (2020-04-27T06:47:07Z)
- Machine learning for causal inference: on the use of cross-fit estimators [77.34726150561087]
Doubly-robust cross-fit estimators have been proposed to yield better statistical properties.
We conducted a simulation study to assess the performance of several estimators of the average causal effect (ACE).
When used with machine learning, the doubly-robust cross-fit estimators substantially outperformed all of the other estimators in terms of bias, variance, and confidence interval coverage.
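A doubly-robust cross-fit estimator of the ACE typically combines sample splitting with an augmented inverse-probability-weighting (AIPW) score: nuisance models are trained on one fold and evaluated on the other. The sketch below assumes caller-supplied `fit_outcome` and `fit_propensity` factories (illustrative names, not from the paper).

```python
import numpy as np

def aipw_crossfit(y, a, x, fit_outcome, fit_propensity, n_splits=2, seed=0):
    """Cross-fit AIPW estimate of the average causal effect E[Y(1) - Y(0)].
    fit_outcome(x, a, y) -> callable mu(x, a); fit_propensity(x, a) -> callable e(x).
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_splits)
    psi = np.empty(len(y))
    for k in range(n_splits):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_splits) if j != k])
        mu = fit_outcome(x[train], a[train], y[train])   # outcome model
        e = fit_propensity(x[train], a[train])           # propensity model
        m1, m0, eh = mu(x[test], 1), mu(x[test], 0), e(x[test])
        # AIPW score: consistent if either nuisance model is correct
        psi[test] = (m1 - m0
                     + a[test] * (y[test] - m1) / eh
                     - (1 - a[test]) * (y[test] - m0) / (1 - eh))
    return psi.mean()
```

Averaging the AIPW score over held-out folds is what gives these estimators their favorable bias and coverage properties when the nuisance models are fit with flexible machine learning.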
arXiv Detail & Related papers (2020-04-21T23:09:55Z)
- Semi-Structured Distributional Regression -- Extending Structured Additive Models by Arbitrary Deep Neural Networks and Data Modalities [0.0]
We propose a general framework to combine structured regression models and deep neural networks into a unifying network architecture.
We demonstrate the framework's efficacy in numerical experiments and illustrate its special merits in benchmarks and real-world applications.
arXiv Detail & Related papers (2020-02-13T21:01:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.