On Consistency and Asymptotic Normality of Least Absolute Deviation
Estimators for 2-dimensional Sinusoidal Model
- URL: http://arxiv.org/abs/2301.03229v2
- Date: Thu, 15 Jun 2023 20:46:53 GMT
- Title: On Consistency and Asymptotic Normality of Least Absolute Deviation
Estimators for 2-dimensional Sinusoidal Model
- Authors: Saptarshi Roy, Amit Mitra and N K Archak
- Abstract summary: We propose robust least absolute deviation (LAD) estimators for parameter estimation.
We establish the strong consistency and asymptotic normality of the LAD estimators of the signal parameters of a 2-dimensional sinusoidal model.
Analysis of 2-dimensional texture data indicates the practical applicability of the proposed LAD approach.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Estimation of the parameters of a 2-dimensional sinusoidal model is a
fundamental problem in digital signal processing and time series analysis. In
this paper, we propose robust least absolute deviation (LAD) estimators for
parameter estimation. The proposed methodology provides a robust alternative to
non-robust estimation techniques like the least squares estimators, in
situations where outliers are present in the data or in the presence of heavy
tailed noise. We study important asymptotic properties of the LAD estimators
and establish the strong consistency and asymptotic normality of the LAD
estimators of the signal parameters of a 2-dimensional sinusoidal model. We
further illustrate the advantage of using LAD estimators over least squares
estimators through extensive simulation studies. Analysis of 2-dimensional
texture data indicates the practical applicability of the proposed LAD
approach.
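As a concrete illustration of the abstract's setup, here is a minimal sketch (not the authors' implementation) of LAD estimation for a single-component 2-dimensional sinusoidal model, assuming the usual parameterization y(m, n) = A cos(lam*m + mu*n) + B sin(lam*m + mu*n) + noise and a good initial guess for the frequencies:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical single-component 2-D sinusoidal model with heavy-tailed
# (Laplace) noise; the LAD estimator minimizes the sum of absolute residuals.
rng = np.random.default_rng(0)
M = N = 30
m, n = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
A0, B0, lam0, mu0 = 2.0, 1.0, 0.5, 1.2
signal = A0 * np.cos(lam0 * m + mu0 * n) + B0 * np.sin(lam0 * m + mu0 * n)
y = signal + rng.laplace(scale=0.5, size=signal.shape)

def lad_loss(theta):
    A, B, lam, mu = theta
    fit = A * np.cos(lam * m + mu * n) + B * np.sin(lam * m + mu * n)
    return np.abs(y - fit).sum()  # L1 (LAD) criterion

# The L1 surface is highly multimodal in (lam, mu), so we start near the
# truth; in practice a periodogram-type grid search would supply this.
res = minimize(lad_loss, x0=[1.8, 0.9, 0.49, 1.21], method="Nelder-Mead")
print(res.x)  # estimates of (A, B, lam, mu)
```

The derivative-free Nelder-Mead method is used because the L1 objective is not differentiable at exact fits; any robust local optimizer would serve the same illustrative purpose.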
Related papers
- Distributionally Robust Optimization as a Scalable Framework to Characterize Extreme Value Distributions [22.765095010254118]
The goal of this paper is to develop distributionally robust optimization (DRO) estimators, specifically for multidimensional Extreme Value Theory (EVT) statistics.
In order to mitigate over-conservative estimates while enhancing out-of-sample performance, we study DRO estimators informed by semi-parametric max-stable constraints in the space of point processes.
Both approaches are validated using synthetically generated data, recovering prescribed characteristics, and verifying the efficacy of the proposed techniques.
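Setting the paper's DRO machinery aside, a basic EVT building block it presumes, fitting a generalized extreme value (GEV) distribution to block maxima, can be sketched as:

```python
import numpy as np
from scipy.stats import genextreme

# Fit a GEV distribution to block maxima of synthetic data. Exponential
# maxima lie in the Gumbel domain of attraction, so the fitted shape
# parameter should be close to zero.
rng = np.random.default_rng(3)
data = rng.exponential(size=(1000, 50))  # 1000 blocks of 50 observations
block_max = data.max(axis=1)

shape, loc, scale = genextreme.fit(block_max)
print(shape, loc, scale)
```

Note that scipy's `genextreme` shape parameter uses the opposite sign convention to the common EVT parameter xi; shape = 0 is the Gumbel case in both.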
arXiv Detail & Related papers (2024-07-31T19:45:27Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Robust Rayleigh Regression Method for SAR Image Processing in Presence
of Outliers [0.0]
The presence of outliers (anomalous values) in synthetic aperture radar (SAR) data may result in inaccurate inferences.
This paper aims at obtaining Rayleigh regression model parameter estimators robust to the presence of outliers.
arXiv Detail & Related papers (2022-07-29T23:03:45Z) - Non-Iterative Recovery from Nonlinear Observations using Generative
Models [14.772379476524407]
We assume that the signal lies in the range of an $L$-Lipschitz continuous generative model with bounded $k$-dimensional inputs.
Our reconstruction method is non-iterative (though approximating the projection step may use an iterative procedure) and highly efficient.
arXiv Detail & Related papers (2022-05-31T12:34:40Z) - Continuous Wasserstein-2 Barycenter Estimation without Minimax
Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
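For intuition (not the paper's algorithm, which targets the continuous multi-dimensional setting): in 1-D the Wasserstein-2 barycenter has a closed form, since its quantile function is the weighted average of the inputs' quantile functions:

```python
import numpy as np

# 1-D Wasserstein-2 barycenter of two empirical measures via quantile
# averaging. For N(-2, 1) and N(3, 2) with equal weights, the barycenter
# is N(0.5, 1.5).
rng = np.random.default_rng(1)
samples = [rng.normal(-2.0, 1.0, 5000), rng.normal(3.0, 2.0, 5000)]
weights = [0.5, 0.5]

qs = np.linspace(0.005, 0.995, 199)           # shared quantile grid
quantiles = [np.quantile(s, qs) for s in samples]
bary_q = sum(w * q for w, q in zip(weights, quantiles))

print(bary_q[len(qs) // 2])  # barycenter median, approximately 0.5
```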
arXiv Detail & Related papers (2021-02-02T21:01:13Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
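The averaging idea behind the IPRW distance can be sketched for one-dimensional subspaces, where it reduces to the well-known sliced Wasserstein-2 distance (a sketch, not the paper's estimator):

```python
import numpy as np

# Average the squared 1-D W2 distance between random 1-D projections of
# two point clouds, rather than maximizing over subspaces.
rng = np.random.default_rng(4)
X = rng.normal(0.0, 1.0, size=(2000, 5))
Y = rng.normal(0.5, 1.0, size=(2000, 5))  # mean-shifted copy

def w2_1d(a, b):
    # W2 between empirical 1-D measures with equal sample sizes:
    # match sorted samples (the optimal monotone coupling).
    a, b = np.sort(a), np.sort(b)
    return np.sqrt(np.mean((a - b) ** 2))

dirs = rng.normal(size=(200, 5))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # random unit vectors
iprw = np.sqrt(np.mean([w2_1d(X @ u, Y @ u) ** 2 for u in dirs]))
print(iprw)  # roughly 0.5 for this mean shift
```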
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - Deep Dimension Reduction for Supervised Representation Learning [51.10448064423656]
We propose a deep dimension reduction approach to learning representations with essential characteristics.
The proposed approach is a nonparametric generalization of the sufficient dimension reduction method.
We show that the estimated deep nonparametric representation is consistent in the sense that its excess risk converges to zero.
arXiv Detail & Related papers (2020-06-10T14:47:43Z) - Path Sample-Analytic Gradient Estimators for Stochastic Binary Networks [78.76880041670904]
In neural networks with binary activations and/or binary weights, training by gradient descent is complicated.
We propose a new method for this estimation problem combining sampling and analytic approximation steps.
We experimentally show higher accuracy in gradient estimation and demonstrate a more stable and better performing training in deep convolutional models.
arXiv Detail & Related papers (2020-06-04T21:51:21Z) - Asymptotic Analysis of an Ensemble of Randomly Projected Linear
Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z) - Bayesian System ID: Optimal management of parameter, model, and
measurement uncertainty [0.0]
We evaluate the robustness of a probabilistic formulation of system identification (ID) to sparse, noisy, and indirect data.
We show that the log posterior has improved geometric properties compared with the objective function surfaces of traditional methods.
arXiv Detail & Related papers (2020-03-04T22:48:30Z) - Statistical Inference for Model Parameters in Stochastic Gradient
Descent [45.29532403359099]
Stochastic gradient descent (SGD) has been widely used in statistical estimation for large-scale data due to its computational and memory efficiency.
We investigate the problem of statistical inference of true model parameters based on SGD when the population loss function is strongly convex and satisfies certain conditions.
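A standard construction behind SGD-based inference, Polyak-Ruppert averaging of the iterates, can be sketched for linear regression (a minimal illustration, not the paper's exact procedure):

```python
import numpy as np

# Averaged SGD for linear regression: the running average of the iterates
# is asymptotically normal around the true coefficients, which is what
# makes SGD-based statistical inference possible.
rng = np.random.default_rng(2)
d, T = 3, 100_000
beta_true = np.array([1.0, -2.0, 0.5])

theta = np.zeros(d)
theta_bar = np.zeros(d)
for t in range(1, T + 1):
    x = rng.normal(size=d)
    y = x @ beta_true + rng.normal()
    eta = 0.1 * t ** -0.6                 # slowly decaying step size
    theta -= eta * (x @ theta - y) * x    # stochastic gradient step
    theta_bar += (theta - theta_bar) / t  # running average of iterates

print(theta_bar)  # close to beta_true
```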
arXiv Detail & Related papers (2016-10-27T07:04:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.