Uniform Consistency in Nonparametric Mixture Models
- URL: http://arxiv.org/abs/2108.14003v1
- Date: Tue, 31 Aug 2021 17:53:52 GMT
- Title: Uniform Consistency in Nonparametric Mixture Models
- Authors: Bryon Aragam and Ruiyi Yang
- Abstract summary: We study uniform consistency in nonparametric mixture models and mixed regression models.
In the case of mixed regression, we prove $L^1$ convergence of the regression functions while allowing for the component regression functions to intersect arbitrarily often.
- Score: 12.382836502781258
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study uniform consistency in nonparametric mixture models as well as
closely related mixture of regression (also known as mixed regression) models,
where the regression functions are allowed to be nonparametric and the error
distributions are assumed to be convolutions of a Gaussian density. We
construct uniformly consistent estimators under general conditions while
simultaneously highlighting several pain points in extending existing pointwise
consistency results to uniform results. The resulting analysis turns out to be
nontrivial, and several novel technical tools are developed along the way. In
the case of mixed regression, we prove $L^1$ convergence of the regression
functions while allowing for the component regression functions to intersect
arbitrarily often, which presents additional technical challenges. We also
consider generalizations to general (i.e. non-convolutional) nonparametric
mixtures.
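As a toy illustration of the mixed regression setting (two component regression functions that intersect, Gaussian errors), the sketch below simulates data and fits a two-component mixed linear regression by EM. This is an illustrative stand-in, not the paper's nonparametric estimator; the slopes, noise level, and initialization are arbitrary choices.

```python
import numpy as np

# Simulate a two-component mixed regression: each point follows one of
# two regression lines (slopes +2 and -2) plus Gaussian noise.
rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(-1, 1, n)
z = rng.integers(0, 2, n)                  # latent component labels
f = np.where(z == 0, 2.0 * x, -2.0 * x)    # component regression functions
y = f + 0.1 * rng.normal(size=n)           # Gaussian errors

# EM for two linear components y = b_k * x + noise (equal mixing weights).
b = np.array([1.0, -1.0])                  # initial slope guesses
sigma = 0.5
for _ in range(100):
    # E-step: posterior responsibilities under Gaussian errors
    resid = y[:, None] - x[:, None] * b[None, :]
    logp = -0.5 * (resid / sigma) ** 2
    w = np.exp(logp - logp.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # M-step: weighted least squares per component, then noise scale
    b = (w * x[:, None] * y[:, None]).sum(0) / (w * x[:, None] ** 2).sum(0)
    sigma = np.sqrt((w * resid ** 2).sum() / n)

print(np.sort(b))  # slopes close to (-2, 2)
```

Because the two lines cross at the origin, points near the intersection carry ambiguous responsibilities; this is a mild, parametric version of the intersection issue the paper addresses for nonparametric components.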
Related papers
- Transition of $α$-mixing in Random Iterations with Applications in Queuing Theory [0.0]
We show the transfer of mixing properties from the exogenous regressor to the response via coupling arguments.
We also study Markov chains in random environments with drift and minorization conditions, even under non-stationary environments.
arXiv Detail & Related papers (2024-10-07T14:13:37Z) - Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - Unveiling the Cycloid Trajectory of EM Iterations in Mixed Linear Regression [5.883916678819683]
We study the trajectory of iterations and the convergence rates of the Expectation-Maximization (EM) algorithm for two-component Mixed Linear Regression (2MLR).
Recent results have established the super-linear convergence of EM for 2MLR in the noiseless and high SNR settings.
arXiv Detail & Related papers (2024-05-28T14:46:20Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios where the response variable, denoted by $Y$, resides in a manifold and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
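A minimal sketch of the underlying conformal idea, using a Euclidean response as a simplified stand-in for the paper's manifold-valued setting (the model, sample sizes, and target level below are arbitrary illustrative choices):

```python
import numpy as np

# Split conformal prediction for 1-D regression: fit on one half,
# calibrate residual scores on the other, then form intervals with
# finite-sample marginal coverage.
rng = np.random.default_rng(4)
n = 1000
x = rng.uniform(-1, 1, n)
y = x ** 2 + 0.1 * rng.normal(size=n)

fit = np.arange(n) < n // 2
cal = ~fit
coef = np.polyfit(x[fit], y[fit], 2)                 # fit on first half
scores = np.abs(y[cal] - np.polyval(coef, x[cal]))   # calibration scores
q = np.quantile(scores, 0.9 * (1 + 1 / cal.sum()))   # conformal quantile

# Empirical coverage of the band (prediction +/- q) on fresh data
x_new = rng.uniform(-1, 1, 500)
y_new = x_new ** 2 + 0.1 * rng.normal(size=500)
cover = np.mean(np.abs(y_new - np.polyval(coef, x_new)) <= q)
print(round(float(cover), 2))  # close to the nominal 0.90
```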
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - Strong identifiability and parameter learning in regression with
heterogeneous response [5.503319042839695]
We investigate conditions of strong identifiability, rates of convergence for conditional density and parameter estimation, and the Bayesian posterior contraction behavior arising in finite mixture of regression models.
We provide simulation studies and data illustrations, which shed some light on the parameter learning behavior found in several popular regression mixture models reported in the literature.
arXiv Detail & Related papers (2022-12-08T05:58:13Z) - On Learning Mixture Models with Sparse Parameters [44.3425205248937]
We study mixtures with high dimensional sparse latent parameter vectors and consider the problem of support recovery of those vectors.
We provide efficient algorithms for support recovery that have a logarithmic sample complexity dependence on the dimensionality of the latent space.
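The support-recovery idea can be illustrated with a deliberately simple toy (not the paper's algorithm, and carrying none of its guarantees): a symmetric two-component Gaussian mixture whose mean vector is sparse, where a spectral split followed by coordinate-wise thresholding recovers the support. The dimension, sparsity, and threshold below are arbitrary choices.

```python
import numpy as np

# Two-component mixture N(mu, I) and N(-mu, I) with a 3-sparse mean mu.
rng = np.random.default_rng(1)
d, n = 50, 400
mu = np.zeros(d)
support = [0, 3, 7]                      # true sparse support
mu[support] = 3.0
X = np.vstack([rng.normal(mu, 1.0, (n, d)),
               rng.normal(-mu, 1.0, (n, d))])

# Split samples by the sign of their projection onto the leading
# principal direction, then threshold one component's coordinate means.
u = np.linalg.svd(X - X.mean(0), full_matrices=False)[2][0]
labels = (X @ u > 0).astype(int)
mean0 = X[labels == 0].mean(0)
est_support = np.where(np.abs(mean0) > 1.5)[0]
print(est_support.tolist())              # expect [0, 3, 7]
```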
arXiv Detail & Related papers (2022-02-24T07:44:23Z) - On the Double Descent of Random Features Models Trained with SGD [78.0918823643911]
We study properties of random features (RF) regression in high dimensions optimized by stochastic gradient descent (SGD).
We derive precise non-asymptotic error bounds for RF regression under both constant and adaptive step-size SGD settings.
We observe the double descent phenomenon both theoretically and empirically.
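For concreteness, the following sketch trains a random features regression with constant step-size SGD; the feature map, widths, and step size are arbitrary illustrative assumptions, and no double descent claim is made here.

```python
import numpy as np

# Random features (RF) regression: fix a random ReLU feature map,
# then train the linear output layer with constant step-size SGD.
rng = np.random.default_rng(5)
n, d, p = 500, 10, 200
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

W = rng.normal(size=(d, p)) / np.sqrt(d)   # fixed random weights
Phi = np.maximum(X @ W, 0.0)               # ReLU random features

theta = np.zeros(p)
lr = 0.005                                 # constant step size
for epoch in range(50):
    for i in rng.permutation(n):           # one SGD pass per epoch
        grad = (Phi[i] @ theta - y[i]) * Phi[i]
        theta -= lr * grad

mse = float(np.mean((Phi @ theta - y) ** 2))
print(round(mse, 3))
```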
arXiv Detail & Related papers (2021-10-13T17:47:39Z) - Nonlinear Independent Component Analysis for Continuous-Time Signals [85.59763606620938]
We study the classical problem of recovering a multidimensional source process from observations of mixtures of this process.
We show that this recovery is possible for many popular models of processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function.
arXiv Detail & Related papers (2021-02-04T20:28:44Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of 'invariance under coarse-graining'.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Kernel Methods for Causal Functions: Dose, Heterogeneous, and
Incremental Response Curves [26.880628841819004]
We prove uniform consistency with improved finite sample rates via original analysis of generalized kernel ridge regression.
We extend our main results to counterfactual distributions and to causal functions identified by front and back door criteria.
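As a generic illustration of the underlying tool, the sketch below estimates a dose-response curve with plain kernel ridge regression (not the paper's generalized estimator; the RBF bandwidth and ridge penalty are arbitrary assumptions):

```python
import numpy as np

# Kernel ridge regression estimate of a dose-response curve E[Y | D = d].
rng = np.random.default_rng(2)
n = 200
d = rng.uniform(0, 1, n)                  # treatment "dose"
y = np.sin(2 * np.pi * d) + 0.1 * rng.normal(size=n)

def rbf(a, b, gamma=20.0):
    # Gaussian (RBF) kernel matrix between two sets of scalar doses
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

K = rbf(d, d)
alpha = np.linalg.solve(K + 1e-3 * n * np.eye(n), y)  # ridge coefficients

grid = np.linspace(0.1, 0.9, 5)
curve = rbf(grid, d) @ alpha              # estimated curve on a dose grid
print(np.round(curve, 2))
```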
arXiv Detail & Related papers (2020-10-10T00:53:11Z) - Consistent Estimation of Identifiable Nonparametric Mixture Models from
Grouped Observations [84.81435917024983]
This work proposes an algorithm that consistently estimates any identifiable mixture model from grouped observations.
A practical implementation is provided for paired observations, and the approach is shown to outperform existing methods.
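A minimal sketch of why grouping helps, under the assumption that both observations in a pair come from the same component (a toy, not the proposed algorithm): averaging within a pair halves the noise variance before the components are separated.

```python
import numpy as np

# Paired observations that share a latent component (means -1 and +1).
rng = np.random.default_rng(3)
m = 500                                   # number of pairs
z = rng.integers(0, 2, m)                 # latent component per pair
means = np.array([-1.0, 1.0])
pairs = rng.normal(means[z][:, None], 1.0, (m, 2))

pair_avg = pairs.mean(axis=1)             # within-pair average: variance halved
labels = (pair_avg > 0).astype(int)       # split pairs by sign of the average
est = np.array([pairs[labels == k].mean() for k in (0, 1)])
print(np.round(est, 2))                   # roughly [-1, 1]
```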
arXiv Detail & Related papers (2020-06-12T20:44:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.