Riemannian Optimization for Variance Estimation in Linear Mixed Models
- URL: http://arxiv.org/abs/2212.09081v1
- Date: Sun, 18 Dec 2022 13:08:45 GMT
- Title: Riemannian Optimization for Variance Estimation in Linear Mixed Models
- Authors: Lena Sembach, Jan Pablo Burgard, Volker H. Schulz
- Abstract summary: We take a completely novel view on parameter estimation in linear mixed models by exploiting the intrinsic geometry of the parameter space.
Our approach yields a higher quality of the variance parameter estimates compared to existing approaches.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Variance parameter estimation in linear mixed models is a challenge for many
classical nonlinear optimization algorithms due to the positive-definiteness
constraint of the random effects covariance matrix. We take a completely novel
view on parameter estimation in linear mixed models by exploiting the intrinsic
geometry of the parameter space. We formulate the problem of residual maximum
likelihood estimation as an optimization problem on a Riemannian manifold.
Based on the introduced formulation, we derive geometric higher-order information
on the problem via the Riemannian gradient and the Riemannian Hessian, and we
then test our approach numerically with Riemannian optimization algorithms.
Our approach yields a higher quality of the variance parameter estimates
compared to existing approaches.
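To make the formulation concrete, the following is a minimal sketch, not the authors' implementation, of REML estimation over the manifold of symmetric positive definite (SPD) matrices using the pymanopt library. It assumes the pymanopt >= 2.x API with the autograd backend, a single-cluster toy model y = X beta + Z u + eps with u ~ N(0, G), and it holds the residual variance sigma2 fixed for brevity, whereas the paper estimates all variance parameters.
```python
# Minimal sketch (not the paper's code): REML for a toy linear mixed model,
# optimizing the random-effects covariance G over the SPD manifold so the
# positive-definiteness constraint holds by construction.
# Assumes pymanopt >= 2.x and autograd; sigma2 is held fixed for brevity.
import numpy as np
import autograd.numpy as anp
import pymanopt
from pymanopt.manifolds import SymmetricPositiveDefinite
from pymanopt.optimizers import TrustRegions

# Toy data: y = X beta + Z u + eps, u ~ N(0, G_true), eps ~ N(0, sigma2 I).
rng = np.random.default_rng(0)
n, p, q = 200, 3, 2
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, q))
G_true = np.array([[1.0, 0.3], [0.3, 0.5]])
sigma2 = 0.25
u = rng.multivariate_normal(np.zeros(q), G_true)
y = X @ rng.standard_normal(p) + Z @ u + np.sqrt(sigma2) * rng.standard_normal(n)

manifold = SymmetricPositiveDefinite(q)

@pymanopt.function.autograd(manifold)
def cost(G):
    # Negative REML log-likelihood (up to an additive constant):
    #   1/2 * ( log|V| + log|X' V^{-1} X| + y' P y ),
    # with V = Z G Z' + sigma2 I and
    # P = V^{-1} - V^{-1} X (X' V^{-1} X)^{-1} X' V^{-1}.
    V = Z @ G @ Z.T + sigma2 * anp.eye(n)
    V_inv = anp.linalg.inv(V)  # fine at toy scale; solve-based code scales better
    XtVinvX = X.T @ V_inv @ X
    P = V_inv - V_inv @ X @ anp.linalg.inv(XtVinvX) @ X.T @ V_inv
    _, logdet_V = anp.linalg.slogdet(V)
    _, logdet_XVX = anp.linalg.slogdet(XtVinvX)
    return 0.5 * (logdet_V + logdet_XVX + y @ P @ y)

# Riemannian trust-region method; pymanopt obtains the Riemannian gradient
# and Hessian from the autograd cost, playing the role of the higher-order
# geometric information discussed in the abstract.
problem = pymanopt.Problem(manifold, cost)
result = TrustRegions().run(problem)
print(result.point)  # estimated G, SPD by construction
```
Estimating sigma2 jointly would amount to optimizing over a product of this manifold with the positive reals.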
Related papers
- Symmetry-informed transferability of optimal parameters in the Quantum Approximate Optimization Algorithm [0.0]
We show how to translate an arbitrary set of optimal parameters into an adequate domain using the symmetries.
We extend these results to general classical optimization problems described by the Ising Hamiltonian variational ansatz for relevant physical models.
arXiv Detail & Related papers (2024-07-05T13:37:53Z) - Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models; a generic sketch of an SPD-preserving update of this kind appears after this list.
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Stochastic Mirror Descent for Large-Scale Sparse Recovery [13.500750042707407]
We discuss an application of quadratic approximation to the statistical estimation of high-dimensional sparse parameters.
We show that the proposed algorithm attains the optimal convergence of the estimation error under weak assumptions on the regressor distribution.
arXiv Detail & Related papers (2022-10-23T23:23:23Z) - Noise Estimation in Gaussian Process Regression [1.5002438468152661]
The presented method can be used to estimate the variance of the correlated error and the variance of the noise by maximizing a marginal likelihood function.
We demonstrate the computational advantages and robustness of the presented approach compared to traditional parameter optimization.
arXiv Detail & Related papers (2022-06-20T19:36:03Z) - First-Order Algorithms for Min-Max Optimization in Geodesic Metric Spaces [93.35384756718868]
Min-max algorithms have been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z) - On Riemannian Approach for Constrained Optimization Model in Extreme Classification Problems [2.7436792484073638]
A constrained optimization problem is formulated as an optimization problem on a matrix manifold.
The proposed approach is tested on several real-world, large-scale multi-label datasets.
arXiv Detail & Related papers (2021-09-30T11:28:35Z) - Implicit differentiation for fast hyperparameter selection in non-smooth convex learning [87.60600646105696]
We study first-order methods when the inner optimization problem is convex but non-smooth.
We show that the forward-mode differentiation of proximal gradient descent and proximal coordinate descent yield sequences of Jacobians converging toward the exact Jacobian.
arXiv Detail & Related papers (2021-05-04T17:31:28Z) - Support recovery and sup-norm convergence rates for sparse pivotal estimation [79.13844065776928]
In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level.
We show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multitask square-root Lasso-type estimators.
arXiv Detail & Related papers (2020-01-15T16:11:04Z) - Geometry, Computation, and Optimality in Stochastic Optimization [24.154336772159745]
We study computational and statistical consequences of problem geometry in stochastic and online optimization.
By focusing on constraint set and gradient geometry, we characterize the problem families for which stochastic- and adaptive-gradient methods are (minimax) optimal.
arXiv Detail & Related papers (2019-09-23T16:14:26Z)
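As referenced in the Manifold Gaussian Variational Bayes entry above, here is a generic sketch, not that paper's MGVBP update, of one Riemannian gradient step under the affine-invariant metric. It illustrates how covariance (or precision) updates can stay positive definite by construction; the objective, step size, and function names are illustrative and assume only numpy and scipy.
```python
# Generic sketch: one Riemannian gradient step on the SPD manifold under the
# affine-invariant metric. The matrix exponential keeps the iterate SPD.
import numpy as np
from scipy.linalg import expm, sqrtm

def spd_gradient_step(Sigma, euclidean_grad, step_size=0.1):
    """Move Sigma along the geodesic in the steepest-descent direction."""
    sym_grad = 0.5 * (euclidean_grad + euclidean_grad.T)
    # Riemannian gradient under the affine-invariant metric: Sigma G Sigma.
    riem_grad = Sigma @ sym_grad @ Sigma
    S = np.real(sqrtm(Sigma))            # Sigma^{1/2}
    S_inv = np.linalg.inv(S)
    # Exponential map: Exp_Sigma(-t * riem_grad).
    inner = S_inv @ (-step_size * riem_grad) @ S_inv
    return S @ expm(inner) @ S

# Toy usage: descend f(Sigma) = trace(Sigma) - log det(Sigma), whose
# minimizer is the identity matrix.
Sigma = np.diag([4.0, 0.2])
for _ in range(50):
    grad = np.eye(2) - np.linalg.inv(Sigma)  # Euclidean gradient of f
    Sigma = spd_gradient_step(Sigma, grad, step_size=0.2)
print(np.round(Sigma, 3))                    # approaches the identity
```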
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.