A Sparsity Inducing Nuclear-Norm Estimator (SpINNEr) for Matrix-Variate
Regression in Brain Connectivity Analysis
- URL: http://arxiv.org/abs/2001.11548v1
- Date: Thu, 30 Jan 2020 20:10:53 GMT
- Title: A Sparsity Inducing Nuclear-Norm Estimator (SpINNEr) for Matrix-Variate
Regression in Brain Connectivity Analysis
- Authors: Damian Brzyski, Xixi Hu, Joaquin Goni, Beau Ances, Timothy W.
Randolph, Jaroslaw Harezlak
- Abstract summary: In medical applications, regressors often take the form of multi-dimensional arrays.
We present an alternative approach - regularized matrix regression.
SpINNEr is applied to investigate associations between HIV-related outcomes and functional connectivity in the human brain.
- Score: 0.13821857257153802
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Classical scalar-response regression methods treat covariates as a vector and
estimate a corresponding vector of regression coefficients. In medical
applications, however, regressors often take the form of multi-dimensional
arrays. For example, one may be interested in using MRI to identify
which brain regions are associated with a health outcome. Vectorizing the
two-dimensional image arrays is an unsatisfactory approach since it destroys
the inherent spatial structure of the images and can be computationally
challenging. We present an alternative approach - regularized matrix regression
- in which the matrix of regression coefficients is defined as the solution to
a specific optimization problem. The method, called SParsity Inducing Nuclear
Norm EstimatoR (SpINNEr), simultaneously imposes two penalty types on the
regression coefficient matrix---the nuclear norm and the lasso norm---to
encourage a low rank matrix solution that also has entry-wise sparsity. A
specific implementation of the alternating direction method of multipliers
(ADMM) is used to build a fast and efficient numerical solver. Our simulations
show that SpINNEr outperforms other methods in estimation accuracy when the
response-related entries (representing the brain's functional connectivity) are
arranged in well-connected communities. SpINNEr is applied to investigate
associations between HIV-related outcomes and functional connectivity in the
human brain.
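The objective couples a squared-error fit with the nuclear-norm and lasso penalties, and an ADMM solver alternates between the proximal operators of the two norms. The sketch below shows those two proximal maps only; it is an illustration, not the authors' implementation, and the helper names and the threshold parameter `tau` are assumptions.

```python
import numpy as np

def prox_nuclear(B, tau):
    """Singular-value soft-thresholding: proximal map of tau * ||.||_*.

    Shrinking the singular values toward zero encourages a low-rank solution.
    """
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)   # shrink singular values; small ones vanish
    return (U * s_thr) @ Vt

def prox_lasso(B, tau):
    """Element-wise soft-thresholding: proximal map of tau * ||.||_1.

    Zeroing small entries encourages entry-wise sparsity.
    """
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

# Toy example: apply both maps to a noisy symmetric "connectivity" matrix.
rng = np.random.default_rng(0)
B = rng.normal(size=(6, 6))
B = (B + B.T) / 2                      # functional connectivity matrices are symmetric
low_rank = prox_nuclear(B, tau=1.0)    # fewer nonzero singular values
sparse = prox_lasso(B, tau=0.5)        # many entries exactly zero
print(np.linalg.matrix_rank(low_rank), np.mean(sparse == 0.0))
```

In a full ADMM iteration these two updates would be tied together through splitting variables and dual updates so that a single coefficient matrix is simultaneously low rank and sparse.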
Related papers
- Statistical Inference For Noisy Matrix Completion Incorporating Auxiliary Information [3.9748528039819977]
This paper investigates statistical inference for noisy matrix completion in a semi-supervised model.
We apply an iterative least squares (LS) estimation approach in this context.
We show that our method only needs a few iterations, and the resulting entry-wise estimators of the low-rank matrix and the coefficient matrix are guaranteed to have normal distributions.
arXiv Detail & Related papers (2024-03-22T01:06:36Z) - Neural incomplete factorization: learning preconditioners for the conjugate gradient method [2.899792823251184]
We develop a data-driven approach to accelerate the generation of effective preconditioners.
We replace the typically hand-engineered preconditioners by the output of graph neural networks.
Our method generates an incomplete factorization of the matrix and is therefore referred to as neural incomplete factorization (NeuralIF).
arXiv Detail & Related papers (2023-05-25T11:45:46Z) - Classification of BCI-EEG based on augmented covariance matrix [0.0]
We propose a new framework based on the augmented covariance extracted from an autoregressive model to improve motor imagery classification.
We test our approach on several datasets and subjects using the MOABB framework.
arXiv Detail & Related papers (2023-02-09T09:04:25Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Memory-Efficient Backpropagation through Large Linear Layers [107.20037639738433]
In modern neural networks like Transformers, linear layers require significant memory to store activations during backward pass.
This study proposes a memory reduction approach to perform backpropagation through linear layers.
arXiv Detail & Related papers (2022-01-31T13:02:41Z) - Nonparametric Trace Regression in High Dimensions via Sign Series
Representation [13.37650464374017]
We develop a framework for nonparametric trace regression models via structured sign series representations of high dimensional functions.
In the context of matrix completion, our framework leads to a substantially richer model based on what we coin the "sign rank" of a matrix.
arXiv Detail & Related papers (2021-05-04T22:20:00Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
By "weakly" we mean that the labels are known for some training points, unknown for others, and uncertain for the rest due to random noise or other causes such as a lack of resources.
In the numerical section, we apply the proposed method to artificial and real datasets using Monte Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - Adversarially-Trained Nonnegative Matrix Factorization [77.34726150561087]
We consider an adversarially-trained version of the nonnegative matrix factorization.
In our formulation, an attacker adds an arbitrary matrix of bounded norm to the given data matrix.
We design efficient algorithms inspired by adversarial training to optimize for dictionary and coefficient matrices.
arXiv Detail & Related papers (2021-04-10T13:13:17Z) - Estimation, Confidence Intervals, and Large-Scale Hypotheses Testing for
High-Dimensional Mixed Linear Regression [9.815103550891463]
This paper studies the high-dimensional mixed linear regression (MLR) where the output variable comes from one of the two linear regression models with an unknown mixing proportion.
We propose an iterative procedure for estimating the two regression vectors and establish their rates of convergence.
A large-scale multiple testing procedure is proposed for testing the regression coefficients and is shown to control the false discovery rate (FDR) asymptotically.
arXiv Detail & Related papers (2020-11-06T21:17:41Z) - Understanding Implicit Regularization in Over-Parameterized Single Index
Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z) - Provably Efficient Neural Estimation of Structural Equation Model: An
Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.