Tensor Generalized Canonical Correlation Analysis
- URL: http://arxiv.org/abs/2302.05277v1
- Date: Fri, 10 Feb 2023 14:41:12 GMT
- Title: Tensor Generalized Canonical Correlation Analysis
- Authors: Fabien Girka, Arnaud Gloaguen, Laurent Le Brusquet, Violetta Zujovic,
Arthur Tenenhaus
- Abstract summary: Regularized Generalized Canonical Correlation Analysis (RGCCA) is a general statistical framework for multi-block data analysis.
This paper presents TGCCA, a new method for analyzing higher-order tensors whose canonical vectors admit a rank-R CP decomposition.
The efficiency and usefulness of TGCCA are evaluated on simulated and real data and compared favorably to state-of-the-art approaches.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Regularized Generalized Canonical Correlation Analysis (RGCCA) is a general
statistical framework for multi-block data analysis. RGCCA enables deciphering
relationships between several sets of variables and subsumes many well-known
multivariate analysis methods as special cases. However, RGCCA only deals with
vector-valued blocks, disregarding their possible higher-order structures. This
paper presents Tensor GCCA (TGCCA), a new method for analyzing higher-order
tensors with canonical vectors admitting an orthogonal rank-R CP decomposition.
Moreover, two algorithms for TGCCA, based on whether a separable covariance
structure is imposed or not, are presented along with convergence guarantees.
The efficiency and usefulness of TGCCA are evaluated on simulated and real data
and compared favorably to state-of-the-art approaches.
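The structural idea behind TGCCA can be illustrated with a short sketch. The paper's algorithms are not reproduced here; this only shows what a CP-constrained canonical vector looks like for a matrix-valued (order-2 tensor) block, with made-up dimensions and rank:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tensor block: n samples of a 10 x 8 matrix-valued variable.
n, p1, p2, R = 100, 10, 8, 2
X = rng.standard_normal((n, p1, p2))

# A CP-structured canonical vector: instead of p1*p2 free parameters,
# W = sum_r outer(u_r, v_r) uses only R*(p1 + p2) parameters.
U = rng.standard_normal((p1, R))
V = rng.standard_normal((p2, R))
W = sum(np.outer(U[:, r], V[:, r]) for r in range(R))  # p1 x p2

# The canonical variate for each sample is the inner product <X_i, W>.
z = np.einsum('ipq,pq->i', X, W)

print(W.shape, z.shape)  # (10, 8) (100,)
```

In TGCCA these factor matrices would be optimized (with an orthogonality constraint across the rank-1 terms) to maximize covariance-based criteria across blocks; here they are random, purely to show the parameterization.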
Related papers
- High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization [83.06112052443233]
This paper studies kernel ridge regression in high dimensions under covariate shifts.
By a bias-variance decomposition, we theoretically demonstrate that the re-weighting strategy allows for decreasing the variance.
For bias, we analyze the regularization of the arbitrary or well-chosen scale, showing that the bias can behave very differently under different regularization scales.
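The re-weighting strategy mentioned above can be sketched with a generic importance-weighted kernel ridge regression. This is not the paper's estimator; the Gaussian train/test densities, kernel bandwidth, and regularization level are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 1-D covariate shift: train x ~ N(0,1), test x ~ N(1,1).
n = 500
x = rng.standard_normal(n)
y = np.sin(x) + 0.1 * rng.standard_normal(n)

# Density ratio w(x) = p_test(x) / p_train(x); for two unit-variance
# Gaussians this is exp(x*mu - mu^2/2) with mu = 1.
mu = 1.0
w = np.exp(x * mu - mu**2 / 2)

# Importance-weighted kernel ridge regression with an RBF kernel:
# minimizing sum_i w_i (y_i - f(x_i))^2 + lam ||f||^2 gives
# alpha = (W K + lam I)^{-1} W y.
lam, gamma = 1e-2, 1.0
K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
Wd = np.diag(w)
alpha = np.linalg.solve(Wd @ K + lam * np.eye(n), w * y)

# Predict on the shifted test region.
x_test = np.linspace(0.0, 2.0, 50)
K_test = np.exp(-gamma * (x_test[:, None] - x[None, :]) ** 2)
y_hat = K_test @ alpha
```

The weights up-weight training points that are likely under the test distribution, which is the variance-reduction mechanism the summary refers to.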
arXiv Detail & Related papers (2024-06-05T12:03:27Z) - Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z) - Perfect Spectral Clustering with Discrete Covariates [68.8204255655161]
We propose a spectral algorithm that achieves perfect clustering with high probability on a class of large, sparse networks.
Our method is the first to offer a guarantee of consistent latent structure recovery using spectral clustering.
arXiv Detail & Related papers (2022-05-17T01:41:06Z) - Benign Overfitting of Constant-Stepsize SGD for Linear Regression [122.70478935214128]
Inductive biases are central to preventing overfitting empirically.
This work considers this issue in arguably the most basic setting: constant-stepsize SGD for linear regression.
We reflect on a number of notable differences between the algorithmic regularization afforded by (unregularized) SGD in comparison to ordinary least squares.
arXiv Detail & Related papers (2021-03-23T17:15:53Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Multilinear Common Component Analysis via Kronecker Product Representation [0.0]
We consider the problem of extracting a common structure from multiple tensor datasets.
We propose multilinear common component analysis (MCCA) based on Kronecker products of mode-wise covariance matrices.
We develop an estimation algorithm for MCCA that guarantees mode-wise global convergence.
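The Kronecker-product (separable) covariance structure underlying this entry can be sketched briefly. The estimator below uses simple mode-wise sample covariances, not the paper's MCCA algorithm, and the dimensions are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: n samples of a p1 x p2 matrix-valued observation.
n, p1, p2 = 200, 5, 4
X = rng.standard_normal((n, p1, p2))
Xc = X - X.mean(axis=0)  # center

# Mode-wise covariance estimates (row mode and column mode).
S1 = np.einsum('ipq,irq->pr', Xc, Xc) / (n * p2)  # p1 x p1
S2 = np.einsum('ipq,ipr->qr', Xc, Xc) / (n * p1)  # p2 x p2

# Separable covariance model for vec(X): Sigma ≈ S2 ⊗ S1, so the full
# (p1*p2) x (p1*p2) covariance is parameterized by two small matrices.
S = np.kron(S2, S1)
print(S.shape)  # (20, 20)
```

Imposing this structure is what makes mode-wise analysis tractable: the full covariance has p1*p2*(p1*p2+1)/2 free entries, while the separable model needs only the two mode-wise factors.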
arXiv Detail & Related papers (2020-09-06T10:03:17Z) - Approximation Algorithms for Sparse Principal Component Analysis [57.5357874512594]
Principal component analysis (PCA) is a widely used dimension reduction technique in machine learning and statistics.
Various approaches to obtain sparse principal direction loadings have been proposed, which are termed Sparse Principal Component Analysis.
We present thresholding as a provably accurate, polynomial-time approximation algorithm for the SPCA problem.
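A generic thresholding scheme for sparse PCA can be sketched as follows. This is an illustrative version, not necessarily the exact algorithm analyzed in the paper, and the covariance matrix is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic covariance with a sparse leading direction (k nonzeros).
p, k = 20, 5
v = np.zeros(p)
v[:k] = 1.0 / np.sqrt(k)
A = 5.0 * np.outer(v, v) + 0.1 * np.eye(p)

# Thresholding: take the top eigenvector of A, keep its k
# largest-magnitude entries, zero the rest, and renormalize.
_, vecs = np.linalg.eigh(A)
top = vecs[:, -1]
idx = np.argsort(np.abs(top))[-k:]
w = np.zeros(p)
w[idx] = top[idx]
w /= np.linalg.norm(w)

print(np.count_nonzero(w))  # 5
```

The appeal of thresholding is that it costs little more than one eigendecomposition, whereas exact SPCA is NP-hard.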
arXiv Detail & Related papers (2020-06-23T04:25:36Z) - Probabilistic Canonical Correlation Analysis for Sparse Count Data [3.1753001245931323]
Canonical correlation analysis is an important technique for exploring the relationship between two sets of continuous variables.
We propose a model-based probabilistic approach for correlation and canonical correlation estimation for two sparse count data sets.
arXiv Detail & Related papers (2020-05-11T02:19:57Z) - Sparse Generalized Canonical Correlation Analysis: Distributed Alternating Iteration based Approach [18.93565942407577]
Sparse canonical correlation analysis (CCA) is a useful statistical tool to detect latent information with sparse structures.
We propose a generalized canonical correlation analysis (GCCA) that can detect the latent relations of multiview data with sparse structures.
arXiv Detail & Related papers (2020-04-23T05:53:48Z) - An Inexact Manifold Augmented Lagrangian Method for Adaptive Sparse Canonical Correlation Analysis with Trace Lasso Regularization [1.2335698325757491]
Canonical correlation analysis (CCA) describes the relationship between two sets of variables.
In high-dimensional settings where the number of variables exceeds the sample size, or when the variables are highly correlated, traditional CCA is no longer appropriate.
An adaptive sparse version of CCA (ASCCA) is proposed by using the trace Lasso regularization.
arXiv Detail & Related papers (2020-03-20T10:57:01Z) - D-GCCA: Decomposition-based Generalized Canonical Correlation Analysis for Multi-view High-dimensional Data [11.184915338554422]
A popular model in high-dimensional multi-view data analysis decomposes each view's data matrix into a low-rank common-source matrix generated by latent factors common across all data views.
We propose a novel decomposition method for this model, called decomposition-based generalized canonical correlation analysis (D-GCCA)
Our D-GCCA goes one step further than generalized canonical correlation analysis by separating common and distinctive components among canonical variables.
arXiv Detail & Related papers (2020-01-09T06:35:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.