Machine-Learning Kronecker Coefficients
- URL: http://arxiv.org/abs/2306.04734v1
- Date: Wed, 7 Jun 2023 19:10:44 GMT
- Title: Machine-Learning Kronecker Coefficients
- Authors: Kyu-Hwan Lee
- Abstract summary: We show that standard machine-learning algorithms may be trained to predict whether a given Kronecker coefficient is zero or not.
Our results show that a trained machine can efficiently perform this binary classification with high accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Kronecker coefficients are the decomposition multiplicities of the tensor
product of two irreducible representations of the symmetric group. Unlike the
Littlewood--Richardson coefficients, which are the analogues for the general
linear group, there is no known combinatorial description of the Kronecker
coefficients, and it is an NP-hard problem to decide whether a given Kronecker
coefficient is zero or not. In this paper, we show that standard
machine-learning algorithms such as Nearest Neighbors, Convolutional Neural
Networks and Gradient Boosting Decision Trees may be trained to predict whether
a given Kronecker coefficient is zero or not. Our results show that a trained
machine can efficiently perform this binary classification with high accuracy
($\approx 0.98$).
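To make the classification task concrete, here is a minimal sketch in Python. It is not the paper's pipeline: it computes exact Kronecker coefficients for a small n via the Murnaghan-Nakayama character formula and trains a scikit-learn gradient-boosting classifier on zero-padded partition triples; the choice of n = 7, the feature encoding, and the default model settings are illustrative assumptions.

```python
# Sketch: label triples of partitions of n by whether their Kronecker coefficient
# vanishes, then fit a gradient-boosting classifier on the zero-padded parts.
# Small n only; the paper's data, features, and models may differ.
from fractions import Fraction
from functools import lru_cache
from math import factorial
from collections import Counter
from itertools import product

from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split


def partitions(n, max_part=None):
    """Yield all partitions of n as weakly decreasing tuples."""
    max_part = n if max_part is None else max_part
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest


@lru_cache(maxsize=None)
def chi(lam, rho):
    """Character chi^lam(rho) of the symmetric group via the Murnaghan-Nakayama rule."""
    lam = tuple(p for p in lam if p > 0)
    if not rho:
        return 1
    r, rest = rho[0], rho[1:]
    k = len(lam)
    beta = [lam[i] + (k - 1 - i) for i in range(k)]     # first-column hook lengths
    beta_set = set(beta)
    total = 0
    for i, b in enumerate(beta):
        nb = b - r
        if nb < 0 or nb in beta_set:
            continue                                    # no border strip of size r here
        height = sum(1 for c in beta if nb < c < b)     # strip height minus 1
        new_beta = sorted((nb if j == i else beta[j] for j in range(k)), reverse=True)
        new_lam = tuple(new_beta[j] - (k - 1 - j) for j in range(k))
        total += (-1) ** height * chi(new_lam, rest)
    return total


def z_rho(rho):
    """Centralizer order z_rho = prod_i i^{m_i} * m_i!."""
    out = 1
    for part, mult in Counter(rho).items():
        out *= part ** mult * factorial(mult)
    return out


def kronecker(lam, mu, nu):
    """g(lam, mu, nu) = sum over cycle types rho of chi^lam(rho) chi^mu(rho) chi^nu(rho) / z_rho."""
    n = sum(lam)
    total = Fraction(0)
    for rho in partitions(n):
        total += Fraction(chi(lam, rho) * chi(mu, rho) * chi(nu, rho), z_rho(rho))
    return int(total)


# Build a labelled dataset of all partition triples for one small n.
n = 7
parts = list(partitions(n))
X, y = [], []
for lam, mu, nu in product(parts, repeat=3):
    row = []
    for p in (lam, mu, nu):
        row.extend(list(p) + [0] * (n - len(p)))        # zero-pad each partition to length n
    X.append(row)
    y.append(int(kronecker(lam, mu, nu) > 0))           # 1 if the coefficient is nonzero

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

For the larger n studied in the paper, the character-sum formula above becomes expensive, and one would instead train on precomputed coefficients.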
Related papers
- Quantum Algorithms for Representation-Theoretic Multiplicities [0.0]
We give quantum algorithms for computing Kostka, Littlewood-Richardson, Plethysm and Kronecker coefficients.
We show that there is an efficient classical algorithm for computing the Littlewood-Richardson coefficients.
We conjecture that our quantum algorithms lead to superpolynomial speedups.
arXiv Detail & Related papers (2024-07-24T21:34:05Z)
- Limits to classification performance by relating Kullback-Leibler divergence to Cohen's Kappa [0.0]
Theory and methods are discussed in detail and then applied to Monte Carlo data and real datasets.
In all cases this analysis shows that the algorithms could not have performed any better due to the underlying probability density functions for the two classes.
arXiv Detail & Related papers (2024-03-03T17:36:42Z)
- Stochastic Optimization for Non-convex Problem with Inexact Hessian Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) methods and adaptive regularization with cubics (ARC) have proven to have very appealing theoretical properties.
We show that TR and ARC methods can simultaneously accommodate inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z)
- A remark on the quantum complexity of the Kronecker coefficients [1.6498361958317633]
We prove that the computation of the Kronecker coefficients of the symmetric group is contained in the complexity class #BQP.
This improves a recent result of Bravyi, Chowdhury, Gosset, Havlicek, and Zhu.
arXiv Detail & Related papers (2023-07-05T15:57:53Z)
- Quantum complexity of the Kronecker coefficients [2.2983894078587963]
We show that a given Kronecker coefficient counts the dimension of the vector space spanned by the accepting witnesses of a QMA verifier.
This implies that approximating the Kronecker coefficients to within a given relative error is not harder than a certain natural class of quantum approximate counting problems.
arXiv Detail & Related papers (2023-02-22T15:43:26Z)
- Understanding Interlocking Dynamics of Cooperative Rationalization [90.6863969334526]
Selective rationalization explains the prediction of complex neural networks by finding a small subset of the input that is sufficient to predict the neural model output.
We reveal a major problem with this cooperative rationalization paradigm: model interlocking.
We propose a new rationalization framework, called A2R, which introduces a third component into the architecture, a predictor driven by soft attention as opposed to selection.
arXiv Detail & Related papers (2021-10-26T17:39:18Z)
- Permutation Compressors for Provably Faster Distributed Nonconvex Optimization [68.8204255655161]
We show that the MARINA method of Gorbunov et al. (2021) can be considered a state-of-the-art method in terms of theoretical communication complexity.
We extend the theory of MARINA to support potentially correlated compressors, taking the method beyond the classical independent compressors setting.
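The summary does not spell out the compressor construction; the sketch below shows one natural permutation-based compressor consistent with the title, in which a shared random permutation assigns disjoint coordinate blocks to workers and rescales them so that the per-worker errors cancel in the average. Whether this matches the paper's exact construction is an assumption, and it is applied here to a single shared vector for illustration rather than to per-worker gradients.

```python
# Sketch of a permutation-based compressor (assumption: only loosely mirrors the
# paper's idea). A shared random permutation splits the d coordinates among
# n workers; each worker keeps its block scaled by n, so the average of the
# n compressed vectors equals the original vector exactly.
import numpy as np

def permutation_compress(x, n_workers, rng):
    d = x.size
    assert d % n_workers == 0, "sketch assumes d divisible by the number of workers"
    perm = rng.permutation(d)                    # permutation shared by all workers
    block = d // n_workers
    out = []
    for i in range(n_workers):
        idx = perm[i * block:(i + 1) * block]    # coordinates owned by worker i
        c = np.zeros(d)
        c[idx] = n_workers * x[idx]              # rescale so the average is unbiased
        out.append(c)
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=12)
compressed = permutation_compress(x, n_workers=4, rng=rng)
print(np.allclose(np.mean(compressed, axis=0), x))   # errors cancel exactly: True
```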
arXiv Detail & Related papers (2021-10-07T09:38:15Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
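As a rough illustration of the embedding itself, the sketch below computes a random-walk Laplacian spectral embedding of a synthetic two-community graph with heterogeneous degrees and clusters it with k-means; the graph model, K = 2, and parameter choices are assumptions for illustration, not the paper's experiments.

```python
# Sketch: spectral embedding from the random-walk Laplacian L_rw = I - D^{-1} A.
# The eigenvectors of D^{-1} A are D^{-1/2} v, where v are eigenvectors of the
# symmetric matrix D^{-1/2} A D^{-1/2}; we cluster the top-K of them.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n, K = 400, 2
labels = rng.integers(0, K, size=n)
theta = rng.uniform(0.2, 1.0, size=n)             # degree-correction weights
B = np.array([[0.9, 0.1], [0.1, 0.9]])            # block connection strengths
P = 0.2 * np.outer(theta, theta) * B[labels][:, labels]
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                    # simple undirected graph

deg = A.sum(axis=1)
keep = deg > 0                                     # drop isolated nodes
A, deg, labels = A[np.ix_(keep, keep)], deg[keep], labels[keep]

d_isqrt = 1.0 / np.sqrt(deg)
S = d_isqrt[:, None] * A * d_isqrt[None, :]        # D^{-1/2} A D^{-1/2}
vals, vecs = np.linalg.eigh(S)
top = vecs[:, np.argsort(vals)[-K:]]               # eigenvectors of the largest eigenvalues
embedding = d_isqrt[:, None] * top                 # map back to eigenvectors of D^{-1} A

pred = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(embedding)
agree = max(np.mean(pred == labels), np.mean(pred != labels))   # up to label swap
print("clustering agreement:", agree)
```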
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Dominant Z-Eigenpairs of Tensor Kronecker Products are Decoupled and Applications to Higher-Order Graph Matching [0.0]
We present a theorem that decouples the dominant eigenvectors of tensor Kronecker products.
We investigate low rank structure in the network alignment algorithm TAME.
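The theorem concerns Z-eigenpairs of tensors; the classical matrix analogue of the decoupling statement is easy to verify numerically, as in the sketch below (an illustration only, not the paper's result).

```python
# Numerical check of the classical matrix analogue: the dominant eigenvector of
# a Kronecker product A (x) B is the Kronecker product of the dominant
# eigenvectors of A and B.
import numpy as np

rng = np.random.default_rng(0)

def random_psd(k):
    M = rng.normal(size=(k, k))
    return M @ M.T                                  # symmetric PSD, distinct spectrum a.s.

def dominant_eigvec(M):
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, -1]                              # eigenvector of the largest eigenvalue

A, B = random_psd(4), random_psd(5)
x, y = dominant_eigvec(A), dominant_eigvec(B)
v = dominant_eigvec(np.kron(A, B))

# Up to sign, kron(x, y) and v coincide, so the absolute inner product is ~1.
print(abs(np.dot(np.kron(x, y), v)))
```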
arXiv Detail & Related papers (2020-11-17T18:55:48Z)
- High-Dimensional Quadratic Discriminant Analysis under Spiked Covariance Model [101.74172837046382]
We propose a novel quadratic classification technique, the parameters of which are chosen such that the Fisher discriminant ratio is maximized.
Numerical simulations show that the proposed classifier not only outperforms the classical R-QDA for both synthetic and real data but also requires lower computational complexity.
arXiv Detail & Related papers (2020-06-25T12:00:26Z)
- Robustly Learning any Clusterable Mixture of Gaussians [55.41573600814391]
We study the efficient learnability of high-dimensional Gaussian mixtures in the adversarially robust setting.
We provide an algorithm that learns the components of an $\epsilon$-corrupted $k$-mixture within information-theoretically near-optimal error $\tilde{O}(\epsilon)$.
Our main technical contribution is a new robust identifiability proof of clusters from a Gaussian mixture, which can be captured by the constant-degree Sum of Squares proof system.
arXiv Detail & Related papers (2020-05-13T16:44:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.