Checking the Sufficiently Scattered Condition using a Global Non-Convex
Optimization Software
- URL: http://arxiv.org/abs/2402.06019v1
- Date: Thu, 8 Feb 2024 19:41:38 GMT
- Title: Checking the Sufficiently Scattered Condition using a Global Non-Convex
Optimization Software
- Authors: Nicolas Gillis, Robert Luce
- Abstract summary: The sufficiently scattered condition (SSC) is a key condition in the identifiability of various matrix factorization problems.
We show that it can be checked in a reasonable amount of time in realistic scenarios.
- Score: 16.556378176193032
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The sufficiently scattered condition (SSC) is a key condition in the study of
identifiability of various matrix factorization problems, including
nonnegative, minimum-volume, symmetric, simplex-structured, and polytopic
matrix factorizations. The SSC allows one to guarantee that the computed matrix
factorization is unique/identifiable, up to trivial ambiguities. However, this
condition is NP-hard to check in general. In this paper, we show that it can
however be checked in a reasonable amount of time in realistic scenarios, when
the factorization rank is not too large. This is achieved by formulating the
problem as a non-convex quadratic optimization problem over a bounded set. We
use the global non-convex optimization software Gurobi, and showcase the
usefulness of this code on synthetic data sets and on real-world hyperspectral
images.
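The key part of the SSC can be phrased via duality: every point x in the dual cone {x : H^T x >= 0}, scaled so that e^T x = 1, must satisfy ||x||_2 <= 1, which the paper certifies globally by handing the non-convex QP max x^T x s.t. H^T x >= 0, e^T x = 1 to Gurobi. As a minimal illustration (not the paper's code, and under the assumption that this dual phrasing and the column-cone convention match the paper's setup), the sketch below probes the same quadratic program by random sampling; it can expose SSC violations but, unlike the global solver, never certifies that the condition holds.

```python
import numpy as np

def ssc_violation_search(H, n_samples=20000, seed=0):
    """Randomized search for a violation of the sufficiently scattered
    condition (SSC) for the cone generated by the columns of H (r x n).

    Samples random points of the dual cone {x : H^T x >= 0}, normalizes
    them to e^T x = 1, and tracks the largest squared Euclidean norm seen.
    A value above 1 is a certificate that the SSC fails; a value at most 1
    is only sampling evidence, not a proof that the SSC holds.
    """
    rng = np.random.default_rng(seed)
    r = H.shape[0]
    worst = 0.0
    for _ in range(n_samples):
        x = rng.standard_normal(r)
        if np.min(H.T @ x) < 0:   # outside the dual cone, discard
            continue
        s = x.sum()
        if s <= 1e-12:            # cannot normalize to e^T x = 1
            continue
        x = x / s
        worst = max(worst, float(x @ x))
    return worst

# cone(H) = nonnegative orthant: the boundary case, no violation possible
print(ssc_violation_search(np.eye(3)))
# a single-column cone is far too "thin": violations are found quickly
print(ssc_violation_search(np.array([[1.0], [0.0], [0.0]])))
```

Replacing the sampler with Gurobi's non-convex QP solver (parameter `NonConvex=2`) turns this heuristic into the global check described in the abstract, at the cost of exponential worst-case time in the factorization rank.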
Related papers
- Decentralized Sum-of-Nonconvex Optimization [42.04181488477227]
We consider the problem of minimizing a sum-of-nonconvex function, i.e., a convex objective that is the average of nonconvex components.
We propose an accelerated decentralized first-order algorithm by combining the techniques of gradient tracking and multi-consensus.
arXiv Detail & Related papers (2024-02-04T05:48:45Z) - On the Global Solution of Soft k-Means [159.23423824953412]
This paper presents an algorithm to solve the Soft k-Means problem globally.
A new model, named Minimal Volume Soft kMeans (MVSkM), is proposed to address the non-uniqueness issue of solutions.
arXiv Detail & Related papers (2022-12-07T12:06:55Z) - Exploring the Algorithm-Dependent Generalization of AUPRC Optimization
with List Stability [107.65337427333064]
Optimization of the Area Under the Precision-Recall Curve (AUPRC) is a crucial problem for machine learning.
In this work, we present the first trial in the algorithm-dependent generalization of AUPRC optimization.
Experiments on three image retrieval datasets speak to the effectiveness and soundness of our framework.
arXiv Detail & Related papers (2022-09-27T09:06:37Z) - Sparse Polynomial Optimization: Theory and Practice [5.27013884159732]
The book presents several efforts to tackle this challenge, with important scientific implications.
It provides alternative optimization schemes that scale well in terms of computational complexity.
We present sparsity-exploiting hierarchies of relaxations, for either unconstrained or constrained problems.
arXiv Detail & Related papers (2022-08-23T18:56:05Z) - Partial Counterfactual Identification from Observational and
Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z) - Sparse Quadratic Optimisation over the Stiefel Manifold with Application
to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z) - Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature
Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
arXiv Detail & Related papers (2020-12-29T04:08:38Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by the recent increased interest in the analysis of optimization algorithms for non-convex problems arising in deep neural networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - DAGs with No Fears: A Closer Look at Continuous Optimization for
Learning Bayesian Networks [45.3591788771536]
We re-examine a continuous optimization framework dubbed NOTEARS for learning Bayesian networks.
We show that the Karush-Kuhn-Tucker optimality conditions for the NOTEARS formulation cannot be satisfied except in a trivial case.
Some combinations with local search are both more accurate and more efficient than the original NOTEARS.
arXiv Detail & Related papers (2020-10-18T22:59:37Z) - Simplex-Structured Matrix Factorization: Sparsity-based Identifiability
and Provably Correct Algorithms [21.737226432466496]
We provide novel algorithms with identifiability guarantees for simplex-structured matrix factorization.
We illustrate the effectiveness of our approach on synthetic data sets and hyperspectral images.
arXiv Detail & Related papers (2020-07-22T14:01:58Z) - On the Sample Complexity and Optimization Landscape for Quadratic
Feasibility Problems [7.722592882475735]
We consider the problem of recovering a complex vector $\mathbf{x} \in \mathbb{C}^n$ from $m$ quadratic measurements $\{\langle A_i \mathbf{x}, \mathbf{x} \rangle\}_{i=1}^m$.
In general, not only is the quadratic feasibility problem NP-hard to solve, but it may in fact be unidentifiable.
arXiv Detail & Related papers (2020-02-04T00:35:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.