Learning Sparse Graphs via Majorization-Minimization for Smooth Node
Signals
- URL: http://arxiv.org/abs/2202.02815v1
- Date: Sun, 6 Feb 2022 17:06:13 GMT
- Title: Learning Sparse Graphs via Majorization-Minimization for Smooth Node
Signals
- Authors: Ghania Fatima, Aakash Arora, Prabhu Babu, and Petre Stoica
- Abstract summary: We propose an algorithm for learning a sparse weighted graph by estimating its adjacency matrix.
We show that the proposed algorithm converges faster, in terms of the average number of iterations, than several existing methods in the literature.
- Score: 8.140698535149042
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this letter, we propose an algorithm for learning a sparse weighted graph
by estimating its adjacency matrix under the assumption that the observed
signals vary smoothly over the nodes of the graph. The proposed algorithm is
based on the principle of majorization-minimization (MM), wherein we first
obtain a tight surrogate function for the graph learning objective and then
solve the resultant surrogate problem which has a simple closed form solution.
The proposed algorithm does not require tuning of any hyperparameter and it has
the desirable feature of eliminating the inactive variables in the course of
the iterations, which can help speed up the algorithm. The numerical
simulations conducted using both synthetic and real world (brain-network) data
show that the proposed algorithm converges faster, in terms of the average
number of iterations, than several existing methods in the literature.
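To make the MM recipe above concrete, the following Python sketch implements one standard way of carrying it out for a related smoothness-regularized graph-learning objective (a Kalofolias-style formulation with assumed hyperparameters alpha and beta); it is an illustrative approximation, not the paper's exact hyperparameter-free algorithm. The convex -log(degree) barrier is majorized via Jensen's inequality around the current iterate, giving a tight surrogate whose minimizer has a simple closed form per edge, and edges that reach exactly zero stay zero, loosely mirroring the elimination of inactive variables described in the abstract.

import numpy as np

def pairwise_sq_dists(X):
    # Z[i, j] = ||x_i - x_j||^2 for node signals X of shape (n_nodes, n_samples);
    # sum_ij W_ij * Z_ij equals twice the smoothness measure tr(X^T L X).
    sq = np.sum(X ** 2, axis=1)
    return np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)

def mm_graph_learning(X, alpha=1.0, beta=0.5, n_iter=200, tol=1e-8):
    # Assumed objective (not the paper's exact one):
    #   minimize over W >= 0, W = W^T, diag(W) = 0 of
    #   sum_ij W_ij Z_ij - alpha * sum_i log(sum_j W_ij) + beta * ||W||_F^2
    # MM step: upper-bound the -log(degree) term via Jensen's inequality at the
    # current iterate; the resulting surrogate separates over edges and its
    # minimizer is the positive root of a per-edge quadratic equation.
    n = X.shape[0]
    Z = pairwise_sq_dists(X)
    W = np.ones((n, n)) - np.eye(n)                 # strictly positive initialization
    for _ in range(n_iter):
        d = W.sum(axis=1)                           # node degrees at the current iterate
        C = alpha * W * (1.0 / d[:, None] + 1.0 / d[None, :])   # Jensen weights
        W_new = (-Z + np.sqrt(Z ** 2 + 4.0 * beta * C)) / (4.0 * beta)
        np.fill_diagonal(W_new, 0.0)                # no self-loops
        if np.max(np.abs(W_new - W)) < tol:         # stop once the iterates stabilize
            W = W_new
            break
        W = W_new
    return W

On synthetic data, e.g. W = mm_graph_learning(np.random.randn(20, 500)), node pairs with large squared distances Z_ij receive weights driven toward zero, while similar nodes keep larger edge weights.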
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we consider a robust phase retrieval problem in which the task is to recover an unknown signal from measurements corrupted by outliers.
Our proposed approach avoids the need for computationally expensive spectral methods, using simple gradient steps while accounting for outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z) - Optimality of Approximate Message Passing Algorithms for Spiked Matrix Models with Rotationally Invariant Noise [12.64438771302935]
We study the problem of estimating a rank one signal matrix from an observed matrix generated by corrupting the signal with additive rotationally invariant noise.
We develop a new class of approximate message-passing algorithms for this problem and provide a simple and concise characterization of their dynamics in the high-dimensional limit.
arXiv Detail & Related papers (2024-05-28T11:42:51Z) - Learning the Positions in CountSketch [49.57951567374372]
We consider sketching algorithms which first compress data by multiplication with a random sketch matrix, and then apply the sketch to quickly solve an optimization problem.
In this work, we propose the first learning-based algorithms that also optimize the locations of the non-zero entries.
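For context, the classical CountSketch compression that this line of work starts from fits in a few lines; the sketch below uses random nonzero positions and signs, whereas the cited paper proposes learning the positions of the non-zero entries (the function name and shapes here are illustrative assumptions).

import numpy as np

def countsketch(A, m, seed=None):
    # Compress the n x d matrix A to the m x d product S @ A, where the
    # CountSketch matrix S (m x n) has exactly one nonzero per column: a
    # random +/-1 sign placed at a random row. The product can be formed in
    # O(nnz(A)) time without ever materializing S.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    rows = rng.integers(0, m, size=n)           # row position of the nonzero in each column of S
    signs = rng.choice([-1.0, 1.0], size=n)     # random sign of the nonzero in each column of S
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, rows, signs[:, None] * A)     # scatter-add signed rows of A into the sketch
    return SA

A learning-based variant, as proposed in the paper, would keep this structure but choose the row positions (and possibly the signs) to suit the downstream optimization problem rather than drawing them at random.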
arXiv Detail & Related papers (2023-06-11T07:28:35Z) - Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing (AMP) algorithm for the inhomogeneous problem.
We identify in particular the existence of a statistical-to-computational gap where known algorithms require a signal-to-noise ratio bigger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - Alternating minimization algorithm with initialization analysis for
r-local and k-sparse unlabeled sensing [10.751269349279912]
The unlabeled sensing problem is to recover an unknown signal from permuted linear measurements.
We propose an alternating minimization algorithm with a suitable initialization for the widely considered k-sparse permutation model.
Our algorithm is computationally scalable and, compared to baseline methods, achieves superior performance on real and synthetic datasets.
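As a reference point, the generic alternating-minimization template for unlabeled sensing with a fully unknown permutation is sketched below; it is not the r-local / k-sparse model or the initialization analyzed in the cited paper, and the naive random initialization is an assumption. The permutation step pairs the sorted entries of y with the sorted entries of A @ x, which is the optimal matching for that subproblem by the rearrangement inequality.

import numpy as np

def altmin_unlabeled_sensing(y, A, n_iter=50, seed=None):
    # Recover x from y ~ Pi @ A @ x, where Pi is an unknown permutation matrix.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[1])          # naive random initialization of the signal
    perm = np.arange(len(y))                     # identity matching as a fallback
    for _ in range(n_iter):
        z = A @ x
        perm = np.empty(len(y), dtype=int)
        perm[np.argsort(y)] = np.argsort(z)      # pair the k-th smallest y with the k-th smallest A @ x
        x, *_ = np.linalg.lstsq(A[perm], y, rcond=None)   # least-squares refit of x under that matching
    return x, perm

Because the matching step can lock onto a poor permutation from a bad starting point, the choice of initialization (the focus of the cited paper) largely determines whether this scheme converges to the true signal.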
arXiv Detail & Related papers (2022-11-14T18:44:55Z) - Accelerated Graph Learning from Smooth Signals [10.426964757656743]
A fast dual-based proximal gradient algorithm is developed to tackle a strongly convex, smoothness-regularized network inverse problem.
Unlike existing solvers, the novel iterations come with global convergence rate guarantees and do not require additional step-size tuning.
arXiv Detail & Related papers (2021-10-19T01:02:57Z) - Continuation Newton methods with deflation techniques for global
optimization problems [3.705839280172101]
A global minimum point of an optimization problem is of interest in engineering.
In this article, we consider a new memetic algorithm for this nonlinear large-scale problem.
According to our numerical experiments, the new algorithm works well for unconstrained problems.
arXiv Detail & Related papers (2021-07-29T09:53:49Z) - Graph Signal Restoration Using Nested Deep Algorithm Unrolling [85.53158261016331]
Graph signal processing is a ubiquitous task in many applications such as sensor, social, transportation, and brain networks, point cloud processing, and graph neural networks.
We propose two restoration methods based on deep algorithm unrolling of the alternating direction method of multipliers (ADMM).
The parameters in the proposed restoration methods are trainable in an end-to-end manner.
arXiv Detail & Related papers (2021-06-30T08:57:01Z) - Single-Timescale Stochastic Nonconvex-Concave Optimization for Smooth
Nonlinear TD Learning [145.54544979467872]
We propose two single-timescale single-loop algorithms that require only one data point each step.
Our results are expressed in a form of simultaneous primal and dual side convergence.
arXiv Detail & Related papers (2020-08-23T20:36:49Z) - Accelerated Message Passing for Entropy-Regularized MAP Inference [89.15658822319928]
Maximum a posteriori (MAP) inference in discrete-valued random fields is a fundamental problem in machine learning.
Due to the difficulty of this problem, linear programming (LP) relaxations are commonly used to derive specialized message passing algorithms.
We present randomized methods for accelerating these algorithms by leveraging techniques that underlie classical accelerated gradient methods.
arXiv Detail & Related papers (2020-07-01T18:43:32Z)