Sample Efficient Graph-Based Optimization with Noisy Observations
- URL: http://arxiv.org/abs/2006.02672v1
- Date: Thu, 4 Jun 2020 07:22:28 GMT
- Title: Sample Efficient Graph-Based Optimization with Noisy Observations
- Authors: Tan Nguyen, Ali Shameli, Yasin Abbasi-Yadkori, Anup Rao, Branislav
Kveton
- Abstract summary: We show that a variant of best-arm identification can find a near-optimal solution after a small number of queries independent of the size of the graph.
We show the effectiveness of the greedy algorithm with restarts and of simulated annealing on graph-based nearest neighbor classification problems.
- Score: 17.91308664586981
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the sample complexity of optimizing "hill-climbing friendly"
functions defined on a graph under noisy observations. We define a notion of
convexity, and we show that a variant of best-arm identification can find a
near-optimal solution after a small number of queries that is independent of
the size of the graph. For functions that have local minima and are nearly
convex, we establish a sample complexity bound for classical simulated
annealing under noisy observations. We show the effectiveness of the greedy
algorithm with restarts and of simulated annealing on graph-based nearest
neighbor classification problems as well as a web document re-ranking
application.
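As a rough illustration of the two heuristics the abstract refers to, the sketch below runs greedy hill climbing with random restarts and simulated annealing on a graph whose objective can only be queried through a noisy oracle. The graph representation (a `neighbors` dictionary), the `noisy_eval` oracle, the per-node sample counts, and the logarithmic cooling schedule are illustrative assumptions; the paper's actual best-arm-identification subroutine and its sample-complexity guarantees are not reproduced here.

```python
import math
import random


def noisy_greedy_with_restarts(neighbors, noisy_eval, num_restarts=10,
                               samples_per_node=20, rng=None):
    """Greedy hill climbing with random restarts under noisy observations.

    `neighbors` maps each node to its adjacent nodes; `noisy_eval(v)` returns
    one noisy sample of the objective at node v. Repeated samples are averaged
    to reduce noise. This is an illustrative sketch, not the paper's
    best-arm-identification subroutine.
    """
    rng = rng or random.Random(0)
    nodes = list(neighbors)

    def estimate(v):
        # Average `samples_per_node` noisy queries to estimate the value at v.
        return sum(noisy_eval(v) for _ in range(samples_per_node)) / samples_per_node

    best_node, best_val = None, float("-inf")
    for _ in range(num_restarts):
        current = rng.choice(nodes)
        current_val = estimate(current)
        while True:
            nbrs = list(neighbors[current])
            if not nbrs:
                break
            est = {v: estimate(v) for v in nbrs}
            cand = max(est, key=est.get)
            if est[cand] <= current_val:
                break  # no estimated improvement: local optimum up to noise
            current, current_val = cand, est[cand]
        if current_val > best_val:
            best_node, best_val = current, current_val
    return best_node, best_val


def noisy_simulated_annealing(neighbors, noisy_eval, start, num_steps=1000,
                              samples_per_node=5, temp0=1.0, rng=None):
    """Simulated annealing on a graph with noisy value estimates (sketch)."""
    rng = rng or random.Random(0)

    def estimate(v):
        return sum(noisy_eval(v) for _ in range(samples_per_node)) / samples_per_node

    current, current_val = start, estimate(start)
    for t in range(1, num_steps + 1):
        temp = temp0 / math.log(t + 1)  # logarithmic cooling schedule (an assumption)
        cand = rng.choice(list(neighbors[current]))  # assumes every node has a neighbor
        cand_val = estimate(cand)
        # Always accept estimated improvements; accept worse moves with
        # Boltzmann probability to escape local optima.
        if cand_val >= current_val or rng.random() < math.exp((cand_val - current_val) / temp):
            current, current_val = cand, cand_val
    return current, current_val
```

For example, with noisy_eval(v) returning f(v) plus Gaussian noise on a grid graph, averaging repeated queries at each node trades extra samples for a more reliable comparison between neighbors before each greedy or annealing move.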
Related papers
- Stochastic Zeroth-Order Optimization under Strongly Convexity and Lipschitz Hessian: Minimax Sample Complexity [59.75300530380427]
We consider the problem of optimizing second-order smooth and strongly convex functions when the algorithm only has access to noisy evaluations of the objective function it queries.
We provide the first tight characterization for the rate of the minimax simple regret by developing matching upper and lower bounds.
arXiv Detail & Related papers (2024-06-28T02:56:22Z)
- Neural Gaussian Similarity Modeling for Differential Graph Structure Learning [24.582257964387402]
We construct a differential graph structure learning model by replacing non-differentiable nearest neighbor sampling with a differentiable sampling.
Bell-shaped Gaussian Similarity (GauSim) modeling is proposed to sample non-nearest neighbors.
We develop a scalable method by transferring the large-scale graph to the transition graph to significantly reduce the complexity.
arXiv Detail & Related papers (2023-12-15T02:45:33Z)
- Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z)
- On the Optimal Recovery of Graph Signals [10.098114696565865]
We compute regularization parameters that are optimal or near-optimal for graph signal processing problems.
Our results offer a new interpretation for classical optimization techniques in graph-based learning.
We illustrate the potential of our methods in numerical experiments on several semi-synthetic graph signal processing datasets.
arXiv Detail & Related papers (2023-04-02T07:18:11Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Joint graph learning from Gaussian observations in the presence of hidden nodes [26.133725549667734]
We propose a joint graph learning method that takes into account the presence of hidden (latent) variables.
We exploit the structure resulting from the previous considerations to propose a convex optimization problem.
We compare the proposed algorithm with different baselines and evaluate its performance over synthetic and real-world graphs.
arXiv Detail & Related papers (2022-12-04T13:03:41Z)
- Learning Heuristics for the Maximum Clique Enumeration Problem Using Low Dimensional Representations [0.0]
We use a learning framework for a pruning process that reduces the size of the input graph for the maximum clique enumeration problem.
We study the effect of different vertex representations on the runtime performance of this method.
We observe that using local graph features in the classification process produces more accurate results when combined with a feature elimination process.
arXiv Detail & Related papers (2022-10-30T22:04:32Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Spatial-spectral Hyperspectral Image Classification via Multiple Random Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE).
Firstly, the local binary pattern is adopted to extract more descriptive features on each selected band, which preserves local structures and subtle changes of a region.
Secondly, adaptive neighbor assignment is introduced in the construction of the anchor graph to reduce the computational complexity.
arXiv Detail & Related papers (2021-03-25T09:31:41Z)
- Reinforcement Learning with Feedback Graphs [69.1524391595912]
We study episodic reinforcement learning in decision processes when the agent receives additional feedback per step.
We formalize this setting using a feedback graph over state-action pairs and show that model-based algorithms can leverage the additional feedback for more sample-efficient learning.
arXiv Detail & Related papers (2020-05-07T22:35:37Z)