Approximation Guarantees of Local Search Algorithms via Localizability
of Set Functions
- URL: http://arxiv.org/abs/2006.01400v1
- Date: Tue, 2 Jun 2020 05:37:52 GMT
- Title: Approximation Guarantees of Local Search Algorithms via Localizability
of Set Functions
- Authors: Kaito Fujii
- Abstract summary: Local search is a basic algorithm design technique and is widely used for various combinatorial optimization problems.
We provide approximation guarantees of standard local search algorithms under various constraints in terms of localizability.
The main application of our framework is sparse optimization, for which we show that restricted strong concavity and restricted smoothness of the objective function imply localizability.
- Score: 9.89901717499058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes a new framework for providing approximation guarantees of
local search algorithms. Local search is a basic algorithm design technique and
is widely used for various combinatorial optimization problems. To analyze
local search algorithms for set function maximization, we propose a new notion
called localizability of set functions, which measures how effective local
improvement is. Moreover, we provide approximation guarantees of standard local
search algorithms under various combinatorial constraints in terms of
localizability. The main application of our framework is sparse optimization,
for which we show that restricted strong concavity and restricted smoothness of
the objective function imply localizability, and further develop accelerated
versions of local search algorithms. We conduct experiments in sparse
regression and structure learning of graphical models to confirm the practical
efficiency of the proposed local search algorithms.
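As a concrete illustration of the kind of algorithm the paper analyzes, the sketch below implements a generic add/swap local search for set function maximization under a cardinality constraint. The function name `local_search`, the `eps` improvement threshold, and the assumption that `f` is nonnegative with `f(∅) = 0` are illustrative choices, not the paper's exact procedure.

```python
import itertools

def local_search(f, V, k, eps=1e-4):
    """Generic add/swap local search for maximizing a set function f
    over subsets S of the ground set V with |S| <= k.

    Assumes f is nonnegative with f(frozenset()) == 0. A move is accepted
    only if it improves the value by a (1 + eps) factor, which bounds the
    number of iterations (a standard device in local search analyses).
    """
    V, S = set(V), set()
    improved = True
    while improved:
        improved = False
        best_val = f(frozenset(S))
        # Addition moves: insert one element while the budget allows.
        if len(S) < k:
            for v in V - S:
                if f(frozenset(S | {v})) > (1 + eps) * best_val:
                    S, improved = S | {v}, True
                    break
        # Swap moves: exchange one element inside S for one outside.
        if not improved:
            for u, v in itertools.product(S, V - S):
                cand = (S - {u}) | {v}
                if f(frozenset(cand)) > (1 + eps) * best_val:
                    S, improved = cand, True
                    break
    return S
```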
Related papers
- Characterization of Locality in Spin States and Forced Moves for
Optimizations [0.36868085124383626]
In optimization problems, the existence of local minima in energy landscapes makes it hard to reach the global minimum.
We develop an algorithm that escapes local minima efficiently, although it does not yield exact sampling.
Because the proposed algorithm is based on a rejection-free algorithm, its computational cost is low.
arXiv Detail & Related papers (2023-12-05T07:21:00Z) - Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z) - New Characterizations and Efficient Local Search for General Integer
Linear Programming [17.80124240223163]
This work proposes new characterizations of integer linear programming (ILP) based on the concept of boundary solutions.
We develop a new local search algorithm, Local-ILP, which is efficient for solving general ILP and is validated on a large heterogeneous problem dataset.
Experiments conducted on the MIPLIB dataset show the effectiveness of our algorithm in solving large-scale hard ILP problems.
arXiv Detail & Related papers (2023-04-29T07:22:07Z) - Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the search spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z) - A Metaheuristic Algorithm for Large Maximum Weight Independent Set
Problems [58.348679046591265]
Given a node-weighted graph, the task is to find a set of independent (mutually nonadjacent) nodes whose node-weight sum is maximum.
Some of the graphs arising in this application are large, having hundreds of thousands of nodes and hundreds of millions of edges.
We develop a new local search algorithm, which is a metaheuristic in the greedy randomized adaptive search framework.
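As a hedged illustration of the general greedy randomized adaptive search (GRASP) pattern named above (not the paper's actual procedure), the sketch below repeats a randomized greedy construction of an independent set and then applies a simple weight-based improvement move; `grasp_mwis` and its parameters are hypothetical.

```python
import random

def grasp_mwis(adj, w, iters=100, alpha=0.3, seed=0):
    """GRASP-style sketch for maximum weight independent set.

    adj: dict mapping each node to the set of its neighbors.
    w:   dict mapping each node to its (positive) weight.
    """
    rng = random.Random(seed)
    best, best_w = set(), 0.0
    nodes = list(adj)
    for _ in range(iters):
        # Greedy randomized construction: repeatedly pick a node uniformly
        # from the top alpha-fraction (by weight) of still-eligible nodes.
        eligible, S = set(nodes), set()
        while eligible:
            cand = sorted(eligible, key=lambda v: -w[v])
            v = rng.choice(cand[:max(1, int(alpha * len(cand)))])
            S.add(v)
            eligible -= {v} | adj[v]  # v and its neighbors become ineligible
        # Local improvement: insert any outside node whose weight exceeds
        # the total weight of its neighbors in S, evicting those neighbors.
        improved = True
        while improved:
            improved = False
            for v in nodes:
                if v in S:
                    continue
                conflicts = adj[v] & S
                if w[v] > sum(w[u] for u in conflicts):
                    S -= conflicts
                    S.add(v)
                    improved = True
        total = sum(w[v] for v in S)
        if total > best_w:
            best, best_w = set(S), total
    return best, best_w
```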
arXiv Detail & Related papers (2022-03-28T21:34:16Z) - Flipping the switch on local exploration: Genetic Algorithms with
Reversals [0.0]
The authors show that gradient-free search techniques are suitable for finding optimal solutions in the discrete domain.
They also show that combining multiple local searches can improve performance.
It is observed that the proposed GA variants achieve the lowest average cost across all benchmarks, including the newly proposed problem, and that IC performs better than its constituent techniques.
arXiv Detail & Related papers (2022-02-02T08:27:11Z) - Community detection using fast low-cardinality semidefinite programming [94.4878715085334]
We propose a new low-cardinality algorithm that generalizes the local update to maximize a semidefinite relaxation derived from max-k-cut.
The proposed algorithm is scalable, outperforms state-of-the-art algorithms, and incurs little additional cost in real-world running time.
arXiv Detail & Related papers (2020-12-04T15:46:30Z) - Exploiting Local Optimality in Metaheuristic Search [0.0]
We introduce strategies to identify and take advantage of useful features of solution history with an adaptive memory metaheuristic.
Our approach uses a new type of adaptive memory based on a construction called exponential extrapolation.
arXiv Detail & Related papers (2020-10-12T01:51:09Z) - Sequential Subspace Search for Functional Bayesian Optimization
Incorporating Experimenter Intuition [63.011641517977644]
Our algorithm generates a sequence of finite-dimensional random subspaces of functional space spanned by a set of draws from the experimenter's Gaussian Process.
Standard Bayesian optimisation is applied on each subspace, and the best solution found is used as the starting point (origin) for the next subspace.
We test our algorithm in simulated and real-world experiments, namely blind function matching, finding the optimal precipitation-strengthening function for an aluminium alloy, and learning rate schedule optimisation for deep networks.
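The loop structure described above can be made concrete with a rough sketch: each round spans a finite-dimensional subspace of function space with fresh draws from a Gaussian process prior, searches it, and re-centres the next subspace at the incumbent. For brevity, plain random search over coefficients stands in for the Bayesian optimisation inner loop; `sequential_subspace_search`, the RBF kernel, and the grid discretization are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(x, lengthscale=0.2):
    """Squared-exponential kernel matrix over a 1-D grid."""
    diff = x[:, None] - x[None, :]
    return np.exp(-0.5 * (diff / lengthscale) ** 2)

def sequential_subspace_search(objective, n_grid=100, dim=4,
                               n_rounds=10, n_samples=500, seed=0):
    """Maximize `objective` over functions on [0, 1] given as grid values."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_grid)
    # Cholesky factor of the GP prior covariance (jitter for stability).
    L = np.linalg.cholesky(rbf_kernel(x) + 1e-6 * np.eye(n_grid))
    origin = np.zeros(n_grid)              # current incumbent function
    best_val = objective(origin)
    for _ in range(n_rounds):
        # Span a `dim`-dimensional subspace with fresh GP prior draws.
        basis = (L @ rng.standard_normal((n_grid, dim))).T
        round_best, round_val = origin, best_val
        for _ in range(n_samples):         # random search in the subspace
            cand = origin + rng.standard_normal(dim) @ basis
            val = objective(cand)
            if val > round_val:
                round_val, round_best = val, cand
        origin, best_val = round_best, round_val  # re-centre next subspace
    return origin, best_val

# Toy usage: blind function matching against a known target.
grid = np.linspace(0.0, 1.0, 100)
target = np.sin(6 * grid)
f_best, score = sequential_subspace_search(lambda f: -np.mean((f - target) ** 2))
```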
arXiv Detail & Related papers (2020-09-08T06:54:11Z) - IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method [64.15649345392822]
We introduce a framework for designing primal methods under the decentralized optimization setting where local functions are smooth and strongly convex.
Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method.
When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal and matched by recently derived lower bounds.
arXiv Detail & Related papers (2020-06-11T18:49:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.