Focused Jump-and-Repair Constraint Handling for Fixed-Parameter
Tractable Graph Problems Closed Under Induced Subgraphs
- URL: http://arxiv.org/abs/2203.13877v2
- Date: Fri, 20 Jan 2023 22:52:12 GMT
- Title: Focused Jump-and-Repair Constraint Handling for Fixed-Parameter
Tractable Graph Problems Closed Under Induced Subgraphs
- Authors: Luke Branson and Andrew M. Sutton
- Abstract summary: We investigate a (1+1)EA equipped with a tailored jump-and-repair operation that can be used to probabilistically repair infeasible offspring in graph problems.
- Score: 3.495114525631289
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Repair operators are often used for constraint handling in constrained
combinatorial optimization. We investigate the (1+1) EA equipped with a
tailored jump-and-repair operation that can be used to probabilistically repair
infeasible offspring in graph problems. Instead of evolving candidate solutions
to the entire graph, we expand the genotype to allow the (1+1) EA to develop in
parallel a feasible solution together with a growing subset of the instance (an
induced subgraph). With this approach, we prove that the EA is able to
probabilistically simulate an iterative compression process used in classical
fixed-parameter algorithmics to obtain a randomized FPT performance guarantee
on three NP-hard graph problems. For $k$-VertexCover, we prove that the (1+1)
EA using focused jump-and-repair can find a $k$-vertex cover (if one exists) in
$O(2^k n^2\log n)$ iterations in expectation. This leads to an exponential (in
$k$) improvement over the best-known parameterized bound for evolutionary
algorithms on VertexCover. For the $k$-FeedbackVertexSet problem in
tournaments, we prove that the EA finds a feasible feedback set in
$O(2^kk!n^2\log n)$ iterations in expectation, and for OddCycleTransversal, we
prove the optimization time for the EA is $O(3^k k m n^2\log n)$. For the
latter two problems, this constitutes the first parameterized result for any
evolutionary algorithm. We discuss how to generalize the framework to other
parameterized graph problems closed under induced subgraphs and report
experimental results that illustrate the behavior of the algorithm on a
concrete instance class.
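As a concrete illustration of the general setting, the following is a minimal, hypothetical Python sketch of a (1+1) EA for $k$-VertexCover with a naive repair step. It is not the authors' focused jump-and-repair operator (which expands the genotype to grow an induced subgraph and probabilistically simulates iterative compression); the mutation rate, repair rule, and acceptance criterion below are generic textbook choices.

```python
# Illustrative sketch only: a plain (1+1) EA with a naive repair step for
# k-vertex cover. This is NOT the paper's focused jump-and-repair operator;
# the repair rule and acceptance criterion are generic choices.
import random

def uncovered(edges, cover):
    """Edges with neither endpoint in the cover."""
    return [(u, v) for (u, v) in edges if u not in cover and v not in cover]

def one_plus_one_ea_vertex_cover(n, edges, k, max_iters=100_000, seed=0):
    rng = random.Random(seed)
    cover = set()  # the empty set is always a feasible starting point
    for _ in range(max_iters):
        if len(cover) <= k and not uncovered(edges, cover):
            return cover  # feasible k-vertex cover found
        child = set(cover)
        for v in range(n):  # standard bit mutation with rate 1/n
            if rng.random() < 1.0 / n:
                child.symmetric_difference_update({v})
        # naive repair: cover a violated edge with one random endpoint
        bad = uncovered(edges, child)
        while bad and len(child) < k:
            child.add(rng.choice(bad[0]))
            bad = uncovered(edges, child)
        # elitist acceptance on the number of uncovered edges
        if len(child) <= k and len(uncovered(edges, child)) <= len(uncovered(edges, cover)):
            cover = child
    return None  # budget exhausted without finding a cover

# toy instance: a 5-cycle, whose minimum vertex cover has size 3
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(one_plus_one_ea_vertex_cover(5, edges, k=3))
```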
Related papers
- Analyzing the Runtime of the Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) on the Concatenated Trap Function [2.038038953957366]
GOMEA is an evolutionary algorithm that leverages linkage learning to efficiently exploit problem structure.
We show that GOMEA can solve the problem in $O(m^3 2^k)$ with high probability, where $m$ is the number of subfunctions and $k$ is the subfunction length.
This is a significant speedup compared to the (1+1) EA, which requires $O(\ln(m)(mk)^k)$ expected evaluations; the benchmark function itself is sketched below.
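For reference, here is a short Python sketch of the concatenated trap benchmark these bounds refer to, using the usual deceptive-trap definition (an assumption; the paper's exact normalisation may differ).

```python
# Standard concatenated trap benchmark: m disjoint blocks of length k,
# each scored by the deceptive trap TRAP_k(u) = k if u == k else k-1-u,
# where u is the number of ones in the block. (Usual textbook definition;
# the paper's exact variant may differ.)
def concatenated_trap(x, m, k):
    assert len(x) == m * k
    total = 0
    for i in range(m):
        u = sum(x[i * k:(i + 1) * k])      # unitation of block i
        total += k if u == k else k - 1 - u
    return total

print(concatenated_trap([1] * 12, m=3, k=4))  # optimum: all ones -> 12
print(concatenated_trap([0] * 12, m=3, k=4))  # all zeros -> 9 = m*(k-1)
```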
arXiv Detail & Related papers (2024-07-11T09:37:21Z)
- ReSQueing Parallel and Private Stochastic Convex Optimization [59.53297063174519]
We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density.
We develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings.
arXiv Detail & Related papers (2023-01-01T18:51:29Z)
- Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee [86.05440220344755]
We propose and analyze inexact regularized Newton-type methods for finding a global saddle point of convex-concave unconstrained min-max optimization problems.
We show that the proposed methods generate iterates that remain within a bounded set and converge to an $\epsilon$-saddle point within $O(\epsilon^{-2/3})$ iterations in terms of a restricted gap function.
arXiv Detail & Related papers (2022-10-23T21:24:37Z)
- Multi-block-Single-probe Variance Reduced Estimator for Coupled
Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of coupled compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z)
- Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization [116.89941263390769]
We consider the smooth convex-concave bilinearly-coupled saddle-point problem $\min_{\mathbf{x}}\max_{\mathbf{y}} F(\mathbf{x}) + H(\mathbf{x},\mathbf{y}) - G(\mathbf{y})$, where one has access to first-order oracles for $F$, $G$ as well as the bilinear coupling function $H$.
We present an accelerated gradient-extragradient (AG-EG) descent-ascent algorithm that combines extragradient steps with acceleration.
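To make the base technique concrete, here is a minimal Python sketch of a plain extragradient update on the bilinear toy problem $\min_x \max_y x^\top A y$. This is not the AG-EG algorithm itself; the step size, matrix, and iteration budget are arbitrary illustrative choices.

```python
# Plain extragradient on a bilinear toy saddle point min_x max_y x^T A y.
# Illustrates the base technique only; NOT the AG-EG algorithm of the paper.
import numpy as np

def extragradient_bilinear(A, steps=2000, eta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=A.shape[0])
    y = rng.normal(size=A.shape[1])
    for _ in range(steps):
        # half step: descend in x, ascend in y, at the current point
        x_half = x - eta * (A @ y)    # grad_x of x^T A y is A y
        y_half = y + eta * (A.T @ x)  # grad_y of x^T A y is A^T x
        # full step: same update, but with gradients at the half-step point
        x = x - eta * (A @ y_half)
        y = y + eta * (A.T @ x_half)
    return x, y

A = np.array([[1.0, 0.0], [0.0, 2.0]])
x, y = extragradient_bilinear(A)
print(np.linalg.norm(x), np.linalg.norm(y))  # both shrink toward the saddle at 0
```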
arXiv Detail & Related papers (2022-06-17T06:10:20Z)
- Bayesian Optimistic Optimisation with Exponentially Decaying Regret [58.02542541410322]
The current practical BO algorithms have regret bounds ranging from $\mathcal{O}(\frac{\log N}{\sqrt{N}})$ to $\mathcal{O}(e^{-\sqrt{N}})$, where $N$ is the number of evaluations.
This paper explores the possibility of improving the regret bound in the noiseless setting by intertwining concepts from BO and tree-based optimistic optimisation.
We propose the BOO algorithm, a first practical approach which can achieve an exponential regret bound of order $\mathcal{O}(N^{-\sqrt{N}})$.
arXiv Detail & Related papers (2021-05-10T13:07:44Z)
- A Momentum-Assisted Single-Timescale Stochastic Approximation Algorithm
for Bilevel Optimization [112.59170319105971]
We propose a new algorithm, the Momentum-assisted Single-Timescale Stochastic Approximation (MSTSA) algorithm, for tackling bilevel optimization problems.
MSTSA allows us to control the error in the iterates caused by inexact solutions to the lower-level subproblem.
arXiv Detail & Related papers (2021-02-15T07:10:33Z)
- Improved Runtime Results for Simple Randomised Search Heuristics on
Linear Functions with a Uniform Constraint [11.228244128564512]
We study the class of linear functions under a uniform constraint and investigate the expected optimisation time of Randomised Local Search (RLS).
We prove a tight bound of $\Theta(n^2)$ for RLS and improve the previously best-known upper bound for the (1+1) EA; a minimal RLS sketch follows below.
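As a hedged illustration, here is a short Python sketch of RLS maximizing a linear function $\sum_i w_i x_i$ under the uniform constraint $|x|_1 \le B$. It uses a one-or-two-bit-flip variant so that swaps under the cardinality constraint are possible; the exact operator, tie-breaking, and constraint handling analysed in the paper may differ.

```python
# Minimal RLS sketch: maximize a linear function sum_i w_i * x_i subject to
# the uniform constraint |x|_1 <= B. Flips one or two uniformly chosen bits
# so swaps are possible; infeasible or worse offspring are rejected. This is
# a generic variant, not necessarily the paper's exact algorithm.
import random

def rls_linear_uniform(weights, B, max_iters=50_000, seed=0):
    rng = random.Random(seed)
    n = len(weights)
    x = [0] * n  # the empty set is always feasible

    def value(y):
        return sum(w for w, b in zip(weights, y) if b)

    for _ in range(max_iters):
        y = list(x)
        for i in rng.sample(range(n), rng.choice([1, 2])):
            y[i] ^= 1  # flip the chosen bit(s)
        if sum(y) <= B and value(y) >= value(x):  # feasible and not worse
            x = y
    return x, value(x)

print(rls_linear_uniform([5.0, 1.0, 3.0, 2.0], B=2))  # optimum keeps weights 5 and 3
```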
arXiv Detail & Related papers (2020-10-21T10:42:39Z)
- S-ADDOPT: Decentralized stochastic first-order optimization over
directed graphs [16.96562173221624]
Decentralized stochastic optimization is studied to minimize a sum of smooth and strongly convex cost functions when the functions are distributed over the nodes of a directed network.
In particular, we propose the S-ADDOPT algorithm, which assumes a stochastic first-order oracle at each node.
For decaying step-sizes $\mathcal{O}(1/k)$, we show that S-ADDOPT converges sublinearly to the exact solution at $\mathcal{O}(1/k)$ and that its convergence is network-independent.
arXiv Detail & Related papers (2020-05-15T21:14:22Z)
- A New Randomized Primal-Dual Algorithm for Convex Optimization with
Optimal Last Iterate Rates [16.54912614895861]
We develop a novel unified randomized block-coordinate primal-dual algorithm to solve a class of nonsmooth constrained convex optimization problems.
We prove that our algorithm achieves optimal convergence rates in two cases: general convexity and strong convexity.
Experimental results show that the proposed method performs encouragingly across different experiments.
arXiv Detail & Related papers (2020-03-03T03:59:26Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.