Linear Query Approximation Algorithms for Non-monotone Submodular
Maximization under Knapsack Constraint
- URL: http://arxiv.org/abs/2305.10292v2
- Date: Mon, 10 Jul 2023 10:40:41 GMT
- Title: Linear Query Approximation Algorithms for Non-monotone Submodular
Maximization under Knapsack Constraint
- Authors: Canh V. Pham, Tan D. Tran, Dung T.K. Ha, My T. Thai
- Abstract summary: This work introduces two constant factor approximation algorithms with linear query complexity for non-monotone submodular maximization over a ground set of size $n$ subject to a knapsack constraint.
$\mathsf{DLA}$ is a deterministic algorithm that provides an approximation factor of $6+\epsilon$ while $\mathsf{RLA}$ is a randomized algorithm with an approximation factor of $4+\epsilon$.
- Score: 16.02833173359407
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work, for the first time, introduces two constant factor approximation
algorithms with linear query complexity for non-monotone submodular
maximization over a ground set of size $n$ subject to a knapsack constraint,
$\mathsf{DLA}$ and $\mathsf{RLA}$. $\mathsf{DLA}$ is a deterministic algorithm
that provides an approximation factor of $6+\epsilon$ while $\mathsf{RLA}$ is a
randomized algorithm with an approximation factor of $4+\epsilon$. Both run in
$O(n \log(1/\epsilon)/\epsilon)$ query complexity. The key idea to obtain a
constant approximation ratio with linear query lies in: (1) dividing the ground
set into two appropriate subsets to find the near-optimal solution over these
subsets with linear queries, and (2) combining a threshold greedy with
properties of two disjoint sets or a random selection process to improve
solution quality. In addition to the theoretical analysis, we have evaluated
our proposed solutions with three applications: Revenue Maximization, Image
Summarization, and Maximum Weighted Cut, showing that our algorithms not only
return results comparable to those of state-of-the-art algorithms but also require
significantly fewer queries.
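The threshold-greedy component mentioned above can be sketched as follows. This is an illustrative density-threshold greedy for submodular maximization under a knapsack constraint, not the paper's actual $\mathsf{DLA}$/$\mathsf{RLA}$ algorithms; the function names `f`, `cost`, and the exact threshold schedule are assumptions.

```python
def threshold_greedy_knapsack(elements, f, cost, budget, eps=0.1):
    """Illustrative density-threshold greedy (a simplified sketch, not the
    paper's DLA/RLA).

    elements: list of ground-set elements (iterated in order)
    f: submodular set function, evaluated via f(frozenset) oracle calls
    cost: dict mapping element -> positive cost
    budget: knapsack capacity
    """
    # The largest single-element density sets the starting threshold.
    d_max = max(f(frozenset([e])) / cost[e] for e in elements)
    solution, spent = set(), 0.0
    tau = d_max
    # Geometrically decrease the threshold; each pass is one linear scan
    # of the ground set, which is how such schemes keep the total query
    # count to O(n log(1/eps) / eps).
    while tau >= eps * d_max / len(elements):
        for e in elements:
            if e in solution or spent + cost[e] > budget:
                continue
            gain = f(frozenset(solution | {e})) - f(frozenset(solution))
            if gain / cost[e] >= tau:
                solution.add(e)
                spent += cost[e]
        tau *= (1 - eps)
    return solution
```

For a modular (additive) objective this behaves like a cost-benefit greedy; the interesting guarantees in the paper come from how such a subroutine is combined with the two-subset partition and the random selection step.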
Related papers
- Obtaining Lower Query Complexities through Lightweight Zeroth-Order Proximal Gradient Algorithms [65.42376001308064]
We propose two variance-reduced ZO estimators for complex gradient problems.
We improve the state-of-the-art function query complexities from $\mathcal{O}\left(\min\left\{\frac{d n^{1/2}}{\epsilon^2}, \frac{d}{\epsilon^3}\right\}\right)$ to $\tilde{\mathcal{O}}\left(\frac{d}{\epsilon^2}\right)$.
arXiv Detail & Related papers (2024-10-03T15:04:01Z)
- Enhanced Deterministic Approximation Algorithm for Non-monotone Submodular Maximization under Knapsack Constraint with Linear Query Complexity [0.0]
We improve the approximation factor of the fastest deterministic algorithm from $6+\epsilon$ to $5+\epsilon$.
Our technique is based on optimizing the performance of two components: the threshold greedy subroutine and the building of two disjoint sets as candidate solutions.
arXiv Detail & Related papers (2024-05-20T02:24:58Z)
- Discretely Beyond $1/e$: Guided Combinatorial Algorithms for Submodular Maximization [13.86054078646307]
For constrained maximization of a submodular function that is not necessarily monotone, all known approximation algorithms with ratio greater than $1/e$ require continuous ideas.
For combinatorial algorithms, the best known approximation ratios for both size and matroid constraints are obtained by a simple randomized greedy algorithm.
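The simple randomized greedy referred to here can be sketched as follows: in each of $k$ rounds, pick uniformly at random one of the $k$ remaining elements with the largest marginal gain. This is an illustrative rendering of the classic random greedy of Buchbinder et al., not the guided variant the paper proposes; the full algorithm also pads the candidate pool with dummy elements so negative-gain picks can be skipped, which this sketch omits.

```python
import random

def random_greedy(elements, f, k, seed=0):
    """Sketch of the simple randomized greedy for (possibly non-monotone)
    submodular maximization under a size constraint |S| <= k."""
    rng = random.Random(seed)
    solution = set()
    remaining = set(elements)
    for _ in range(k):
        cur = frozenset(solution)
        # Rank remaining elements by marginal gain and keep the top k.
        ranked = sorted(remaining,
                        key=lambda e: f(cur | {e}) - f(cur),
                        reverse=True)
        top = ranked[:k]
        if not top:
            break
        # The full algorithm pads with dummy elements so negative-gain
        # picks can be skipped; this sketch simply draws from the top-k.
        choice = rng.choice(top)
        solution.add(choice)
        remaining.discard(choice)
    return solution
```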
arXiv Detail & Related papers (2024-05-08T16:39:59Z)
- Robust Approximation Algorithms for Non-monotone $k$-Submodular Maximization under a Knapsack Constraint [0.0]
Two deterministic approximation algorithms are presented for the problem of non-monotone $k$-submodular maximization under a knapsack constraint.
Our algorithms provide constant approximation ratios within only $O(nk)$ query complexity for the non-monotone objective.
arXiv Detail & Related papers (2023-09-21T12:42:52Z)
- A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization [53.044526424637866]
In this paper we consider finding an approximate second-order stationary point (SOSP) of a twice differentiable function subject to general nonconvex conic constraints.
In particular, we propose a Newton-CG based barrier-augmented Lagrangian method for finding an approximate SOSP.
arXiv Detail & Related papers (2023-01-10T20:43:29Z)
- Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee [86.05440220344755]
We propose and analyze inexact regularized Newton-type methods for finding a global saddle point of convex-concave unconstrained min-max optimization problems.
We show that the proposed methods generate iterates that remain within a bounded set and converge to an $\epsilon$-saddle point within $O(\epsilon^{-2/3})$ iterations in terms of a restricted gap function.
arXiv Detail & Related papers (2022-10-23T21:24:37Z)
- Best Policy Identification in Linear MDPs [70.57916977441262]
We investigate the problem of best policy identification in discounted linear Markov Decision Processes in the fixed confidence setting under a generative model.
The lower bound, obtained as the solution of an intricate non-convex optimization program, can be used as the starting point to devise such algorithms.
arXiv Detail & Related papers (2022-08-11T04:12:50Z)
- Practical and Parallelizable Algorithms for Non-Monotone Submodular Maximization with Size Constraint [20.104148319012854]
We present practical and parallelizable algorithms for maximizing a submodular function, not necessarily monotone, with respect to a size constraint.
We improve the best approximation factor achieved by an algorithm with optimal adaptivity and nearly optimal query complexity to $0.193 - \varepsilon$.
arXiv Detail & Related papers (2020-09-03T22:43:55Z)
- Revisiting Modified Greedy Algorithm for Monotone Submodular Maximization with a Knapsack Constraint [75.85952446237599]
We show that a modified greedy algorithm can achieve an approximation factor of $0.305$.
We derive a data-dependent upper bound on the optimum.
It can also be used to significantly improve the efficiency of such algorithms as branch and bound.
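The modified greedy analyzed in that paper can be sketched as follows: run the cost-benefit greedy, then return the better of its output and the best feasible single element. Taking the maximum of the two is what yields the constant-factor guarantee for the monotone knapsack setting. This is an illustrative sketch of the well-known scheme, with `f`, `cost`, and `budget` as assumed interfaces.

```python
def modified_greedy(elements, f, cost, budget):
    """Sketch of the modified greedy for monotone submodular maximization
    under a knapsack constraint."""
    solution, spent = [], 0.0
    remaining = list(elements)
    while remaining:
        cur = frozenset(solution)
        # Pick the feasible element with the best marginal gain per unit cost.
        best, best_density = None, -1.0
        for e in remaining:
            if spent + cost[e] > budget:
                continue
            density = (f(cur | {e}) - f(cur)) / cost[e]
            if density > best_density:
                best, best_density = e, density
        if best is None:
            break
        solution.append(best)
        spent += cost[best]
        remaining.remove(best)
    greedy_set = frozenset(solution)
    # Compare against the best feasible singleton; returning the max of
    # the two is the step that guards against high-value, high-cost items
    # the density greedy would never pick.
    singles = [frozenset([e]) for e in elements if cost[e] <= budget]
    best_single = max(singles, key=f, default=frozenset())
    return greedy_set if f(greedy_set) >= f(best_single) else best_single
```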
arXiv Detail & Related papers (2020-08-12T15:40:21Z)
- Linear-Time Algorithms for Adaptive Submodular Maximization [17.19443570570189]
First, we develop an algorithm for the well-studied adaptive submodular maximization problem subject to a cardinality constraint.
Second, we introduce the concept of fully adaptive submodularity.
Our algorithm achieves a $\frac{1-1/e-\epsilon}{4-2/e-2\epsilon}$ approximation ratio using only $O(n\log\frac{1}{\epsilon})$ function evaluations.
arXiv Detail & Related papers (2020-07-08T15:54:28Z)
- Second-order Conditional Gradient Sliding [79.66739383117232]
We present the Second-Order Conditional Gradient Sliding (SOCGS) algorithm.
The SOCGS algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations.
It is useful when the feasible region can only be accessed efficiently through a linear optimization oracle.
arXiv Detail & Related papers (2020-02-20T17:52:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.