Beyond Pointwise Submodularity: Non-Monotone Adaptive Submodular
Maximization subject to a Knapsack Constraint
- URL: http://arxiv.org/abs/2104.04853v1
- Date: Sat, 10 Apr 2021 20:11:11 GMT
- Title: Beyond Pointwise Submodularity: Non-Monotone Adaptive Submodular
Maximization subject to a Knapsack Constraint
- Authors: Shaojie Tang
- Abstract summary: We study the non-monotone adaptive submodular maximization problem subject to a knapsack constraint.
The state of an item is initially unknown; one must select an item in order to reveal its state.
Our main contribution is to develop a sampling-based randomized algorithm that achieves a $\frac{1}{10}$ approximation for maximizing an adaptive submodular function.
- Score: 26.171841086317574
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we study the non-monotone adaptive submodular maximization
problem subject to a knapsack constraint. The input of our problem is a set of
items, where each item has a particular state drawn from a known prior
distribution. However, the state of an item is initially unknown; one must
select an item in order to reveal its state. Moreover, each item
has a fixed cost. There is a utility function which is defined over items and
states. Our objective is to sequentially select a group of items to maximize
the expected utility subject to a knapsack constraint. Although the
cardinality-constrained, as well as the more general matroid-constrained,
adaptive submodular maximization has been well studied in the literature,
whether there exists a constant approximation solution for the
knapsack-constrained adaptive submodular maximization problem remains an open
problem. We fill this gap by proposing the first constant approximation
solution. In particular, our main contribution is to develop a sampling-based
randomized algorithm that achieves a $\frac{1}{10}$ approximation for
maximizing an adaptive submodular function subject to a knapsack constraint.
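As a rough illustration of the sampling-plus-adaptive-greedy recipe described above, the sketch below down-samples the ground set at random and then selects affordable items greedily by estimated gain per unit cost, observing each item's state immediately after it is selected. It is a minimal sketch under assumed interfaces (`reveal_state`, `marginal_gain`, and `sample_prob` are hypothetical placeholders), not the paper's algorithm, and it does not reproduce the analysis behind the $\frac{1}{10}$ guarantee.

```python
import random

def adaptive_sample_greedy(items, costs, budget, reveal_state, marginal_gain,
                           sample_prob=0.5, rng=random):
    """Illustrative sampling-based adaptive selection under a knapsack budget.

    items         -- iterable of item identifiers
    costs         -- dict mapping item -> positive cost
    budget        -- total knapsack budget
    reveal_state  -- callable(item) -> realized state (simulates observation)
    marginal_gain -- callable(item, observed) -> estimated expected marginal
                     gain of `item` given the states observed so far
    sample_prob   -- probability of keeping each item in the random sample
    """
    observed = {}        # item -> realized state, revealed only after selection
    picked = []
    remaining = budget

    # Sampling step: keep each item independently with probability sample_prob.
    pool = [e for e in items if rng.random() < sample_prob]

    # Adaptive greedy step: repeatedly pick the affordable item with the best
    # estimated gain-to-cost ratio, then observe its realized state.
    while pool:
        affordable = [e for e in pool if costs[e] <= remaining]
        if not affordable:
            break
        best = max(affordable, key=lambda e: marginal_gain(e, observed) / costs[e])
        if marginal_gain(best, observed) <= 0:
            break   # non-monotone objective: stop when no positive gain remains
        picked.append(best)
        remaining -= costs[best]
        observed[best] = reveal_state(best)   # adaptivity: state revealed now
        pool.remove(best)
    return picked, observed
```

Here `marginal_gain` stands in for the conditional expected marginal utility of an item given the partial realization observed so far; in the non-monotone setting this quantity can be negative, which is why the loop stops once no affordable item has positive estimated gain.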
Related papers
- Non-monotone Sequential Submodular Maximization [13.619980548779687]
We aim to select and rank a group of $k$ items from a ground set $V$ such that the weighted summation of $k$ submodular functions is maximized.
The results of this research have implications in various fields, including recommendation systems and optimization.
arXiv Detail & Related papers (2023-08-16T19:32:29Z)
- Group Equality in Adaptive Submodular Maximization [13.619980548779687]
We study the classic submodular maximization problem subject to a group equality constraint under both non-adaptive and adaptive settings.
We develop the first constant approximation algorithm for this problem.
arXiv Detail & Related papers (2022-07-07T15:12:02Z)
- Fast Feature Selection with Fairness Constraints [49.142308856826396]
We study the fundamental problem of selecting optimal features for model construction.
This problem is computationally challenging on large datasets, even with the use of greedy algorithm variants.
We extend the adaptive query model, recently proposed for the greedy forward selection for submodular functions, to the faster paradigm of Orthogonal Matching Pursuit for non-submodular functions.
The proposed algorithm achieves exponentially fast parallel run time in the adaptive query model, scaling much better than prior work.
arXiv Detail & Related papers (2022-02-28T12:26:47Z)
- Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z)
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace of a matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
- Submodular + Concave [53.208470310734825]
It has been well established that first order optimization methods can converge to the maximal objective value of concave functions.
In this work, we initiate the study of maximizing functions of the form $F(x) = G(x) + C(x)$ over a convex body, where $G$ is a smooth DR-submodular function and $C$ is a smooth concave function.
This class of functions is an extension of both concave and continuous DR-submodular functions for which no guarantee is known.
arXiv Detail & Related papers (2021-06-09T01:59:55Z)
- Adaptive Cascade Submodular Maximization [19.29174615532181]
We study the cascade submodular maximization problem under the adaptive setting.
Our objective is to identify the best sequence of selecting items so as to maximize the expected utility of the selected items.
arXiv Detail & Related papers (2020-07-07T16:21:56Z)
- Fast and Private Submodular and $k$-Submodular Functions Maximization with Matroid Constraints [27.070004659301162]
We study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.
We give the first $\frac{1}{2}$-approximation algorithm that preserves differential privacy for maximizing monotone $k$-submodular functions subject to matroid constraints.
arXiv Detail & Related papers (2020-06-28T23:18:58Z)
- Continuous Submodular Function Maximization [91.17492610120324]
Continuous submodularity is a class of functions with a wide spectrum of applications.
We identify several applications of continuous submodular optimization, ranging from influence maximization and MAP inference for determinantal point processes to mean field inference.
arXiv Detail & Related papers (2020-06-24T04:37:31Z)
- Submodular Maximization in Clean Linear Time [42.51873363778082]
We provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint.
We show that when the cardinality allowed for a solution is constant, no algorithm making a sub-linear number of function evaluations can guarantee any constant ratio.
We extensively evaluate our algorithms on multiple real-life machine learning applications, including movie recommendation, location summarization, twitter text summarization and video summarization.
arXiv Detail & Related papers (2020-06-16T17:06:45Z) - Non-Adaptive Adaptive Sampling on Turnstile Streams [57.619901304728366]
We give the first relative-error algorithms for column subset selection, subspace approximation, projective clustering, and volume maximization on turnstile streams that use space sublinear in $n$.
Our adaptive sampling procedure has a number of applications to various data summarization problems that either improve state-of-the-art or have only been previously studied in the more relaxed row-arrival model.
arXiv Detail & Related papers (2020-04-23T05:00:21Z)