Submodular Maximization under the Intersection of Matroid and Knapsack
Constraints
- URL: http://arxiv.org/abs/2307.09487v1
- Date: Tue, 18 Jul 2023 02:37:14 GMT
- Title: Submodular Maximization under the Intersection of Matroid and Knapsack
Constraints
- Authors: Yu-Ran Gu, Chao Bian and Chao Qian
- Abstract summary: We consider the problem of submodular maximization under the intersection of two commonly used constraints, i.e., the $k$-matroid constraint and $m$-knapsack constraint.
We prove that SPROUT can achieve a polynomial-time approximation guarantee better than the state-of-the-art algorithms.
Experiments on the applications of movie recommendation and weighted max-cut demonstrate the superiority of SPROUT++ in practice.
- Score: 23.0838604893412
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Submodular maximization arises in many applications, and has attracted a lot
of research attention from various areas such as artificial intelligence,
finance and operations research. Previous studies mainly consider only one kind
of constraint, while many real-world problems often involve several
constraints. In this paper, we consider the problem of submodular maximization
under the intersection of two commonly used constraints, i.e., $k$-matroid
constraint and $m$-knapsack constraint, and propose a new algorithm SPROUT by
incorporating partial enumeration into the simultaneous greedy framework. We
prove that SPROUT can achieve a polynomial-time approximation guarantee better
than the state-of-the-art algorithms. Then, we introduce the random enumeration
and smooth techniques into SPROUT to improve its efficiency, resulting in the
SPROUT++ algorithm, which can keep a similar approximation guarantee.
Experiments on the applications of movie recommendation and weighted max-cut
demonstrate the superiority of SPROUT++ in practice.
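To make the approach concrete, here is a minimal, illustrative sketch (not the SPROUT algorithm itself) of how partial enumeration can be combined with greedy extension, for a toy monotone submodular coverage objective under a cardinality (uniform matroid) constraint and a single knapsack constraint; all function and variable names are hypothetical.

```python
# Illustrative sketch only: enumerate small seed sets, then extend each one
# greedily by marginal-gain-per-cost, keeping the best solution found.
from itertools import combinations

def coverage(S, sets):
    """Monotone submodular objective: number of elements covered by S."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def greedy_extend(seed, sets, costs, k, budget):
    """Greedily add the feasible item with the best marginal gain per cost."""
    S = set(seed)
    while len(S) < k:
        spent = sum(costs[i] for i in S)
        best, best_ratio = None, 0.0
        for i in range(len(sets)):
            if i in S or spent + costs[i] > budget:
                continue
            gain = coverage(S | {i}, sets) - coverage(S, sets)
            if gain / costs[i] > best_ratio:
                best, best_ratio = i, gain / costs[i]
        if best is None:
            break
        S.add(best)
    return S

def partial_enum_greedy(sets, costs, k, budget, seed_size=1):
    """Enumerate all feasible seeds of a small fixed size and extend each."""
    best_S, best_val = set(), 0
    for seed in combinations(range(len(sets)), seed_size):
        if sum(costs[i] for i in seed) > budget:
            continue
        S = greedy_extend(seed, sets, costs, k, budget)
        if coverage(S, sets) > best_val:
            best_S, best_val = S, coverage(S, sets)
    return best_S, best_val

# Toy instance: at most k = 3 sets with total cost at most 4.
sets = [{1, 2, 3}, {3, 4}, {5}, {1, 5, 6}]
costs = [2.0, 1.0, 1.0, 2.5]
S, val = partial_enum_greedy(sets, costs, k=3, budget=4.0)
```

Partial enumeration guards against the greedy rule committing to a poor first pick: the outer loop forces every small seed to be tried, and the greedy only completes the solution.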
Related papers
- Practical Parallel Algorithms for Non-Monotone Submodular Maximization [20.13836086815112]
Submodular maximization has found extensive applications in various domains within the field of artificial intelligence.
A key measure of the parallelizability of a submodular maximization algorithm is its adaptive complexity, which indicates the number of sequential rounds in which the queries to the objective function can be executed in parallel.
We propose the first algorithm with both a provable approximation ratio and sublinear adaptive complexity for the problem of non-monotone submodular maximization subject to a $k$-system constraint.
arXiv Detail & Related papers (2023-08-21T11:48:34Z) - Quantum Goemans-Williamson Algorithm with the Hadamard Test and
Approximate Amplitude Constraints [62.72309460291971]
We introduce a variational quantum algorithm for the Goemans-Williamson algorithm that uses only $n+1$ qubits.
Efficient optimization is achieved by encoding the objective matrix as a properly parameterized unitary conditioned on an auxiliary qubit.
We demonstrate the effectiveness of our protocol by devising an efficient quantum implementation of the Goemans-Williamson algorithm for various NP-hard problems.
arXiv Detail & Related papers (2022-06-30T03:15:23Z) - Faster One-Sample Stochastic Conditional Gradient Method for Composite
Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
arXiv Detail & Related papers (2022-02-26T19:10:48Z) - Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z) - Minimax Optimization with Smooth Algorithmic Adversaries [59.47122537182611]
We propose a new algorithm for the min-player against smooth algorithms deployed by an adversary.
Our algorithm is guaranteed to make monotonic progress (thus having no limit cycles), and to find an appropriate stationary point in a polynomial number of iterations.
arXiv Detail & Related papers (2021-06-02T22:03:36Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Fast and Private Submodular and $k$-Submodular Functions Maximization
with Matroid Constraints [27.070004659301162]
We study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.
We give the first $\frac{1}{2}$-approximation algorithm that preserves differential privacy for maximizing monotone $k$-submodular functions subject to matroid constraints.
arXiv Detail & Related papers (2020-06-28T23:18:58Z) - Maximizing Submodular or Monotone Functions under Partition Matroid
Constraints by Multi-objective Evolutionary Algorithms [16.691265882753346]
A simple evolutionary algorithm called GSEMO has been shown to achieve good approximation for submodular functions efficiently.
We extend theoretical results to partition matroid constraints which generalize cardinality constraints.
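As an illustration of the GSEMO scheme mentioned above, here is a minimal sketch for a cardinality constraint (the simplest special case of a partition matroid), using a linear (hence submodular) objective purely for demonstration; the function names, penalty choice, and parameter values are hypothetical.

```python
import random

def gsemo(f, n, k, iterations=2000, seed=0):
    """Minimal GSEMO: evolve a Pareto archive over (fitness, -size).

    Infeasible bit-strings (more than k ones) get fitness -1, so feasible
    solutions dominate them and the archive drifts toward the constraint.
    """
    rng = random.Random(seed)

    def objectives(x):
        size = sum(x)
        return ((f(x) if size <= k else -1.0), -size)

    def dominates(a, b):
        return a != b and all(u >= v for u, v in zip(a, b))

    empty = tuple([0] * n)
    archive = {empty: objectives(empty)}
    for _ in range(iterations):
        parent = rng.choice(list(archive))
        # Standard bit-wise mutation: flip each bit independently with prob 1/n.
        child = tuple(b ^ (rng.random() < 1.0 / n) for b in parent)
        obj = objectives(child)
        if any(dominates(o, obj) for o in archive.values()):
            continue  # child is dominated by an archived solution
        archive = {x: o for x, o in archive.items() if not dominates(obj, o)}
        archive[child] = obj
    # Return the best feasible solution in the final archive.
    best_x, (best_val, _) = max(archive.items(), key=lambda kv: kv[1][0])
    return best_x, best_val

# Toy instance: maximize a weighted sum of selected items, at most k = 2 items.
weights = [3.0, 1.0, 2.0]
x, val = gsemo(lambda x: sum(w for w, b in zip(weights, x) if b), n=3, k=2)
```

Keeping the whole non-dominated front, rather than a single incumbent, is what lets GSEMO trade solution size against fitness and is central to its approximation guarantees.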
arXiv Detail & Related papers (2020-06-23T05:37:29Z) - Submodular Maximization in Clean Linear Time [42.51873363778082]
We provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint.
We show that when the cardinality allowed for a solution is constant, no algorithm making a sub-linear number of function evaluations can guarantee any constant ratio.
We extensively evaluate our algorithms on multiple real-life machine learning applications, including movie recommendation, location summarization, twitter text summarization and video summarization.
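For context, the classical greedy baseline that this line of work speeds up (the textbook $1-1/e$ greedy for monotone objectives, not the linear-time algorithm of the paper above) can be sketched as follows; the coverage instance is hypothetical.

```python
def greedy_cardinality(ground, f, k):
    """Pick up to k elements, each time adding the largest marginal gain."""
    S = set()
    for _ in range(k):
        gains = {e: f(S | {e}) - f(S) for e in ground - S}
        if not gains:
            break
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break  # no element improves the objective
        S.add(best)
    return S

# Toy coverage instance: choose k = 2 sets covering the most ground elements.
sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6, 7}}
f = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
S = greedy_cardinality(set(sets), f, 2)
```

This baseline makes $O(kn)$ objective evaluations; the "clean linear time" results above concern driving that query count down without losing the $1-1/e$ guarantee.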
arXiv Detail & Related papers (2020-06-16T17:06:45Z) - Fast Objective & Duality Gap Convergence for Non-Convex Strongly-Concave
Min-Max Problems with PL Condition [52.08417569774822]
This paper focuses on methods for solving smooth non-convex strongly-concave min-max problems, which have received increasing attention due to applications in deep learning (e.g., deep AUC maximization).
arXiv Detail & Related papers (2020-06-12T00:32:21Z) - Submodular Bandit Problem Under Multiple Constraints [8.100450025624443]
We introduce a submodular bandit problem under the intersection of $l$ knapsacks and a $k$-system constraint.
To solve this problem, we propose a non-greedy algorithm that adaptively focuses on a standard or modified upper-confidence bound.
We provide a high-probability upper bound of an approximation regret, where the approximation ratio matches that of a fast algorithm.
arXiv Detail & Related papers (2020-06-01T01:28:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.