Fast and Private Submodular and $k$-Submodular Functions Maximization
with Matroid Constraints
- URL: http://arxiv.org/abs/2006.15744v1
- Date: Sun, 28 Jun 2020 23:18:58 GMT
- Title: Fast and Private Submodular and $k$-Submodular Functions Maximization
with Matroid Constraints
- Authors: Akbar Rafiey, Yuichi Yoshida
- Abstract summary: We study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.
We give the first $\frac{1}{2}$-approximation algorithm that preserves differential privacy for maximizing monotone $k$-submodular functions subject to matroid constraints.
- Score: 27.070004659301162
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The problem of maximizing nonnegative monotone submodular functions under a
certain constraint has been intensively studied in the last decade, and a wide
range of efficient approximation algorithms have been developed for this
problem. Many machine learning problems, including data summarization and
influence maximization, can be naturally modeled as the problem of maximizing
monotone submodular functions. However, when such applications involve
sensitive data about individuals, their privacy concerns should be addressed.
In this paper, we study the problem of maximizing monotone submodular functions
subject to matroid constraints in the framework of differential privacy. We
provide a $(1-\frac{1}{\mathrm{e}})$-approximation algorithm, which improves upon
the previous results in terms of approximation guarantee. This is achieved with an
almost cubic number of function evaluations in our algorithm.
Moreover, we study $k$-submodularity, a natural generalization of
submodularity. We give the first $\frac{1}{2}$-approximation algorithm that
preserves differential privacy for maximizing monotone $k$-submodular functions
subject to matroid constraints. The approximation ratio is asymptotically tight
and is obtained with an almost linear number of function evaluations.
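As an illustration of the greedy baseline behind $(1-1/e)$-type guarantees, the sketch below runs the classic greedy algorithm of Nemhauser, Wolsey, and Fisher on a toy coverage function under a cardinality constraint (a special case of a matroid constraint). This is not the paper's private algorithm; the sets and objective here are hypothetical.

```python
# Illustrative sketch only: non-private greedy maximization of a monotone
# submodular coverage function under a cardinality constraint. The input
# "sets" and the budget k are hypothetical toy data.

def coverage(sets, chosen):
    """Monotone submodular objective: number of distinct elements covered."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, k):
    """Pick k sets, each time adding the one with the largest marginal gain."""
    chosen = []
    for _ in range(k):
        best, best_gain = None, -1
        for i in range(len(sets)):
            if i in chosen:
                continue
            gain = coverage(sets, chosen + [i]) - coverage(sets, chosen)
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
    return chosen

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy_max(sets, 2))  # -> [0, 2], covering all six elements
```

For monotone submodular objectives, this greedy selection is guaranteed to achieve at least a $(1-1/e)$ fraction of the optimum under a cardinality constraint; the paper's contribution is obtaining comparable guarantees while preserving differential privacy.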
Related papers
- Dynamic Non-monotone Submodular Maximization [11.354502646593607]
We show a reduction from maximizing a non-monotone submodular function under the cardinality constraint $k$ to maximizing a monotone submodular function under the same constraint.
Our algorithms maintain an $\epsilon$-approximate solution and use expected amortized $O(\epsilon^{-3}k^3\log^3(n)\log(k))$ queries per update.
arXiv Detail & Related papers (2023-11-07T03:20:02Z) - Stochastic Submodular Maximization via Polynomial Estimators [13.498923494159312]
We focus on maximizing submodular functions that are defined as expectations over a class of submodular functions with an unknown distribution.
We show that for monotone functions of this form, a continuous greedy algorithm attains an approximation ratio (in expectation) arbitrarily close to $(1-1/e) \approx 63\%$ using a polynomial estimator.
arXiv Detail & Related papers (2023-03-17T13:32:33Z) - Randomized Algorithms for Monotone Submodular Function Maximization on
the Integer Lattice [5.543220407902113]
We consider the problem of maximizing a monotone submodular function on a bounded integer lattice subject to a cardinality constraint.
In particular, we focus on maximizing DR-submodular functions, i.e., functions defined on the integer lattice that exhibit the diminishing returns property.
Applying our proposed algorithm on the integer lattice is faster than the alternatives, including reducing a target problem to the set domain and then applying the fastest known set submodular algorithm.
arXiv Detail & Related papers (2021-11-19T12:07:16Z) - Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z) - Submodular + Concave [53.208470310734825]
It has been well established that first order optimization methods can converge to the maximal objective value of concave functions.
In this work, we initiate the study of maximizing functions of the form $F(x) = G(x) + C(x)$ over a convex body.
This class of functions is an extension of both concave functions and continuous DR-submodular functions, for which no guarantee was previously known.
arXiv Detail & Related papers (2021-06-09T01:59:55Z) - Revisiting Modified Greedy Algorithm for Monotone Submodular
Maximization with a Knapsack Constraint [75.85952446237599]
We show that a modified greedy algorithm can achieve an approximation factor of $0.305$.
We derive a data-dependent upper bound on the optimum.
It can also be used to significantly improve the efficiency of such algorithms as branch and bound.
arXiv Detail & Related papers (2020-08-12T15:40:21Z) - Continuous Submodular Function Maximization [91.17492610120324]
Continuous submodularity is a class of functions with a wide spectrum of applications.
We identify several applications of continuous submodular optimization, ranging from influence maximization and MAP inference to mean-field inference.
arXiv Detail & Related papers (2020-06-24T04:37:31Z) - Submodular Maximization in Clean Linear Time [42.51873363778082]
We provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint.
We show that when the cardinality allowed for a solution is constant, no algorithm making a sub-linear number of function evaluations can guarantee any constant ratio.
We extensively evaluate our algorithms on multiple real-life machine learning applications, including movie recommendation, location summarization, Twitter text summarization, and video summarization.
arXiv Detail & Related papers (2020-06-16T17:06:45Z) - Exploiting Higher Order Smoothness in Derivative-free Optimization and
Continuous Bandits [99.70167985955352]
We study the problem of zero-order optimization of a strongly convex function.
We consider a randomized approximation of the projected gradient descent algorithm.
Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters.
arXiv Detail & Related papers (2020-06-14T10:42:23Z) - From Sets to Multisets: Provable Variational Inference for Probabilistic
Integer Submodular Models [82.95892656532696]
Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
arXiv Detail & Related papers (2020-06-01T22:20:45Z) - Differentially Private Decomposable Submodular Maximization [12.835348339847762]
We study the problem of differentially private constrained maximization of decomposable submodular functions.
A submodular function is decomposable if it takes the form of a sum of submodular functions.
We design differentially private algorithms for both monotone and non-monotone decomposable submodular maximization under general matroid constraints.
arXiv Detail & Related papers (2020-05-29T17:59:46Z)
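To make the decomposability notion above concrete, the hypothetical sketch below builds an objective $f(S) = f_1(S) + f_2(S)$ from two toy coverage functions; a sum of submodular functions is itself submodular, which is the structure these private algorithms exploit.

```python
# Hypothetical illustration of a decomposable submodular function:
# f(S) = f1(S) + f2(S), where each component is a small coverage function
# (monotone submodular), so their sum is again monotone submodular.

def make_coverage(universe_sets):
    """Return a coverage function over the given (toy) family of sets."""
    def f(S):
        covered = set()
        for i in S:
            covered |= universe_sets[i]
        return len(covered)
    return f

# Two component submodular functions over the same ground set {0, 1, 2}.
f1 = make_coverage([{1, 2}, {2, 3}, {4}])
f2 = make_coverage([{5}, {5, 6}, {6, 7}])

def f(S):
    """Decomposable objective: pointwise sum of the components."""
    return f1(S) + f2(S)

print(f({0, 1}))  # f1 covers {1,2,3} -> 3, f2 covers {5,6} -> 2, total 5
```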
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.