Sparsification of Decomposable Submodular Functions
- URL: http://arxiv.org/abs/2201.07289v1
- Date: Tue, 18 Jan 2022 20:05:25 GMT
- Title: Sparsification of Decomposable Submodular Functions
- Authors: Akbar Rafiey, Yuichi Yoshida
- Abstract summary: Submodular functions are at the core of many machine learning and data mining tasks.
The number of underlying submodular functions in the original function can be so large that processing it requires a prohibitive amount of time and/or it does not even fit in the main memory.
We introduce the notion of sparsification for decomposable submodular functions, whose objective is to obtain an accurate approximation of the original function that is a (weighted) sum of only a few submodular functions.
- Score: 27.070004659301162
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Submodular functions are at the core of many machine learning and data mining
tasks. The underlying submodular functions for many of these tasks are
decomposable, i.e., they are the sum of several simple submodular functions. In
many data-intensive applications, however, the number of underlying submodular
functions in the original function is so large that we need a prohibitively
large amount of time to process it and/or it does not even fit in the main memory. To
overcome this issue, we introduce the notion of sparsification for decomposable
submodular functions whose objective is to obtain an accurate approximation of
the original function that is a (weighted) sum of only a few submodular
functions. Our main result is a polynomial-time randomized sparsification
algorithm such that the expected number of functions used in the output is
independent of the number of underlying submodular functions in the original
function. We also study the effectiveness of our algorithm under various
constraints such as matroid and cardinality constraints. We complement our
theoretical analysis with an empirical study of the performance of our
algorithm.
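To make the setup concrete, here is a minimal, self-contained Python sketch of a decomposable submodular function (a sum of simple coverage functions) and a naive uniform-sampling sparsifier that keeps each component with probability p and reweights the kept components by 1/p so the sum stays unbiased. This illustrates only the general idea of replacing the full sum with a small weighted subsample; the paper's algorithm chooses per-component sampling probabilities far more carefully, and every name below is invented for illustration.

```python
import random

def coverage(item_to_elements):
    """Return a coverage function f_i(S) = |union of elements covered by S|.
    Coverage functions are a standard example of simple submodular functions."""
    def f(S):
        covered = set()
        for item in S:
            covered |= item_to_elements.get(item, set())
        return len(covered)
    return f

random.seed(0)
items = range(5)

# A decomposable function f(S) = sum_i f_i(S) built from 1000 random
# coverage components over a universe of 20 elements.
components = []
for _ in range(1000):
    cover_map = {i: {random.randrange(20)} for i in items}
    components.append(coverage(cover_map))

def f_full(S):
    return sum(g(S) for g in components)

# Naive sparsifier: keep each component with probability p, weight it by 1/p.
# (The paper's algorithm is importance-based, not uniform; this is a toy.)
p = 0.1
sparse = [(g, 1.0 / p) for g in components if random.random() < p]

def f_sparse(S):
    return sum(w * g(S) for g, w in sparse)

S = {0, 2, 4}
print(f_full(S), f_sparse(S))  # the sparse estimate approximates f_full(S)
```

The sparse representation evaluates roughly ten times fewer components per query, at the cost of a small random approximation error.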
Related papers
- Maximizing Submodular Functions for Recommendation in the Presence of
Biases [25.081136190260015]
Subset selection tasks arise in recommendation systems and search engines and ask to select a subset of items that maximizes value for the user.
In many applications, inputs have been observed to have social biases that reduce the utility of the output subset.
We show that fairness constraint-based interventions can not only ensure proportional representation but also achieve near-optimal utility in the presence of biases.
arXiv Detail & Related papers (2023-05-03T15:20:00Z) - Neural Estimation of Submodular Functions with Applications to
Differentiable Subset Selection [50.14730810124592]
Submodular functions and variants, through their ability to characterize diversity and coverage, have emerged as a key tool for data selection and summarization.
We propose FLEXSUBNET, a family of flexible neural models for both monotone and non-monotone submodular functions.
arXiv Detail & Related papers (2022-10-20T06:00:45Z) - Provable General Function Class Representation Learning in Multitask
Bandits and MDPs [58.624124220900306]
Multitask representation learning is a popular approach in reinforcement learning for boosting sample efficiency.
In this work, we extend the analysis to general function class representations.
We theoretically validate the benefit of multitask representation learning within general function classes for bandits and linear MDPs.
arXiv Detail & Related papers (2022-05-31T11:36:42Z) - Using Partial Monotonicity in Submodular Maximization [13.23676270963484]
We show that for many standard submodular algorithms one can prove new approximation guarantees that depend on the monotonicity ratio.
This leads to improved approximation ratios for the common machine learning applications of movie recommendation, quadratic programming and image summarization.
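The monotonicity ratio can be brute-forced for tiny set functions. The sketch below assumes the definition suggested by the abstract, namely the largest m in [0, 1] with f(B) >= m * f(A) for every pair A ⊆ B (so m = 1 for monotone functions); the exact definition in the paper may differ, and all names here are illustrative.

```python
from itertools import chain, combinations

def powerset(ground):
    """All subsets of the ground set, as frozensets."""
    s = list(ground)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def monotonicity_ratio(f, ground):
    """Brute-force the largest m with f(B) >= m * f(A) for all A subset of B.
    (Definition assumed from the abstract; m = 1 iff f is monotone.)"""
    m = 1.0
    for A in powerset(ground):
        fA = f(A)
        if fA <= 0:
            continue
        for B in powerset(ground):
            if A <= B:  # frozenset subset test
                m = min(m, max(f(B), 0.0) / fA)
    return m

ground = {0, 1, 2}
cover = lambda S: len(S)               # monotone, so the ratio is 1
cut = lambda S: len(S) * (3 - len(S))  # cut-like: f(ground) = 0, ratio 0
print(monotonicity_ratio(cover, ground))  # 1.0
print(monotonicity_ratio(cut, ground))    # 0.0
```

Interpolating between these extremes (m strictly between 0 and 1) is what lets monotonicity-ratio-dependent guarantees improve on worst-case non-monotone bounds.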
arXiv Detail & Related papers (2022-02-07T10:35:40Z) - Submodular + Concave [53.208470310734825]
It has been well established that first-order optimization methods can converge to the maximal objective value of concave functions.
In this work, we initiate the study of maximizing functions of the form $F(x) = G(x) + C(x)$ over a convex body, where $G$ is a smooth DR-submodular function and $C$ is a smooth concave function.
This class of functions is an extension of both concave and continuous DR-submodular functions for which no guarantee is known.
arXiv Detail & Related papers (2021-06-09T01:59:55Z) - Multi-objective Evolutionary Algorithms are Generally Good: Maximizing
Monotone Submodular Functions over Sequences [44.11102526976392]
This paper studies the problem class of maximizing monotone submodular functions over sequences.
Evolutionary algorithms (EAs) can achieve good approximation guarantees for solving this class of submodular optimization problems.
Empirical studies on various applications, e.g., accomplishing tasks, maximizing information gain, search-and-tracking and recommender systems, show the excellent performance of the GSEMO.
arXiv Detail & Related papers (2021-04-20T10:36:10Z) - Fast and Private Submodular and $k$-Submodular Functions Maximization
with Matroid Constraints [27.070004659301162]
We study the problem of maximizing monotone submodular functions subject to matroid constraints in the framework of differential privacy.
We give the first $\frac{1}{2}$-approximation algorithm that preserves differential privacy for maximizing monotone $k$-submodular functions subject to matroid constraints.
arXiv Detail & Related papers (2020-06-28T23:18:58Z) - Concave Aspects of Submodular Functions [0.0]
Submodular functions generalize several information-theoretic quantities such as entropy and mutual information.
Submodular functions also show signs of concavity.
We characterize the super-differentials and polyhedra associated with upper bounds and provide optimality conditions for submodular maximization using these differentials.
arXiv Detail & Related papers (2020-06-27T17:06:24Z) - Continuous Submodular Function Maximization [91.17492610120324]
Continuous submodularity is a class of functions with a wide spectrum of applications.
We identify several applications of continuous submodular optimization, ranging from influence maximization and MAP inference to mean-field inference.
arXiv Detail & Related papers (2020-06-24T04:37:31Z) - From Sets to Multisets: Provable Variational Inference for Probabilistic
Integer Submodular Models [82.95892656532696]
Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
arXiv Detail & Related papers (2020-06-01T22:20:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.