From Sets to Multisets: Provable Variational Inference for Probabilistic
Integer Submodular Models
- URL: http://arxiv.org/abs/2006.01293v1
- Date: Mon, 1 Jun 2020 22:20:45 GMT
- Title: From Sets to Multisets: Provable Variational Inference for Probabilistic
Integer Submodular Models
- Authors: Aytunc Sahin, Yatao Bian, Joachim M. Buhmann, Andreas Krause
- Abstract summary: Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
- Score: 82.95892656532696
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Submodular functions have been studied extensively in machine learning and
data mining. In particular, the optimization of submodular functions over the
integer lattice (integer submodular functions) has recently attracted much
interest, because this domain relates naturally to many practical problem
settings, such as multilabel graph cut, budget allocation and revenue
maximization with discrete assignments. In contrast, the use of these functions
for probabilistic modeling has received surprisingly little attention so far.
In this work, we first propose the Generalized Multilinear Extension, a
continuous DR-submodular extension for integer submodular functions. We study
central properties of this extension and formulate a new probabilistic model
which is defined through integer submodular functions. Then, we introduce a
block-coordinate ascent algorithm to perform approximate inference for this
class of models. Finally, we demonstrate its effectiveness and viability on
several real-world social connection graph datasets with integer submodular
objectives.
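
To make the abstract's two ingredients concrete, here is a hedged sketch: a toy integer DR-submodular objective, an expectation-based continuous extension estimated by Monte Carlo sampling under independent per-coordinate categorical marginals, and a naive block-coordinate ascent over those marginals. The parameterization, the toy objective, and all names (`f`, `extension`, `block_coordinate_ascent`) are illustrative assumptions, not the paper's Generalized Multilinear Extension or its provable algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

n, B = 4, 3                    # n coordinates, each taking values in {0, ..., B}
W = rng.random((5, n))         # weights of a toy budget-allocation objective

def f(x):
    """Toy monotone integer DR-submodular function: concave (square-root)
    returns aggregated over 5 targets."""
    return np.sqrt(W @ x).sum()

def extension(theta, num_samples=2000):
    """Monte Carlo estimate of E[f(X)] with X_i ~ Categorical(theta[i]).
    theta has shape (n, B + 1); each row is a probability vector."""
    total = 0.0
    for _ in range(num_samples):
        x = np.array([rng.choice(B + 1, p=theta[i]) for i in range(n)])
        total += f(x)
    return total / num_samples

def block_coordinate_ascent(theta, sweeps=2):
    """Naive block-coordinate ascent: one block = one coordinate's marginal.
    Each block update tries all one-hot choices and keeps the best."""
    for _ in range(sweeps):
        for i in range(n):
            best_val, best_row = -np.inf, None
            for k in range(B + 1):
                row = np.zeros(B + 1)
                row[k] = 1.0
                old = theta[i].copy()
                theta[i] = row
                val = extension(theta, num_samples=300)
                theta[i] = old
                if val > best_val:
                    best_val, best_row = val, row
            theta[i] = best_row
    return theta

theta = np.full((n, B + 1), 1.0 / (B + 1))   # uniform marginals to start
print("initial value:", extension(theta))
theta = block_coordinate_ascent(theta)
print("after block-coordinate ascent:", extension(theta))
```

Here a block is one coordinate's marginal distribution and each block update is a brute-force search over one-hot choices; the paper's algorithm instead comes with approximation guarantees for the variational objective.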
Related papers
- Neural Estimation of Submodular Functions with Applications to
Differentiable Subset Selection [50.14730810124592]
Submodular functions and variants, through their ability to characterize diversity and coverage, have emerged as a key tool for data selection and summarization.
We propose FLEXSUBNET, a family of flexible neural models for both monotone and non-monotone submodular functions.
arXiv Detail & Related papers (2022-10-20T06:00:45Z)
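
The FLEXSUBNET entry above concerns neural models of submodular functions. As a hedged, minimal illustration of the kind of object such models represent (not FLEXSUBNET's architecture or API), the sketch below builds a monotone submodular set function as a nondecreasing concave function composed with a nonnegative modular function and numerically checks the diminishing-returns property.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 5
w = rng.random(n)              # nonnegative modular (additive) weights

def f(S):
    """Concave o modular: a classical monotone submodular set function."""
    return np.sqrt(sum(w[i] for i in S))

# Check diminishing returns: f(A + e) - f(A) >= f(B + e) - f(B) for A <= B.
ground = set(range(n))
ok = True
for r in range(n + 1):
    for Bset in itertools.combinations(ground, r):
        Bset = set(Bset)
        for s in range(len(Bset) + 1):
            for Aset in itertools.combinations(sorted(Bset), s):
                Aset = set(Aset)
                for e in ground - Bset:
                    gain_A = f(Aset | {e}) - f(Aset)
                    gain_B = f(Bset | {e}) - f(Bset)
                    ok = ok and (gain_A >= gain_B - 1e-12)
print("diminishing returns holds:", ok)
```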
- Using Partial Monotonicity in Submodular Maximization [13.23676270963484]
We show that for many standard submodular algorithms one can prove new approximation guarantees that depend on the monotonicity ratio.
This leads to improved approximation ratios for the common machine learning applications of movie recommendation, quadratic programming and image summarization.
arXiv Detail & Related papers (2022-02-07T10:35:40Z)
- Submodularity In Machine Learning and Artificial Intelligence [0.0]
We offer a plethora of submodular definitions and a full description of example submodular functions and their generalizations.
We then turn to how submodularity is useful in machine learning and artificial intelligence.
arXiv Detail & Related papers (2022-01-31T22:41:35Z)
- Sparsification of Decomposable Submodular Functions [27.070004659301162]
Submodular functions are at the core of many machine learning and data mining tasks.
The number of underlying submodular functions in the original function can be so large that processing it takes a large amount of time and/or it does not even fit in main memory.
We introduce the notion of sparsification for decomposable submodular functions, whose objective is to obtain an accurate approximation of the original function as a (weighted) sum of only a few submodular functions.
arXiv Detail & Related papers (2022-01-18T20:05:25Z)
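
To make the sparsification notion above concrete, here is a hedged sketch: a decomposable function f(S) = sum_i f_i(S) with many components is approximated by a reweighted uniform subsample of its components. The component family and the uniform-sampling estimator are illustrative assumptions; the paper selects components and weights far more carefully.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 8, 1000                        # ground set size, number of components
weights = rng.random((N, n))

def f_i(i, S):
    """Component i: a simple concave-of-modular submodular function."""
    return np.sqrt(sum(weights[i, j] for j in S))

def f(S):
    return sum(f_i(i, S) for i in range(N))          # expensive: N terms

def sparsified(S, idx, scale):
    return scale * sum(f_i(i, S) for i in idx)       # cheap: |idx| terms

k = 50                                 # keep only k of the N components
idx = rng.choice(N, size=k, replace=False)
scale = N / k                          # reweight so the expectation matches

S = {0, 2, 5}
print("exact:      ", f(S))
print("sparsified: ", sparsified(S, idx, scale))
```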
- Multi-objective Evolutionary Algorithms are Generally Good: Maximizing
Monotone Submodular Functions over Sequences [44.11102526976392]
This paper studies the problem class of maximizing monotone submodular functions over sequences.
EAs can achieve good approximation guarantees for solving these classes of submodular optimization problems.
Empirical studies on various applications, e.g., accomplishing tasks, maximizing information gain, search-and-tracking and recommender systems, show the excellent performance of the GSEMO.
arXiv Detail & Related papers (2021-04-20T10:36:10Z)
- Learning Aggregation Functions [78.47770735205134]
We introduce LAF (Learning Aggregation Functions), a learnable aggregator for sets of arbitrary cardinality.
We report experiments on semi-synthetic and real data showing that LAF outperforms state-of-the-art sum- (max-) decomposition architectures.
arXiv Detail & Related papers (2020-12-15T18:28:53Z)
- Concave Aspects of Submodular Functions [0.0]
Submodular functions generalize several information-theoretic quantities such as entropy and mutual information.
Submodular functions also show signs of concavity.
We characterize the super-differentials and polyhedra associated with upper bounds and provide optimality conditions for submodular maximization using these differentials.
arXiv Detail & Related papers (2020-06-27T17:06:24Z)
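
As one concrete instance of the superdifferential machinery in the Concave Aspects entry above, the sketch below implements the two classical tight modular upper bounds of a submodular function at a set X and verifies them on a toy function. These bounds are a standard construction (used, e.g., in majorization-minimization for submodular optimization) offered here as an illustration, not the paper's exact development.

```python
import itertools
import numpy as np

# Classical modular upper bounds for a submodular f, tight at X:
#   m1_X(Y) = f(X) - sum_{j in X\Y} f(j | X\{j}) + sum_{j in Y\X} f(j | {})
#   m2_X(Y) = f(X) - sum_{j in X\Y} f(j | V\{j}) + sum_{j in Y\X} f(j | X)
# where f(j | S) = f(S + j) - f(S). The toy function below is illustrative.

rng = np.random.default_rng(3)
n = 5
V = set(range(n))
w = rng.random(n)

def f(S):
    return np.sqrt(sum(w[i] for i in S))   # monotone submodular toy function

def gain(j, S):
    return f(S | {j}) - f(S)

def m1(X, Y):
    return (f(X)
            - sum(gain(j, X - {j}) for j in X - Y)
            + sum(gain(j, set()) for j in Y - X))

def m2(X, Y):
    return (f(X)
            - sum(gain(j, V - {j}) for j in X - Y)
            + sum(gain(j, X) for j in Y - X))

X = {0, 2}
ok = True
for r in range(n + 1):
    for Y in itertools.combinations(range(n), r):
        Y = set(Y)
        ok = ok and f(Y) <= m1(X, Y) + 1e-12 and f(Y) <= m2(X, Y) + 1e-12
print("both modular upper bounds valid:", ok, "| tight at X:",
      np.isclose(m1(X, X), f(X)) and np.isclose(m2(X, X), f(X)))
```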
- Continuous Submodular Function Maximization [91.17492610120324]
Continuous submodularity is a class of functions with a wide spectrum of applications.
We identify several applications of continuous submodular optimization, ranging from influence maximization and MAP inference to mean-field inference.
arXiv Detail & Related papers (2020-06-24T04:37:31Z)
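
For the continuous submodular maximization entry above, here is a hedged sketch of the Frank-Wolfe ("continuous greedy") variant commonly used for monotone DR-submodular maximization over a down-closed convex set, where it attains a (1 - 1/e) approximation. The toy objective and the box-plus-budget constraint are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
W = rng.random((3, n))

def grad(x):
    """Gradient of g(x) = sum_j sqrt(W_j . x), a monotone DR-submodular toy."""
    return (W / (2.0 * np.sqrt(W @ x + 1e-12)[:, None])).sum(axis=0)

def lmo(g, budget=2):
    """Linear maximization oracle over {x in [0,1]^n : sum x <= budget}:
    put mass 1 on the coordinates with the largest positive gradient."""
    v = np.zeros(n)
    for i in np.argsort(-g)[:budget]:
        if g[i] > 0:
            v[i] = 1.0
    return v

T = 100                       # number of Frank-Wolfe steps
x = np.zeros(n)
for _ in range(T):
    x += lmo(grad(x)) / T     # move a 1/T fraction toward the LMO vertex
print("solution:", np.round(x, 3), "| value:", np.sum(np.sqrt(W @ x)))
```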
- Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)