Neural Estimation of Submodular Functions with Applications to
Differentiable Subset Selection
- URL: http://arxiv.org/abs/2210.11033v1
- Date: Thu, 20 Oct 2022 06:00:45 GMT
- Title: Neural Estimation of Submodular Functions with Applications to
Differentiable Subset Selection
- Authors: Abir De and Soumen Chakrabarti
- Abstract summary: Submodular functions and variants, through their ability to characterize diversity and coverage, have emerged as a key tool for data selection and summarization.
We propose FLEXSUBNET, a family of flexible neural models for both monotone and non-monotone submodular functions.
- Score: 50.14730810124592
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Submodular functions and variants, through their ability to characterize
diversity and coverage, have emerged as a key tool for data selection and
summarization. Many recent approaches to learn submodular functions suffer from
limited expressiveness. In this work, we propose FLEXSUBNET, a family of
flexible neural models for both monotone and non-monotone submodular functions.
To fit a latent submodular function from (set, value) observations, FLEXSUBNET
applies a concave function on modular functions in a recursive manner. We do
not draw the concave function from a restricted family, but rather learn from
data using a highly expressive neural network that implements a differentiable
quadrature procedure. Such an expressive neural model for concave functions may
be of independent interest. Next, we extend this setup to provide a novel
characterization of monotone \alpha-submodular functions, a recently introduced
notion of approximate submodular functions. We then use this characterization
to design a novel neural model for such functions. Finally, we consider
learning submodular set functions under distant supervision in the form of
(perimeter-set, high-value-subset) pairs. This yields a novel subset selection
method based on an order-invariant, yet greedy sampler built around the above
neural set functions. Our experiments on synthetic and real data show that
FLEXSUBNET outperforms several baselines.
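The recursive construction described in the abstract can be made concrete with a short sketch. The snippet below is a minimal illustration, not the paper's implementation: the exact recursion F_k(S) = phi(m_k(S) + F_{k-1}(S)), the fixed choice g(s) = exp(-s), and all function names are assumptions. It leans on two standard facts: phi(t) = \int_0^t g(s) ds is non-decreasing and concave whenever g is non-negative and non-increasing (presumably what a differentiable quadrature over a learned g exploits), and composing such a phi with a non-negative modular, or more generally monotone submodular, function preserves monotone submodularity.

```python
import numpy as np

def concave_from_quadrature(g, t, n_steps=64):
    """phi(t) = integral_0^t g(s) ds, approximated by the trapezoidal rule.

    If g is non-negative and non-increasing, phi is non-decreasing and
    concave. FLEXSUBNET learns g with a neural network; here g is fixed.
    """
    s = np.linspace(0.0, t, n_steps)
    vals = g(s)
    return float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(s)))

def recursive_concave_modular(weight_vectors, subset):
    """Toy recursion F_k(S) = phi(m_k(S) + F_{k-1}(S)) (assumed form).

    Each m_k is a non-negative modular function: a weight vector summed
    over the subset. Composing a non-decreasing concave phi with a
    monotone submodular argument preserves monotone submodularity, so
    every level of the recursion stays monotone submodular.
    """
    g = lambda s: np.exp(-s)  # non-negative and non-increasing
    idx = sorted(subset)
    value = 0.0
    for w in weight_vectors:
        modular = float(w[idx].sum()) if idx else 0.0  # m_k(S) >= 0
        value = concave_from_quadrature(g, modular + value)
    return value

rng = np.random.default_rng(0)
W = [rng.random(8) for _ in range(3)]  # three modular levels over 8 items
print(recursive_concave_modular(W, {1, 4, 6}))
```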
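For the subset-selection step, the abstract describes an order-invariant, differentiable greedy sampler built around the neural set functions. The sketch below shows only the classical deterministic marginal-gain greedy that such samplers relax, with a hypothetical coverage-style score standing in for the learned model; for a monotone submodular score this rule carries the well-known (1 - 1/e) approximation guarantee.

```python
import numpy as np

def greedy_select(score_fn, ground_set, budget):
    """Classical greedy: repeatedly add the item with the best marginal gain.

    FLEXSUBNET's sampler replaces this deterministic rule with a
    stochastic, order-invariant one so that selection stays
    differentiable; this is just the skeleton it builds on.
    """
    selected, remaining = [], set(ground_set)
    while remaining and len(selected) < budget:
        base = score_fn(selected)
        best = max(remaining, key=lambda e: score_fn(selected + [e]) - base)
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical coverage-style score: a concave function of a modular sum,
# hence monotone submodular.
w = np.random.default_rng(1).random(8)
score = lambda S: float(np.sqrt(w[list(S)].sum())) if S else 0.0
print(greedy_select(score, range(8), budget=3))
```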
Related papers
- Emergent Modularity in Pre-trained Transformers [127.08792763817496]
We consider two main characteristics of modularity: functional specialization of neurons and function-based neuron grouping.
We study how modularity emerges during pre-training, and find that the modular structure is stabilized at the early stage.
It suggests that Transformers first construct the modular structure and then learn fine-grained neuron functions.
arXiv Detail & Related papers (2023-05-28T11:02:32Z)
- Equivariance with Learned Canonicalization Functions [77.32483958400282]
We show that learning a small neural network to perform canonicalization is better than using predefined heuristics.
Our experiments show that learning the canonicalization function is competitive with existing techniques for learning equivariant functions across many tasks.
arXiv Detail & Related papers (2022-11-11T21:58:15Z)
- Submodularity In Machine Learning and Artificial Intelligence [0.0]
We offer a plethora of submodular definitions and a full description of example submodular functions and their generalizations.
We then turn to how submodularity is useful in machine learning and artificial intelligence.
arXiv Detail & Related papers (2022-01-31T22:41:35Z)
- Sparsification of Decomposable Submodular Functions [27.070004659301162]
Submodular functions are at the core of many machine learning and data mining tasks.
The number of underlying submodular functions in the original function is often so large that processing it takes a large amount of time and/or the function does not even fit in main memory.
We introduce the notion of sparsification for decomposable submodular functions, whose objective is to obtain an accurate approximation of the original function as a (weighted) sum of only a few submodular functions.
arXiv Detail & Related papers (2022-01-18T20:05:25Z)
- Modern Non-Linear Function-on-Function Regression [8.231050911072755]
We introduce a new class of non-linear function-on-function regression models for functional data using neural networks.
We give two model-fitting strategies, the Functional Direct Neural Network (FDNN) and the Functional Basis Neural Network (FBNN).
arXiv Detail & Related papers (2021-07-29T16:19:59Z)
- Submodular + Concave [53.208470310734825]
It has been well established that first-order optimization methods can converge to the maximal objective value of concave functions.
In this work, we initiate the study of maximizing functions of the form $F(x) = G(x) + C(x)$ over a convex body, where $G$ is a smooth DR-submodular function and $C$ is a smooth concave function.
This class of functions is an extension of both concave functions and continuous DR-submodular functions, for which no optimization guarantee was previously known.
arXiv Detail & Related papers (2021-06-09T01:59:55Z)
- Continuous Submodular Function Maximization [91.17492610120324]
Continuous submodular functions are a class of functions with a wide spectrum of applications.
We identify several applications of continuous submodular optimization, ranging from influence maximization and MAP inference to mean-field inference.
arXiv Detail & Related papers (2020-06-24T04:37:31Z)
- A Parameterized Family of Meta-Submodular Functions [3.233545237942899]
We give a broad parameterized family of monotone functions which includes submodular functions and a class of supermodular functions containing diversity functions.
We show that the $\gamma$-submodular families include well-known classes of functions such as meta-submodular functions, metric diversity functions, and submodular functions.
arXiv Detail & Related papers (2020-06-23T16:45:12Z)
- From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models [82.95892656532696]
Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
arXiv Detail & Related papers (2020-06-01T22:20:45Z)