The transport problem for non-additive measures
- URL: http://arxiv.org/abs/2211.12150v2
- Date: Wed, 23 Nov 2022 15:29:05 GMT
- Title: The transport problem for non-additive measures
- Authors: Vicenç Torra
- Abstract summary: Non-additive measures are more general than additive ones.
Non-additive measures have better modeling capabilities.
There is an increasing need to analyze non-additive measures.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Non-additive measures, also known as fuzzy measures, capacities, and
monotonic games, are increasingly used in different fields. Applications have
been built within computer science and artificial intelligence related to, e.g.,
decision making, image processing, and machine learning for both classification
and regression. Tools for measure identification have also been built. In short,
as non-additive measures are more general than additive ones (i.e., than
probabilities), they have better modeling capabilities, allowing them to model
situations and problems that cannot be modelled by the latter. See, e.g., the
application of non-additive measures and the Choquet integral to model both the
Ellsberg paradox and the Allais paradox.
Because of that, there is an increasing need to analyze non-additive
measures. The need for distances and similarities to compare them is no
exception. Some work has been done on defining $f$-divergences for them. In
this work we tackle the problem of defining the transport problem for
non-additive measures, which, to our knowledge, has not been considered until
now. Distances between pairs of probability distributions based on optimal
transport are widely used in practical applications, and their mathematical
properties are studied extensively. We consider it necessary to provide
appropriate definitions with a similar flavour, generalizing the standard ones,
for non-additive measures.
We provide definitions based on the Möbius transform, but also on the
$(\max, +)$-transform, which we consider to have some advantages. We discuss
the problems that arise in defining the transport problem for non-additive
measures, and ways to solve them. We provide the definitions of the optimal
transport problem and prove some properties.
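The Möbius transform mentioned in the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's construction: the ground set and measure values are invented for the example, and the measure is deliberately super-additive to show behaviour no probability can capture.

```python
from itertools import combinations

def subsets(s):
    # all subsets of the set s, as frozensets
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def mobius(mu, ground):
    # Moebius transform: m(A) = sum over B subset of A of (-1)^{|A|-|B|} mu(B)
    return {A: sum((-1) ** (len(A) - len(B)) * mu[B] for B in subsets(A))
            for A in subsets(ground)}

# toy non-additive (fuzzy) measure on {x, y}; note the super-additivity:
# mu({x, y}) > mu({x}) + mu({y}), which no additive measure can express
ground = frozenset({"x", "y"})
mu = {frozenset(): 0.0,
      frozenset({"x"}): 0.2,
      frozenset({"y"}): 0.3,
      ground: 1.0}
m = mobius(mu, ground)
# the Moebius mass on the full set is the interaction term: 1.0 - 0.2 - 0.3 = 0.5
```

The measure is recovered by summing the transform over subsets, μ(A) = Σ_{B⊆A} m(B); for an additive measure the Möbius mass concentrates on singletons, which is why the transform is a natural bridge between non-additive measures and distributions that transport plans can act on.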
Related papers
- Debiasing Machine Learning Models by Using Weakly Supervised Learning [3.3298048942057523]
We tackle the problem of bias mitigation of algorithmic decisions in a setting where both the output of the algorithm and the sensitive variable are continuous.
Typical examples are unfair decisions made with respect to age or financial status.
Our bias mitigation strategy is a weakly supervised learning method which requires that a small portion of the data can be measured in a fair manner.
arXiv Detail & Related papers (2024-02-23T18:11:32Z) - Tempered Calculus for ML: Application to Hyperbolic Model Embedding [74.82054459297169]
Most mathematical distortions used in ML are fundamentally integral in nature.
In this paper, we unveil a grounded theory and tools which can help improve these distortions to better cope with ML requirements.
We show how to apply it to a problem that has recently gained traction in ML: hyperbolic embeddings with a "cheap" and accurate encoding along the hyperbolic scale.
arXiv Detail & Related papers (2024-02-06T17:21:06Z) - Scalable Unbalanced Sobolev Transport for Measures on a Graph [23.99177001129992]
Optimal transport (OT) is a powerful tool for comparing probability measures.
OT suffers from a few drawbacks: (i) the input measures are required to have the same mass, (ii) a high computational complexity, and (iii) indefiniteness.
Le et al. (2022) recently proposed Sobolev transport for measures on a graph having the same total mass by leveraging the graph structure over supports.
We show that the proposed unbalanced Sobolev transport admits a closed-form formula for fast computation, and it is also negative definite.
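For balanced measures on a small discrete space, the classical OT problem that such transports accelerate can be written directly as a linear program. A minimal sketch with `scipy.optimize.linprog`; the two weight vectors and the ground cost are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# two probability vectors on 3 points, and ground cost C[i][j] = |i - j|
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.2, 0.6])
n = 3
C = np.abs(np.subtract.outer(np.arange(n, dtype=float), np.arange(n, dtype=float)))

# variables: transport plan P flattened row-major;
# constraints: row sums of P equal a, column sums equal b
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0   # row-sum constraint for source point i
    A_eq[n + i, i::n] = 1.0            # column-sum constraint for target point i

res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]), bounds=(0, None))
ot_cost = res.fun  # optimal transport cost (1-Wasserstein for this ground metric)
```

On the line, the optimum is the monotone rearrangement, so here the cost equals the area between the two cumulative distribution functions: |0.5-0.2| + |0.8-0.4| = 0.7.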
arXiv Detail & Related papers (2023-02-24T07:35:38Z) - involve-MI: Informative Planning with High-Dimensional Non-Parametric Beliefs [6.62472687864754]
We calculate an information-theoretic expected reward, mutual information (MI), over a much lower-dimensional subset of the state, to improve efficiency and without sacrificing accuracy.
We then develop an estimator for the MI which works in a Sequential Monte Carlo manner, and avoids the reconstruction of future belief's surfaces.
This work is then evaluated in a simulation of an active SLAM problem, where the improvement in both accuracy and timing is demonstrated.
arXiv Detail & Related papers (2022-09-23T13:51:36Z) - The Schrödinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of OT, also known as the Schrödinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
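Closed forms between Gaussians are familiar from static OT as well: in one dimension, the 2-Wasserstein distance between normals depends only on the means and standard deviations. A quick sketch of that classical static analogue (not the SB formula from the paper):

```python
import math

def w2_gauss_1d(m1, s1, m2, s2):
    # W2 between N(m1, s1^2) and N(m2, s2^2) on the line:
    # sqrt of squared mean gap plus squared standard-deviation gap
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

d = w2_gauss_1d(0.0, 1.0, 3.0, 5.0)  # sqrt(9 + 16) = 5.0
```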
arXiv Detail & Related papers (2022-02-11T15:59:01Z) - Causal Inference Under Unmeasured Confounding With Negative Controls: A
Minimax Learning Approach [84.29777236590674]
We study the estimation of causal parameters when not all confounders are observed and instead negative controls are available.
Recent work has shown how these can enable identification and efficient estimation via two so-called bridge functions.
arXiv Detail & Related papers (2021-03-25T17:59:19Z) - Sliced Multi-Marginal Optimal Transport [21.82052188474956]
We study multi-marginal optimal transport, a generalization of optimal transport that allows us to define discrepancies between multiple measures.
We show that computing the sliced multi-marginal discrepancy is massively scalable for a large number of probability measures with support as large as $10^7$ samples.
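The one-dimensional building block behind sliced transport is what makes it cheap: for equal-size empirical samples on the line, the Wasserstein-1 distance is just the mean absolute gap between sorted samples. A minimal sketch of that 1-D step (the random projections of the full sliced construction are omitted):

```python
def w1_empirical_1d(xs, ys):
    # W1 between two equal-size empirical measures on the line:
    # sort both samples, then average the pointwise gaps
    assert len(xs) == len(ys)
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(x - y) for x, y in zip(xs, ys)) / len(xs)

d = w1_empirical_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0])  # every point shifts by 1
```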
arXiv Detail & Related papers (2021-02-14T09:58:47Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
A principled way to model nonlinear geometric structure inherent in data is provided.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Fundamental Limits and Tradeoffs in Invariant Representation Learning [99.2368462915979]
Many machine learning applications involve learning representations that achieve two competing goals.
A minimax game-theoretic formulation represents a fundamental tradeoff between accuracy and invariance.
We provide an information-theoretic analysis of this general and important problem under both classification and regression settings.
arXiv Detail & Related papers (2020-12-19T15:24:04Z) - What can I do here? A Theory of Affordances in Reinforcement Learning [65.70524105802156]
We develop a theory of affordances for agents who learn and plan in Markov Decision Processes.
Affordances play a dual role in this case, by reducing the number of actions available in any given situation.
We propose an approach to learn affordances and use it to estimate transition models that are simpler and generalize better.
arXiv Detail & Related papers (2020-06-26T16:34:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.