Optimal Transport for Measures with Noisy Tree Metric
- URL: http://arxiv.org/abs/2310.13653v3
- Date: Fri, 1 Mar 2024 02:23:59 GMT
- Title: Optimal Transport for Measures with Noisy Tree Metric
- Authors: Tam Le, Truyen Nguyen, Kenji Fukumizu
- Abstract summary: We study the optimal transport problem for probability measures supported on a tree metric space.
The max-min robust OT approach for handling noisy tree metrics is, in general, hard to compute, even for measures supported on a one-dimensional space.
We show that the robust OT satisfies the metric property and is negative definite.
- Score: 29.950797721275574
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We study the optimal transport (OT) problem for probability measures
supported on a tree metric space. It is known that such an OT problem (i.e.,
tree-Wasserstein (TW)) admits a closed-form expression, but it depends
fundamentally on the underlying tree structure over the supports of the input
measures. In practice, however, the given tree structure may be perturbed due
to noisy or adversarial measurements. To mitigate this issue, we follow the
max-min robust OT approach, which considers the maximal possible distance
between two input measures over an uncertainty set of tree metrics. In general,
this approach is hard to compute, even for measures supported on a
one-dimensional space, due to its non-convexity and non-smoothness, which
hinders its practical applications, especially in large-scale settings. In this
work, we propose novel uncertainty sets of tree metrics through the lens of
edge deletion/addition, which cover a diverse range of tree structures within
an elegant framework. Consequently, by building upon the proposed uncertainty
sets and leveraging the tree structure over the supports, we show that the
robust OT also admits a closed-form expression that can be computed as fast as
its standard OT counterpart (i.e., TW). Furthermore, we demonstrate that the
robust OT satisfies the metric property and is negative definite. We then
exploit its negative definiteness to propose positive definite kernels and test
them in several simulations on various real-world datasets for document
classification and topological data analysis.
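The closed-form expression that the abstract refers to is the standard tree-Wasserstein formula: a sum, over the edges of the tree, of the edge length times the absolute difference of the two measures' masses in the subtree below that edge. The sketch below is a minimal illustration of that formula and of the exp(-t*d) kernel construction that negative definiteness permits (via Schoenberg's theorem). The toy tree, node names, and helper function are illustrative assumptions; this is not the paper's code and it does not implement the robust OT or its edge-deletion/addition uncertainty sets.

```python
# A minimal sketch, assuming a rooted tree given as parent->children lists with
# edge lengths; it only illustrates the standard closed-form TW and the
# exp(-t*d) kernel that negative definiteness makes positive definite.
import math

def tree_wasserstein(children, edge_length, root, mu, nu):
    """TW(mu, nu) = sum over edges e of w_e * |mu(subtree below e) - nu(subtree below e)|."""
    total = 0.0

    def subtree_mass(node):
        # Return (mu-mass, nu-mass) of the subtree rooted at `node`, adding each
        # edge's contribution w_e * |difference of subtree masses| on the way up.
        nonlocal total
        m, n = mu.get(node, 0.0), nu.get(node, 0.0)
        for child in children.get(node, []):
            cm, cn = subtree_mass(child)
            total += edge_length[(node, child)] * abs(cm - cn)
            m, n = m + cm, n + cn
        return m, n

    subtree_mass(root)
    return total

def tw_kernel(children, edge_length, root, mu, nu, t=1.0):
    # Schoenberg: exp(-t * d) is positive definite when d is negative definite.
    return math.exp(-t * tree_wasserstein(children, edge_length, root, mu, nu))

if __name__ == "__main__":
    # Toy rooted tree:  r -> a -> b  and  r -> c, with edge lengths below.
    children = {"r": ["a", "c"], "a": ["b"]}
    edge_length = {("r", "a"): 1.0, ("a", "b"): 0.5, ("r", "c"): 2.0}
    mu = {"b": 1.0}             # all mass on leaf b
    nu = {"r": 0.5, "c": 0.5}   # mass split between the root and leaf c
    print(tree_wasserstein(children, edge_length, "r", mu, nu))  # 0.5 + 1.0 + 1.0 = 2.5
    print(tw_kernel(children, edge_length, "r", mu, nu, t=0.1))
```

The paper's contribution is that a similarly simple closed form holds for the robust OT over its proposed uncertainty sets; the sketch above covers only the standard TW case.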
Related papers
- Fast unsupervised ground metric learning with tree-Wasserstein distance [14.235762519615175]
Unsupervised ground metric learning approaches have been introduced.
We propose to augment the WSV method by embedding samples and features on trees, on which we compute the tree-Wasserstein distance (TWD).
We demonstrate theoretically and empirically that the algorithm converges to a better approximation of the full WSV approach than the best known alternatives, and does so with $\mathcal{O}(n^3)$ complexity.
arXiv Detail & Related papers (2024-11-11T23:21:01Z)
- Identifying General Mechanism Shifts in Linear Causal Representations [58.6238439611389]
We consider the linear causal representation learning setting where we observe a linear mixing of $d$ unknown latent factors.
Recent work has shown that it is possible to recover the latent factors as well as the underlying structural causal model over them.
We provide a surprising identifiability result: under some very mild standard assumptions, it is indeed possible to identify the set of shifted nodes.
arXiv Detail & Related papers (2024-10-31T15:56:50Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Topology-Aware Uncertainty for Image Segmentation [19.248891926246383]
We focus on uncertainty estimation for such segmentation tasks, so that highly uncertain, and thus error-prone, structures can be identified for human annotators to verify.
We propose a joint prediction model that estimates the uncertainty of a structure while taking the neighboring structures into consideration.
We also propose a novel Probabilistic DMT to model the inherent uncertainty within each structure.
arXiv Detail & Related papers (2023-06-09T05:01:55Z)
- Learning Ultrametric Trees for Optimal Transport Regression [10.524752369156337]
We find an optimal tree structure for a given discrete metric space.
One of our key ideas is to cast the problem in ultrametric spaces.
arXiv Detail & Related papers (2022-10-21T22:54:42Z)
- Lassoed Tree Boosting [53.56229983630983]
We prove that a gradient boosted tree algorithm with early stopping achieves faster than $n^{-1/4}$ L2 convergence in the large nonparametric space of càdlàg functions of bounded sectional variation.
Our convergence proofs are based on a novel, general theorem on early stopping with empirical loss minimizers of nested Donsker classes.
arXiv Detail & Related papers (2022-05-22T00:34:41Z)
- Spectral Top-Down Recovery of Latent Tree Models [13.681975313065477]
Spectral Top-Down Recovery (STDR) is a divide-and-conquer approach for inference of large latent tree models.
STDR's partitioning step is non-random. Instead, it is based on the Fiedler vector of a suitable Laplacian matrix related to the observed nodes (a minimal Fiedler-vector sketch is given after this list).
We prove that STDR is statistically consistent, and bound the number of samples required to accurately recover the tree with high probability.
arXiv Detail & Related papers (2021-02-26T02:47:42Z)
- Entropy Partial Transport with Tree Metrics: Theory and Practice [5.025654873456756]
We consider an entropy partial transport (EPT) problem for nonnegative measures on a tree having different masses.
We propose a novel regularization for EPT which admits fast computation and negative definiteness.
We empirically demonstrate that our regularization also provides effective approximations.
arXiv Detail & Related papers (2021-01-24T17:04:24Z)
- SGA: A Robust Algorithm for Partial Recovery of Tree-Structured Graphical Models with Noisy Samples [75.32013242448151]
We consider learning Ising tree models when the observations from the nodes are corrupted by independent but non-identically distributed noise.
Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a partial tree structure.
We propose Symmetrized Geometric Averaging (SGA), a more statistically robust algorithm for partial tree recovery.
arXiv Detail & Related papers (2021-01-22T01:57:35Z)
- Convex Polytope Trees [57.56078843831244]
Convex polytope trees (CPT) are proposed to expand the family of decision trees by an interpretable generalization of their decision boundary.
We develop a greedy method to efficiently construct CPT and scalable end-to-end training algorithms for the tree parameters when the tree structure is given.
arXiv Detail & Related papers (2020-10-21T19:38:57Z)
- $\gamma$-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator [95.71091446753414]
We propose to use a nearest-neighbor-based $\gamma$-divergence estimator as a data discrepancy measure.
Our method achieves significantly higher robustness than existing discrepancy measures.
arXiv Detail & Related papers (2020-06-13T06:09:27Z)
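As referenced in the Spectral Top-Down Recovery entry above, the partitioning step relies on the Fiedler vector of a graph Laplacian. Below is a minimal sketch of generic Fiedler-vector (spectral) bipartitioning under an assumed pairwise-similarity input; the similarity matrix and the sign-based split are illustrative assumptions, not the STDR algorithm itself, which builds its Laplacian from a specific matrix related to the observed nodes.

```python
# A minimal sketch of Fiedler-vector partitioning: split a weighted graph into
# two groups by the sign of the eigenvector of the second-smallest eigenvalue
# of its (unnormalized) Laplacian.  The toy similarity matrix is hypothetical.
import numpy as np

def fiedler_partition(similarity):
    """similarity: (n, n) symmetric nonnegative array; returns two index arrays."""
    W = np.asarray(similarity, dtype=float)
    D = np.diag(W.sum(axis=1))          # degree matrix
    L = D - W                           # unnormalized graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)
    fiedler = eigvecs[:, 1]             # Fiedler vector (second-smallest eigenvalue)
    left = np.where(fiedler < 0)[0]
    right = np.where(fiedler >= 0)[0]
    return left, right

if __name__ == "__main__":
    # Two size-3 cliques joined by weak cross-edges; the split recovers them.
    A = np.ones((3, 3)) - np.eye(3)
    W = np.block([[A, 0.05 * np.ones((3, 3))],
                  [0.05 * np.ones((3, 3)), A]])
    print(fiedler_partition(W))  # e.g. (array([0, 1, 2]), array([3, 4, 5])), up to sign
```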