Distance Metric Learning through Minimization of the Free Energy
- URL: http://arxiv.org/abs/2106.05495v1
- Date: Thu, 10 Jun 2021 04:54:25 GMT
- Title: Distance Metric Learning through Minimization of the Free Energy
- Authors: Dusan Stosic, Darko Stosic, Teresa B. Ludermir, Borko Stosic
- Abstract summary: We present a simple approach based on concepts from statistical physics to learn an optimal distance metric for a given problem.
As with many problems in physics, we propose an approach based on Metropolis Monte Carlo to find the best distance metric.
Our proposed method can handle a wide variety of constraints including those with spurious local minima.
- Score: 0.825845106786193
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Distance metric learning has attracted a lot of interest for solving machine
learning and pattern recognition problems over the past few decades. In this work
we present a simple approach based on concepts from statistical physics to
learn an optimal distance metric for a given problem. We formulate the task as a
typical statistical physics problem: distances between patterns represent
constituents of a physical system and the objective function corresponds to
energy. Then we express the problem as a minimization of the free energy of a
complex system, which is equivalent to distance metric learning. As with many
problems in physics, we propose an approach based on Metropolis Monte
Carlo to find the best distance metric. This provides a natural way to learn
the distance metric, where the learning process can be intuitively seen as
stretching and rotating the metric space until some heuristic is satisfied. Our
proposed method can handle a wide variety of constraints including those with
spurious local minima. The approach works surprisingly well with stochastic
nearest neighbors from neighborhood component analysis (NCA). Experimental
results on artificial and real-world data sets reveal a clear superiority over
a number of state-of-the-art distance metric learning methods for nearest
neighbors classification.
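The abstract describes the learning loop only at a high level. The following is a minimal, hypothetical sketch of the idea under stated assumptions: a linear (Mahalanobis-style) transform A is perturbed at random, which corresponds to the "stretching and rotating" of the metric space, the NCA stochastic-nearest-neighbor objective plays the role of (negative) energy, and proposals are accepted with the standard Metropolis criterion at temperature T. The function names, Gaussian proposal, and hyperparameters (T, sigma, steps) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nca_objective(A, X, y):
    # NCA objective: expected number of correctly classified points under
    # stochastic nearest neighbors in the transformed space Z = X A^T.
    Z = X @ A.T
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a point never picks itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)       # neighbor-selection probabilities
    same = (y[:, None] == y[None, :])
    return (P * same).sum()

def metropolis_metric_learning(X, y, steps=2000, T=0.1, sigma=0.05, seed=0):
    # Hypothetical sketch: treat minus the objective as energy and run
    # Metropolis Monte Carlo over linear transforms A of the input space.
    rng = np.random.default_rng(seed)
    A = np.eye(X.shape[1])
    f = nca_objective(A, X, y)
    best_A, best_f = A.copy(), f
    for _ in range(steps):
        # Random perturbation of A ("stretch and rotate" proposal).
        A_new = A + sigma * rng.standard_normal(A.shape)
        f_new = nca_objective(A_new, X, y)
        # Accept if the objective improves, else with Boltzmann probability;
        # occasional downhill moves help escape spurious local minima.
        if f_new >= f or rng.random() < np.exp((f_new - f) / T):
            A, f = A_new, f_new
            if f > best_f:
                best_A, best_f = A.copy(), f
    return best_A, best_f
```

Lowering T over the run (simulated annealing) would recover the usual free-energy-minimization picture, with T controlling the entropy/energy trade-off.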
Related papers
- Computing distances and means on manifolds with a metric-constrained Eikonal approach [4.266376725904727]
We introduce the metric-constrained Eikonal solver to obtain continuous, differentiable representations of distance functions.
The differentiable nature of these representations allows for the direct computation of globally length-minimising paths on the manifold.
arXiv Detail & Related papers (2024-04-12T18:26:32Z)
- Approximation Theory, Computing, and Deep Learning on the Wasserstein Space [0.5735035463793009]
We address the challenge of approximating functions in infinite-dimensional spaces from finite samples.
Our focus is on the Wasserstein distance function, which serves as a relevant example.
We adopt three machine learning-based approaches to define functional approximants.
arXiv Detail & Related papers (2023-10-30T13:59:47Z)
- Vanishing Point Estimation in Uncalibrated Images with Prior Gravity Direction [82.72686460985297]
We tackle the problem of estimating a Manhattan frame.
We derive two new 2-line solvers, one of which does not suffer from singularities affecting existing solvers.
We also design a new non-minimal method, running on an arbitrary number of lines, to boost the performance in local optimization.
arXiv Detail & Related papers (2023-08-21T13:03:25Z)
- Learning the solution operator of two-dimensional incompressible Navier-Stokes equations using physics-aware convolutional neural networks [68.8204255655161]
We introduce a technique with which it is possible to learn approximate solutions to the steady-state Navier--Stokes equations in varying geometries without the need of parametrization.
The results of our physics-aware CNN are compared to a state-of-the-art data-based approach.
arXiv Detail & Related papers (2023-08-04T05:09:06Z)
- Learning governing physics from output only measurements [0.0]
We propose a novel framework for learning the governing physics of a dynamical system from output-only measurements.
In particular, we combine a sparsity-promoting spike-and-slab prior, Bayes' law, and the Euler-Maruyama scheme to identify the governing physics from data.
arXiv Detail & Related papers (2022-08-11T02:24:03Z)
- Neural Bregman Divergences for Distance Learning [60.375385370556145]
We propose a new approach to learning arbitrary Bregman divergences in a differentiable manner via input convex neural networks.
We show that our method more faithfully learns divergences over a set of both new and previously studied tasks.
Our tests further extend to known asymmetric, but non-Bregman tasks, where our method still performs competitively despite misspecification.
arXiv Detail & Related papers (2022-06-09T20:53:15Z)
- Fast Uncertainty Quantification for Deep Object Pose Estimation [91.09217713805337]
Deep learning-based object pose estimators are often unreliable and overconfident.
In this work, we propose a simple, efficient, and plug-and-play UQ method for 6-DoF object pose estimation.
arXiv Detail & Related papers (2020-11-16T06:51:55Z)
- Provably Robust Metric Learning [98.50580215125142]
We show that existing metric learning algorithms can result in metrics that are less robust than the Euclidean distance.
We propose a novel metric learning algorithm to find a Mahalanobis distance that is robust against adversarial perturbations.
Experimental results show that the proposed metric learning algorithm improves both certified robust errors and empirical robust errors.
arXiv Detail & Related papers (2020-06-12T09:17:08Z)
- Towards Certified Robustness of Distance Metric Learning [53.96113074344632]
We advocate imposing an adversarial margin in the input space so as to improve the generalization and robustness of metric learning algorithms.
We show that the enlarged margin is beneficial to the generalization ability by using the theoretical technique of algorithmic robustness.
arXiv Detail & Related papers (2020-06-10T16:51:53Z)
- Project and Forget: Solving Large-Scale Metric Constrained Problems [7.381113319198104]
Given a set of dissimilarity measurements amongst data points, determining what metric representation is most "consistent" with the input measurements is a key step in many machine learning algorithms.
Existing methods are restricted to specific kinds of metrics or small problem sizes because of the large number of metric constraints in such problems.
In this paper, we provide an active-set method, Project and Forget, that uses Bregman projections to solve metric-constrained problems with many (possibly exponentially many) inequality constraints.
arXiv Detail & Related papers (2020-05-08T04:50:54Z)
- Metric Learning for Ordered Labeled Trees with pq-grams [11.284638114256712]
We propose a new metric learning approach for tree-structured data with pq-grams.
The pq-gram distance is a distance for ordered labeled trees, and has much lower computation cost than the tree edit distance.
We empirically show that the proposed approach achieves competitive results with the state-of-the-art edit distance-based methods.
arXiv Detail & Related papers (2020-03-09T08:04:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.