Learning Inconsistent Preferences with Gaussian Processes
- URL: http://arxiv.org/abs/2006.03847v3
- Date: Thu, 27 Jan 2022 21:15:52 GMT
- Title: Learning Inconsistent Preferences with Gaussian Processes
- Authors: Siu Lun Chau, Javier González, Dino Sejdinovic
- Abstract summary: We revisit widely used preferential Gaussian processes by Chu et al. (2005) and challenge their modelling assumption that imposes rankability of data items via latent utility function values.
We propose a generalisation of pgp that can capture more expressive latent preferential structures in the data.
Our experimental findings support the conjecture that violations of rankability are ubiquitous in real-world preferential data.
- Score: 14.64963271587818
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We revisit widely used preferential Gaussian processes by Chu et al. (2005)
and challenge their modelling assumption that imposes rankability of data items
via latent utility function values. We propose a generalisation of pgp which
can capture more expressive latent preferential structures in the data and thus
be used to model inconsistent preferences, i.e. where transitivity is violated,
or to discover clusters of comparable items via spectral decomposition of the
learned preference functions. We also consider the properties of the associated
covariance kernel functions and their reproducing kernel Hilbert space (RKHS),
giving a simple construction that satisfies universality in the space of
preference functions. Finally, we provide an extensive set of numerical
experiments on simulated and real-world datasets showcasing the competitiveness
of our proposed method with the state of the art. Our experimental findings support
the conjecture that violations of rankability are ubiquitous in real-world
preferential data.
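The abstract's central conjecture, that real-world comparison data often violates rankability, can be made concrete with a small sketch (not from the paper; the win-matrix encoding and helper function below are illustrative assumptions): an inconsistent preference set contains cyclic triples, where a is preferred to b, b to c, and c to a, so no single latent utility ranking can explain all three comparisons.

```python
import itertools
import numpy as np

def count_intransitive_triples(wins):
    """Count cyclic triples (a beats b, b beats c, c beats a) in a
    pairwise-preference matrix, where wins[i, j] = 1 means item i
    was preferred to item j."""
    n = wins.shape[0]
    cycles = 0
    for a, b, c in itertools.combinations(range(n), 3):
        # a triple is intransitive iff its three outcomes form a cycle
        if (wins[a, b] and wins[b, c] and wins[c, a]) or \
           (wins[b, a] and wins[c, b] and wins[a, c]):
            cycles += 1
    return cycles

# rock-paper-scissors style preferences: 0 > 1, 1 > 2, 2 > 0
wins = np.zeros((3, 3), dtype=int)
wins[0, 1] = wins[1, 2] = wins[2, 0] = 1
print(count_intransitive_triples(wins))  # → 1
```

Any data set with a nonzero cycle count cannot be fitted exactly by a utility-based model such as the original pgp, which is precisely the limitation the proposed generalisation targets.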
Related papers
- Functional Linear Non-Gaussian Acyclic Model for Causal Discovery [7.303542369216906]
We develop a framework to identify causal relationships in brain effective connectivity tasks involving fMRI and EEG datasets.
We establish theoretical guarantees of the identifiability of the causal relationship among non-Gaussian random vectors and even random functions in infinite-dimensional Hilbert spaces.
For real data, we focus on analyzing the brain connectivity patterns derived from fMRI data.
arXiv Detail & Related papers (2024-01-17T23:27:48Z)
- Manifold Learning with Sparse Regularised Optimal Transport [0.17205106391379024]
Real-world datasets are subject to noisy observations and sampling, so that distilling information about the underlying manifold is a major challenge.
We propose a method for manifold learning that utilises a symmetric version of optimal transport with a quadratic regularisation.
We prove that the resulting kernel is consistent with a Laplace-type operator in the continuous limit, establish robustness to heteroskedastic noise and exhibit these results in simulations.
arXiv Detail & Related papers (2023-07-19T08:05:46Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z)
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- Data-Driven Reachability Analysis and Support Set Estimation with Christoffel Functions [8.183446952097528]
We present algorithms for estimating the forward reachable set of a dynamical system.
The produced estimate is the sublevel set of a function called an empirical inverse Christoffel function.
In addition to reachability analysis, the same approach can be applied to general problems of estimating the support of a random variable.
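The support-estimation idea above can be sketched in a simplified 1-D, monomial-basis form (this is an illustrative construction, not the paper's implementation; the degree, sample size, and regularisation constant are assumptions): the empirical inverse Christoffel function C(x) = v(x)ᵀ M⁻¹ v(x), built from the empirical moment matrix M of the samples, stays small on the support and blows up away from it.

```python
import numpy as np

def empirical_inverse_christoffel(samples, degree=4):
    """Fit a 1-D empirical inverse Christoffel function
    C(x) = v(x)^T M^{-1} v(x), with v the monomial feature map
    [1, x, ..., x^degree] and M the empirical moment matrix."""
    def features(x):
        return np.vander(np.atleast_1d(x), degree + 1, increasing=True)
    V = features(samples)                          # (N, degree+1)
    M = V.T @ V / len(samples)                     # empirical moment matrix
    M_inv = np.linalg.inv(M + 1e-10 * np.eye(degree + 1))
    def C(x):
        v = features(x)
        return np.einsum('ij,jk,ik->i', v, M_inv, v)
    return C

rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=500)         # true support is [-1, 1]
C = empirical_inverse_christoffel(samples)
# points inside the support score low; points far outside score high
print(C(np.array([0.0]))[0] < C(np.array([3.0]))[0])  # → True
```

A sublevel set {x : C(x) ≤ τ} for a suitable threshold τ then serves as the support (or reachable-set) estimate.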
arXiv Detail & Related papers (2021-12-18T20:25:34Z)
- Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
- High-dimensional Functional Graphical Model Structure Learning via Neighborhood Selection Approach [15.334392442475115]
We propose a neighborhood selection approach to estimate the structure of functional graphical models.
We thus circumvent the need for a well-defined precision operator that may not exist when the functions are infinite dimensional.
arXiv Detail & Related papers (2021-05-06T07:38:50Z)
- Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable semi-implicit variational inference (SIVI) method.
Our method optimizes a rigorous lower bound on the evidence with gradient-based updates.
arXiv Detail & Related papers (2021-01-15T11:39:09Z)
- Fuzzy Integral = Contextual Linear Order Statistic [0.0]
The fuzzy integral is a powerful parametric nonlinear function with utility in a wide range of applications.
We show that it can be represented by a set of contextual linear order statistics.
arXiv Detail & Related papers (2020-07-06T16:37:36Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.