Latent Variable Multi-output Gaussian Processes for Hierarchical Datasets
- URL: http://arxiv.org/abs/2308.16822v1
- Date: Thu, 31 Aug 2023 15:52:35 GMT
- Title: Latent Variable Multi-output Gaussian Processes for Hierarchical Datasets
- Authors: Chunchao Ma, Arthur Leroy, Mauricio Alvarez
- Abstract summary: Multi-output Gaussian processes (MOGPs) have been introduced to deal with multiple tasks by exploiting the correlations between different outputs.
This paper proposes an extension of MOGPs for hierarchical datasets.
- Score: 0.8057006406834466
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Multi-output Gaussian processes (MOGPs) have been introduced to deal with
multiple tasks by exploiting the correlations between different outputs.
Generally, MOGP models assume a flat correlation structure between the
outputs. However, such a formulation does not account for more elaborate
relationships, for instance, if several replicates were observed for each
output (which is a typical setting in biological experiments). This paper
proposes an extension of MOGPs for hierarchical datasets (i.e. datasets for
which the relationships between observations can be represented within a tree
structure). Our model defines a tailored kernel function accounting for
hierarchical structures in the data to capture different levels of correlations
while leveraging the introduction of latent variables to express the underlying
dependencies between outputs through a dedicated kernel. This latter feature is
expected to significantly improve scalability as the number of tasks increases.
An extensive experimental study involving both synthetic and real-world data
from genomics and motion capture is proposed to support our claims.
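The tailored kernel described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the squared-exponential form chosen for every component, and the specific combination (a latent-variable kernel over outputs multiplying a shared input kernel, plus a replicate-level term switched on only within the same branch of the tree) are assumptions made for the sketch.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel between two sets of points."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def hierarchical_mogp_kernel(X, outputs, replicates, H,
                             ls_x=1.0, ls_h=1.0, ls_rep=1.0):
    """Covariance over observations indexed by (input, output, replicate).

    X          : (N, D) input locations
    outputs    : (N,)  output index of each observation
    replicates : (N,)  replicate index within each output
    H          : (num_outputs, Q) latent variable per output; a kernel on
                 these latent variables replaces a full coregionalization
                 matrix, which is what helps scalability in the task count.
    """
    # Output-level correlation expressed through the latent variables.
    K_out = rbf(H[outputs], H[outputs], ls_h)
    # Shared input kernel across all outputs.
    K_x = rbf(X, X, ls_x)
    # Replicate-level term: extra covariance only between observations in
    # the same output AND the same replicate (the tree structure).
    same_branch = ((outputs[:, None] == outputs[None, :])
                   & (replicates[:, None] == replicates[None, :]))
    K_rep = rbf(X, X, ls_rep) * same_branch
    return K_out * K_x + K_rep
```

Each term is positive semi-definite (Hadamard products and sums of PSD matrices remain PSD), so the combination is a valid covariance function.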
Related papers
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- iSCAN: Identifying Causal Mechanism Shifts among Nonlinear Additive Noise Models [48.33685559041322]
This paper focuses on identifying the causal mechanism shifts in two or more related datasets over the same set of variables.
Code implementing the proposed method is open-source and publicly available at https://github.com/kevinsbello/iSCAN.
arXiv Detail & Related papers (2023-06-30T01:48:11Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Factorized Fusion Shrinkage for Dynamic Relational Data [16.531262817315696]
We consider a factorized fusion shrinkage model in which all decomposed factors are dynamically shrunk towards group-wise fusion structures.
The proposed priors enjoy many favorable properties in comparison and clustering of the estimated dynamic latent factors.
We present a structured mean-field variational inference framework that balances optimal posterior inference with computational scalability.
arXiv Detail & Related papers (2022-09-30T21:03:40Z)
- Amortised Inference in Structured Generative Models with Explaining Away [16.92791301062903]
We extend the output of amortised variational inference to incorporate structured factors over multiple variables.
We show that appropriately parameterised factors can be combined efficiently with variational message passing in elaborate graphical structures.
We then fit the structured model to high-dimensional neural spiking time-series from the hippocampus of freely moving rodents.
arXiv Detail & Related papers (2022-09-12T12:52:15Z)
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
- BCDAG: An R package for Bayesian structure and Causal learning of Gaussian DAGs [77.34726150561087]
We introduce the R package for causal discovery and causal effect estimation from observational data.
Our implementation scales efficiently with the number of observations and, whenever the DAGs are sufficiently sparse, the number of variables in the dataset.
We then illustrate the main functions and algorithms on both real and simulated datasets.
arXiv Detail & Related papers (2022-01-28T09:30:32Z)
- Scalable Gaussian Processes for Data-Driven Design using Big Data with Categorical Factors [14.337297795182181]
Gaussian processes (GP) have difficulties in accommodating big datasets, categorical inputs, and multiple responses.
We propose a GP model that utilizes latent variables and functions obtained through variational inference to address the aforementioned challenges simultaneously.
Our approach is demonstrated for machine learning of ternary oxide materials and topology optimization of a multiscale compliant mechanism.
arXiv Detail & Related papers (2021-06-26T02:17:23Z)
- Multi-task Causal Learning with Gaussian Processes [17.205106391379026]
This paper studies the problem of learning the correlation structure of a set of intervention functions defined on the directed acyclic graph (DAG) of a causal model.
We propose the first multi-task causal Gaussian process (GP) model, which allows for information sharing across continuous interventions and experiments on different variables.
arXiv Detail & Related papers (2020-09-27T11:33:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.