Learning Functional Graphs with Nonlinear Sufficient Dimension Reduction
- URL: http://arxiv.org/abs/2601.15696v1
- Date: Thu, 22 Jan 2026 06:48:37 GMT
- Title: Learning Functional Graphs with Nonlinear Sufficient Dimension Reduction
- Authors: Kyongwon Kim, Bing Li
- Abstract summary: We introduce a nonparametric functional graphical model based on functional sufficient dimension reduction. Our method not only relaxes the Gaussian or copula Gaussian assumptions, but also enhances estimation accuracy.
- Score: 6.717732591785908
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Functional graphical models have undergone extensive development in recent years, leading to a variety of models such as the functional Gaussian graphical model, the functional copula Gaussian graphical model, the functional Bayesian graphical model, the nonparametric functional additive graphical model, and the conditional functional graphical model. These models rely either on some parametric form of the distribution of the random functions, or on additive conditional independence, a criterion that differs from probabilistic conditional independence. In this paper we introduce a nonparametric functional graphical model based on functional sufficient dimension reduction. Our method not only relaxes the Gaussian or copula Gaussian assumptions, but also enhances estimation accuracy by avoiding the ``curse of dimensionality''. Moreover, it retains probabilistic conditional independence as the criterion for determining the absence of edges. Through a simulation study and an analysis of an fMRI dataset, we demonstrate the advantages of our method.
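To make the overall pipeline concrete, here is a minimal, self-contained sketch of the general idea behind functional graphical modeling: reduce each node's random function to a low-dimensional score, then decide which edges are absent via a conditional-independence criterion. This is an illustration only, not the paper's method: the dimension reduction here is plain FPCA (leading principal component of discretized curves), and the edge criterion is a Gaussian partial-correlation shortcut rather than the fully nonparametric sufficient-dimension-reduction criterion the paper develops. All variable names and thresholds are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate functional data for 3 nodes on a common time grid.
# Latent chain z0 -> z1 -> z2, so z0 and z2 are conditionally
# independent given z1: the edge (0, 2) should be absent.
n, T = 500, 50
grid = np.linspace(0.0, 1.0, T)
phi = np.sin(np.pi * grid)                       # one shared basis curve

z = np.empty((n, 3))
z[:, 0] = rng.standard_normal(n)
z[:, 1] = 0.8 * z[:, 0] + 0.6 * rng.standard_normal(n)
z[:, 2] = 0.8 * z[:, 1] + 0.6 * rng.standard_normal(n)

X = z[:, :, None] * phi[None, None, :]           # shape (n, 3, T)
X += 0.05 * rng.standard_normal(X.shape)         # measurement noise

# --- Step 1: reduce each curve to a scalar score via FPCA
# (leading principal component of the centered, discretized curves).
scores = np.empty((n, 3))
for j in range(3):
    Xc = X[:, j, :] - X[:, j, :].mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores[:, j] = Xc @ Vt[0]

# --- Step 2: decide edges from partial correlations of the scores
# (a Gaussian working criterion, standing in for the nonparametric
# conditional-independence criterion used in the paper).
prec = np.linalg.inv(np.cov(scores, rowvar=False))
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)

edges = [(i, j) for i in range(3) for j in range(i + 1, 3)
         if abs(partial_corr[i, j]) > 0.1]
print(edges)
```

With the chain structure above, the recovered edge set should contain (0, 1) and (1, 2) but not (0, 2), since the partial correlation between nodes 0 and 2 given node 1 is zero in the population.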
Related papers
- Bayesian Kernel Regression for Functional Data [1.4501446815590895]
In supervised learning, the output variable to be predicted is often represented as a function. We propose a novel functional output regression model based on kernel methods.
arXiv Detail & Related papers (2025-03-17T19:28:27Z) - On Sufficient Graphical Models [4.279157560953137]
We introduce a sufficient graphical model by applying the recently developed nonlinear sufficient dimension reduction techniques.
We develop the population-level properties, convergence rate, and variable selection consistency of our estimate.
arXiv Detail & Related papers (2023-07-10T05:30:14Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Numerically Stable Sparse Gaussian Processes via Minimum Separation
using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
arXiv Detail & Related papers (2022-10-14T15:20:17Z) - Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstruction of the thermodynamic functions and phase boundaries in two-parametric statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
arXiv Detail & Related papers (2022-05-18T17:11:23Z) - An additive graphical model for discrete data [6.821476515155997]
We introduce a nonparametric graphical model for discrete node variables based on additive conditional independence.
We exploit the properties of discrete random variables to uncover a deeper relation between additive conditional independence and conditional independence than previously known.
arXiv Detail & Related papers (2021-12-29T17:48:12Z) - Learning PSD-valued functions using kernel sums-of-squares [94.96262888797257]
We introduce a kernel sum-of-squares model for functions that take values in the PSD cone.
We show that it constitutes a universal approximator of PSD functions, and derive eigenvalue bounds in the case of subsampled equality constraints.
We then apply our results to modeling convex functions, by enforcing a kernel sum-of-squares representation of their Hessian.
arXiv Detail & Related papers (2021-11-22T16:07:50Z) - High-dimensional Functional Graphical Model Structure Learning via
Neighborhood Selection Approach [15.334392442475115]
We propose a neighborhood selection approach to estimate the structure of functional graphical models.
We thus circumvent the need for a well-defined precision operator that may not exist when the functions are infinite dimensional.
arXiv Detail & Related papers (2021-05-06T07:38:50Z) - Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We provide a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z) - Non-parametric Models for Non-negative Functions [48.7576911714538]
We provide the first model for non-negative functions that retains the good properties of linear models.
We prove that it admits a representer theorem and provide an efficient dual formulation for convex problems.
arXiv Detail & Related papers (2020-07-08T07:17:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.