Simplicial Gaussian Models: Representation and Inference
- URL: http://arxiv.org/abs/2510.12983v1
- Date: Tue, 14 Oct 2025 20:51:56 GMT
- Title: Simplicial Gaussian Models: Representation and Inference
- Authors: Lorenzo Marinucci, Gabriele D'Acunto, Paolo Di Lorenzo, Sergio Barbarossa
- Abstract summary: We propose the simplicial Gaussian model (SGM), which extends Gaussian PGM to simplicial complexes. Our model builds upon discrete Hodge theory and incorporates uncertainty at every topological level through independent random components. Motivated by applications, we focus on the marginal edge-level distribution while treating node- and triangle-level variables as latent.
- Score: 13.687470962704744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic graphical models (PGMs) are powerful tools for representing statistical dependencies through graphs in high-dimensional systems. However, they are limited to pairwise interactions. In this work, we propose the simplicial Gaussian model (SGM), which extends Gaussian PGM to simplicial complexes. SGM jointly models random variables supported on vertices, edges, and triangles, within a single parametrized Gaussian distribution. Our model builds upon discrete Hodge theory and incorporates uncertainty at every topological level through independent random components. Motivated by applications, we focus on the marginal edge-level distribution while treating node- and triangle-level variables as latent. We then develop a maximum-likelihood inference algorithm to recover the parameters of the full SGM and the induced conditional dependence structure. Numerical experiments on synthetic simplicial complexes with varying size and sparsity confirm the effectiveness of our algorithm.
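The Hodge-theoretic structure described in the abstract can be illustrated with a toy edge-level Gaussian. This is a minimal sketch, not the paper's exact parametrization: it assumes edge signals are generated as $x_1 = B_1^\top x_0 + B_2 x_2 + \varepsilon_1$, with $B_1$ the node-to-edge incidence matrix, $B_2$ the edge-to-triangle incidence matrix, and independent Gaussian components at each level, so that the marginal edge covariance is $\Sigma_1 = \sigma_0^2 B_1^\top B_1 + \sigma_2^2 B_2 B_2^\top + \sigma_1^2 I$.

```python
import numpy as np

# Toy complex: 3 nodes {0,1,2}, 3 oriented edges (0,1), (0,2), (1,2), 1 triangle.
# B1: node-to-edge incidence (rows: nodes, cols: edges); B2: edge-to-triangle.
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)
B2 = np.array([[ 1],
               [-1],
               [ 1]], dtype=float)  # boundary of triangle (0,1,2) in edge basis

# Hypothetical variances for node-, edge-, and triangle-level random components.
s0, s1, s2 = 1.0, 0.5, 0.8

# Marginal edge-level covariance when node/triangle variables are latent:
#   Sigma1 = s0^2 B1^T B1 + s2^2 B2 B2^T + s1^2 I
Sigma1 = s0**2 * B1.T @ B1 + s2**2 * B2 @ B2.T + s1**2 * np.eye(3)

# Sample edge signals from the induced edge-level Gaussian.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(3), Sigma1, size=5000)
print(np.round(Sigma1, 2))
print(np.round(np.cov(samples.T), 1))  # empirical covariance, close to Sigma1
```

The $B_1^\top B_1$ and $B_2 B_2^\top$ terms are the lower and upper Hodge Laplacian components, which is what ties the conditional dependence structure of the edge variables to the topology of the complex.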
Related papers
- When does Gaussian equivalence fail and how to fix it: Non-universal behavior of random features with quadratic scaling [15.148577493784051]
Gaussian equivalence theory (GET) states that the behavior of high-dimensional, complex features can be captured by Gaussian surrogates. But numerical experiments show that this equivalence can fail even for simple embeddings under general scaling regimes. We introduce a conditional Gaussian equivalent (CGE) model, which can be viewed as appending a low-dimensional non-Gaussian component to an otherwise high-dimensional Gaussian model.
arXiv Detail & Related papers (2025-12-03T00:23:12Z) - Pivotal CLTs for Pseudolikelihood via Conditional Centering in Dependent Random Fields [1.3875545441867139]
We study fluctuations of conditionally centered statistics of the form $$N^{-1/2}\sum_{i=1}^{N} c_i\bigl(g(\sigma_i)-\mathbb{E}_N[g(\sigma_i)\mid \sigma_j,\, j\neq i]\bigr),$$ where $(\sigma_1,\ldots,\sigma_N)$ is sampled from a dependent random field. We develop a general framework for maximum pseudolikelihood inference in dependent random fields.
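To make the conditionally centered statistic concrete, here is a sketch for one classical dependent random field, the Curie-Weiss model, where the conditional mean has the closed form $\mathbb{E}[\sigma_i \mid \sigma_j, j\neq i] = \tanh(\beta\, m_{-i})$ with $m_{-i}$ the mean of the other spins. The sampler and the choice $g(\sigma)=\sigma$, $c_i=1$ are illustrative assumptions, not the paper's setting.

```python
import numpy as np

# Illustration: conditionally centered statistic for a Curie-Weiss field,
# where E[sigma_i | rest] = tanh(beta * mean of the other spins).
rng = np.random.default_rng(1)
N, beta = 2000, 0.5  # subcritical temperature

# Crude Gibbs sampler for the Curie-Weiss model.
sigma = rng.choice([-1.0, 1.0], size=N)
for _ in range(50):
    for i in range(N):
        m_i = (sigma.sum() - sigma[i]) / (N - 1)
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * m_i))  # P(sigma_i = +1 | rest)
        sigma[i] = 1.0 if rng.random() < p_up else -1.0

# T = N^{-1/2} sum_i c_i (g(sigma_i) - E[g(sigma_i) | sigma_j, j != i]),
# with g the identity and c_i = 1.
c = np.ones(N)
m_minus = (sigma.sum() - sigma) / (N - 1)
T = (c * (sigma - np.tanh(beta * m_minus))).sum() / np.sqrt(N)
print(T)  # an O(1) fluctuation; the paper's CLTs concern its limiting law
```

Conditional centering removes the (possibly non-Gaussian) mean-field contribution, which is why statistics of this form can admit pivotal central limit theorems even when the raw sums do not.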
arXiv Detail & Related papers (2025-10-06T16:06:45Z) - Wasserstein Convergence of Score-based Generative Models under Semiconvexity and Discontinuous Gradients [3.007949058551534]
Score-based Generative Models (SGMs) approximate a data distribution by perturbing it with Gaussian noise and subsequently denoising it via a learned diffusion process. We establish the first non-asymptotic Wasserstein-2 convergence guarantees for SGMs targeting semiconvex potentials with potentially discontinuous gradients.
arXiv Detail & Related papers (2025-05-06T11:17:15Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) framework, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Simplex Random Features [53.97976744884616]
We present Simplex Random Features (SimRFs), a new random feature (RF) mechanism for unbiased approximation of the softmax and Gaussian kernels.
We prove that SimRFs provide the smallest possible mean square error (MSE) on unbiased estimates of these kernels.
We show consistent gains provided by SimRFs in settings including pointwise kernel estimation, nonparametric classification and scalable Transformers.
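For context on what a random feature (RF) mechanism is, here is the classic baseline that SimRFs improve upon: random Fourier features for the Gaussian kernel. This sketch shows only the generic unbiased RF estimator, not the SimRF construction itself (which couples the random directions to reduce MSE).

```python
import numpy as np

# Baseline random-feature estimator: random Fourier features for the
# Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2) ~ phi(x) @ phi(y).
rng = np.random.default_rng(0)
d, D = 4, 4096  # input dimension, number of random features

W = rng.standard_normal((D, d))        # i.i.d. frequencies ~ N(0, I)
b = rng.uniform(0, 2 * np.pi, size=D)  # random phases

def phi(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - y) ** 2) / 2.0)
approx = phi(x) @ phi(y)
print(exact, approx)  # close for large D; the gap shrinks as O(1/sqrt(D))
```

SimRFs keep the estimator unbiased but change the joint geometry of the rows of `W`, which is where the claimed minimal-MSE property comes from.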
arXiv Detail & Related papers (2023-01-31T18:53:39Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Gaussian Process Latent Class Choice Models [7.992550355579791]
We present a non-parametric class of probabilistic machine learning within discrete choice models (DCMs).
The proposed model would assign individuals probabilistically to behaviorally homogeneous clusters (latent classes) using GPs.
The model is tested on two different mode choice applications and compared against different LCCM benchmarks.
arXiv Detail & Related papers (2021-01-28T19:56:42Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Generalized Sliced Distances for Probability Distributions [47.543990188697734]
We introduce a broad family of probability metrics, coined Generalized Sliced Probability Metrics (GSPMs).
GSPMs are rooted in the generalized Radon transform and come with a unique geometric interpretation.
We consider GSPM-based gradient flows for generative modeling applications and show that under mild assumptions, the gradient flow converges to the global optimum.
arXiv Detail & Related papers (2020-02-28T04:18:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality or accuracy of the information above and is not responsible for any consequences of its use.