Copula Density Neural Estimation
- URL: http://arxiv.org/abs/2211.15353v3
- Date: Tue, 08 Jul 2025 08:43:36 GMT
- Title: Copula Density Neural Estimation
- Authors: Nunzio A. Letizia, Nicola Novello, Andrea M. Tonello
- Abstract summary: We focus on the problem of estimating the copula density associated with any observed data. We use a neural network-based method referred to as copula density neural estimation (CODINE). Results show that the novel learning approach is capable of modeling complex distributions and can be applied for mutual information estimation and data generation.
- Score: 6.43826005042477
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Probability density estimation from observed data constitutes a central task in statistics. In this brief, we focus on the problem of estimating the copula density associated with any observed data, as it fully describes the dependence between random variables. We separate the univariate marginal distributions from the joint dependence structure in the data, the copula itself, and we model the latter with a neural network-based method referred to as copula density neural estimation (CODINE). Results show that the novel learning approach is capable of modeling complex distributions and can be applied for mutual information estimation and data generation.
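The pipeline described in the abstract has two separable steps: map every variable to the unit interval through its (empirical) marginal CDF, so that only the dependence structure survives, and then fit a neural estimate of the copula density on the resulting pseudo-observations. The sketch below illustrates that pipeline with a generic classification-based density-ratio objective against independent uniforms; it is a minimal stand-in rather than the exact CODINE loss, and every name and hyperparameter in it is illustrative. Because, for a pair of variables, the mutual information equals the expected log copula density over the pseudo-observations, the same network also yields an MI estimate.

```python
# Minimal sketch (not the authors' code): estimate a bivariate copula density by
# (1) mapping data to pseudo-observations with empirical marginal CDFs and
# (2) training a small network to tell them apart from independent uniforms.
# The classifier's odds ratio pi/(1-pi) then approximates c(u).
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
n, d = 4000, 2
cov = np.array([[1.0, 0.7], [0.7, 1.0]])
x = rng.multivariate_normal(np.zeros(d), cov, size=n)      # observed data, shape (n, d)

# Probability integral transform: per-dimension ranks mapped into (0, 1).
u = (x.argsort(axis=0).argsort(axis=0) + 1) / (n + 1)
u_t = torch.tensor(u, dtype=torch.float32)

net = nn.Sequential(nn.Linear(d, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
labels = torch.cat([torch.ones(n, 1), torch.zeros(n, 1)])
for _ in range(1500):
    ref = torch.rand_like(u_t)                              # independent Uniform(0,1)^d reference
    logits = torch.cat([net(u_t), net(ref)])
    loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
    opt.zero_grad(); loss.backward(); opt.step()

def copula_density(points):
    """Estimated copula density c(u) = exp(logit(u)) at points in (0,1)^d."""
    with torch.no_grad():
        return net(torch.tensor(points, dtype=torch.float32)).exp().squeeze(-1).numpy()

# MI estimate: I = E[log c(U)] over the pseudo-observations (log c = network logit).
with torch.no_grad():
    mi_hat = net(u_t).mean().item()
print(copula_density(np.array([[0.5, 0.5], [0.05, 0.95]])), mi_hat)
```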
Related papers
- Continuous Temporal Learning of Probability Distributions via Neural ODEs with Applications in Continuous Glucose Monitoring Data [0.0]
The goal is to analyze how the distribution of a biomarker, such as glucose, changes over time and how these changes may reflect the progression of chronic diseases like diabetes. We introduce a probabilistic model based on a Gaussian mixture that captures the evolution of a continuous-time process. Our approach combines a non-parametric estimate of the distribution, obtained with Maximum Mean Discrepancy (MMD), and a Neural Ordinary Differential Equation (Neural ODE) that governs the temporal evolution of the mixture weights.
arXiv Detail & Related papers (2025-05-13T15:57:06Z)
- Your copula is a classifier in disguise: classification-based copula density estimation [2.5261465733373965]
We propose reinterpreting copula density estimation as a discriminative task. We derive equivalences between well-known copula classes and classification problems naturally arising in our interpretation. We show our estimator achieves theoretical guarantees akin to maximum likelihood estimation.
arXiv Detail & Related papers (2024-11-05T11:25:34Z)
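One way to read the equivalence (a generic density-ratio identity, not necessarily the paper's exact construction): train a probabilistic classifier to separate copula samples, i.e. pseudo-observations with label Y=1, from independent Uniform(0,1)^d samples with label Y=0, using equal class priors. Since the reference density is identically one on the unit cube, Bayes' rule converts the classifier output directly into the copula density:

```latex
c(\mathbf{u}) \;=\; \frac{p(\mathbf{u}\mid Y=1)}{p(\mathbf{u}\mid Y=0)}
            \;=\; \frac{P(Y=1\mid \mathbf{u})}{1 - P(Y=1\mid \mathbf{u})},
\qquad \mathbf{u}\in(0,1)^d .
```

This is the same identity used in the CODINE sketch above; the paper goes further by matching specific copula families to specific classification problems.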
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Empirical Density Estimation based on Spline Quasi-Interpolation with applications to Copulas clustering modeling [0.0]
Density estimation is a fundamental technique employed in various fields to model and understand the underlying distribution of data.
In this paper, we propose a mono-variate approximation of the density using quasi-interpolation.
The presented algorithm is validated on artificial and real datasets.
arXiv Detail & Related papers (2024-02-18T11:49:38Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computationally heterogeneous data.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
- Squared Neural Families: A New Class of Tractable Density Models [23.337256081314518]
We develop and investigate a new class of probability distributions, which we call a Squared Neural Family (SNEFY).
We show that SNEFYs admit closed form normalising constants in many cases of interest, thereby resulting in flexible yet fully tractable density models.
Their utility is illustrated on a variety of density estimation, conditional density estimation, and density estimation with missing data tasks.
arXiv Detail & Related papers (2023-05-22T23:56:11Z)
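As a rough reading of the construction (the exact parameterisation and the list of tractable activation/base-measure pairs are in the paper), a squared neural family places the squared norm of a shallow network's output on top of a base measure, and tractability comes from the normalising constant admitting a closed form:

```latex
p(\mathbf{x}) \;=\; \frac{\left\| \mathbf{V}\,\sigma(\mathbf{W}\mathbf{x}+\mathbf{b}) \right\|_2^{2}\,\mu(\mathbf{x})}{Z},
\qquad
Z \;=\; \int \left\| \mathbf{V}\,\sigma(\mathbf{W}\mathbf{x}+\mathbf{b}) \right\|_2^{2}\,\mu(\mathbf{x})\,\mathrm{d}\mathbf{x}.
```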
- Copula-Based Density Estimation Models for Multivariate Zero-Inflated Continuous Data [0.0]
We propose two copula-based density estimation models that can cope with multivariate correlation among zero-inflated continuous variables.
In order to overcome the difficulty in the use of copulas due to the tied-data problem in zero-inflated data, we propose a new type of copula, the rectified Gaussian copula.
arXiv Detail & Related papers (2023-04-02T13:43:37Z)
- Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z)
- Learning to Bound Counterfactual Inference in Structural Causal Models from Observational and Randomised Data [64.96984404868411]
We derive a likelihood characterisation for the overall data that leads us to extend a previous EM-based algorithm.
The new algorithm learns to approximate the (unidentifiability) region of model parameters from such mixed data sources.
It delivers interval approximations to counterfactual results, which collapse to points in the identifiable case.
arXiv Detail & Related papers (2022-12-06T12:42:11Z)
- Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z)
- Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
arXiv Detail & Related papers (2021-07-05T18:30:26Z)
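The invariance asserted in the last sentence is just the change-of-variables formula: for an invertible feature map f, the Jacobian factors of the transformed densities cancel in the ratio (notation below is generic, not the paper's):

```latex
\frac{p(\mathbf{x})}{q(\mathbf{x})}
= \frac{\tilde{p}\!\left(f(\mathbf{x})\right)\left|\det J_f(\mathbf{x})\right|}
       {\tilde{q}\!\left(f(\mathbf{x})\right)\left|\det J_f(\mathbf{x})\right|}
= \frac{\tilde{p}\!\left(f(\mathbf{x})\right)}{\tilde{q}\!\left(f(\mathbf{x})\right)},
```

where the tilded quantities denote the densities of f(X) under X ~ p and X ~ q, so a ratio estimated in feature space transfers unchanged to input space.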
- Deep Data Density Estimation through Donsker-Varadhan Representation [5.276937617129594]
We present a simple yet effective method for estimating the data density using a deep neural network and the Donsker-Varadhan variational lower bound on the KL divergence.
We show that the optimal critic function associated with the Donsker-Varadhan representation on the divergence between the data and the uniform distribution can estimate the data density.
arXiv Detail & Related papers (2021-04-14T03:38:32Z)
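For context, the Donsker-Varadhan representation referred to here is the variational form of the KL divergence; taking the reference to be a uniform density u on the data support shows why the optimal critic recovers the data density (a standard identity, written in generic notation):

```latex
D_{\mathrm{KL}}(p\,\|\,u) \;=\; \sup_{T}\; \mathbb{E}_{p}\!\left[T(\mathbf{x})\right] - \log \mathbb{E}_{u}\!\left[e^{T(\mathbf{x})}\right],
\qquad
T^{*}(\mathbf{x}) = \log\frac{p(\mathbf{x})}{u(\mathbf{x})} + \mathrm{const}
\;\;\Rightarrow\;\;
p(\mathbf{x}) = \frac{u(\mathbf{x})\,e^{T^{*}(\mathbf{x})}}{\mathbb{E}_{u}\!\left[e^{T^{*}}\right]}.
```

Parameterising T with a deep network and maximising the bound therefore gives a density estimate, with the denominator approximated by sampling from the uniform reference.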
- Deep Archimedean Copulas [98.96141706464425]
ACNet is a novel differentiable neural network architecture that enforces the structural properties of Archimedean copulas.
We show that ACNet is able to both approximate common Archimedean Copulas and generate new copulas which may provide better fits to data.
arXiv Detail & Related papers (2020-12-05T22:58:37Z)
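For reference, an Archimedean copula is determined by a single generator, which is what makes a neural parameterisation of the generator natural (the generator form below is standard; the specific constraints ACNet imposes on its network are described in the paper):

```latex
C(u_1,\dots,u_d) \;=\; \psi\!\left(\psi^{-1}(u_1) + \cdots + \psi^{-1}(u_d)\right),
```

where ψ: [0, ∞) → [0, 1] is a completely monotone generator with ψ(0) = 1 and ψ(t) → 0 as t → ∞; the Clayton family, for instance, corresponds to ψ(t) = (1 + t)^{-1/θ}.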
- Semi-Structured Deep Piecewise Exponential Models [2.7728956081909346]
We propose a versatile framework for survival analysis that combines advanced concepts from statistics with deep learning.
A proof of concept is provided by using the framework to predict Alzheimer's disease progression.
arXiv Detail & Related papers (2020-11-11T14:41:19Z)
- Neural Approximate Sufficient Statistics for Implicit Models [34.44047460667847]
We frame the task of constructing sufficient statistics as learning mutual information maximizing representations of the data with the help of deep neural networks.
We apply our approach to both traditional approximate Bayesian computation and recent neural likelihood methods, boosting their performance on a range of tasks.
arXiv Detail & Related papers (2020-10-20T07:11:40Z)
- Latent Network Structure Learning from High Dimensional Multivariate Point Processes [5.079425170410857]
We propose a new class of nonstationary Hawkes processes to characterize the complex processes underlying the observed data.
We estimate the latent network structure using an efficient sparse least squares estimation approach.
We demonstrate the efficacy of our proposed method through simulation studies and an application to a neuron spike train data set.
arXiv Detail & Related papers (2020-04-07T17:48:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.