Diffusion models for missing value imputation in tabular data
- URL: http://arxiv.org/abs/2210.17128v1
- Date: Mon, 31 Oct 2022 08:13:26 GMT
- Title: Diffusion models for missing value imputation in tabular data
- Authors: Shuhan Zheng, Nontawat Charoenphakdee
- Abstract summary: Missing value imputation in machine learning is the task of accurately estimating the missing values in a dataset using the available information.
We propose a diffusion model approach called "Conditional Score-based Diffusion Models for Tabular data" (CSDI_T).
To effectively handle categorical variables and numerical variables simultaneously, we investigate three techniques: one-hot encoding, analog bits encoding, and feature tokenization.
- Score: 10.599563005836066
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Missing value imputation in machine learning is the task of estimating the
missing values in the dataset accurately using available information. In this
task, several deep generative modeling methods have been proposed and
demonstrated their usefulness, e.g., generative adversarial imputation
networks. Recently, diffusion models have gained popularity because of their
effectiveness in generative modeling tasks for images, text, audio, etc. To
our knowledge, little attention has been paid to investigating the
effectiveness of diffusion models for missing value imputation in tabular data.
Based on the recent development of diffusion models for time-series data
imputation, we propose a diffusion model approach called "Conditional
Score-based Diffusion Models for Tabular data" (CSDI_T). To effectively handle
categorical variables and numerical variables simultaneously, we investigate
three techniques: one-hot encoding, analog bits encoding, and feature
tokenization. Experimental results on benchmark datasets demonstrated the
effectiveness of CSDI_T compared with well-known existing methods, and also
emphasized the importance of the categorical embedding techniques.
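To make the three categorical-encoding options concrete, here is a minimal sketch (not the authors' code) of how each technique could map a category index to a vector; the class count, embedding size, and random table below are illustrative assumptions.

```python
import numpy as np

def one_hot(idx, num_classes):
    """Category index -> one-hot vector of length num_classes."""
    v = np.zeros(num_classes)
    v[idx] = 1.0
    return v

def analog_bits(idx, num_classes):
    """Category index -> binary code mapped to {-1, +1}, so the category
    behaves like a numerical feature under Gaussian diffusion noise."""
    n_bits = max(1, int(np.ceil(np.log2(num_classes))))
    bits = [(idx >> b) & 1 for b in reversed(range(n_bits))]
    return np.asarray(bits, dtype=float) * 2.0 - 1.0

def feature_tokenize(idx, embedding_table):
    """Category index -> learned embedding; in practice the table would be
    trained jointly with the diffusion model (random here)."""
    return embedding_table[idx]

num_classes, emb_dim = 5, 4
rng = np.random.default_rng(0)
table = rng.normal(size=(num_classes, emb_dim))  # stand-in for learned weights

print(one_hot(3, num_classes))      # [0. 0. 0. 1. 0.]
print(analog_bits(3, num_classes))  # [-1.  1.  1.]
print(feature_tokenize(3, table))   # a 4-dimensional embedding vector
```

One-hot grows linearly with the number of classes, analog bits grows logarithmically, and feature tokenization decouples the representation width from the class count at the cost of learned parameters.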
Related papers
- Diffusion Attribution Score: Evaluating Training Data Influence in Diffusion Model [22.39558434131574]
Existing data attribution methods for diffusion models typically quantify the contribution of a training sample.
We argue that directly using the diffusion loss cannot represent such a contribution accurately, because of how the diffusion loss is calculated.
We instead compare predicted distributions directly, using an attribution score to analyse training-sample importance.
arXiv Detail & Related papers (2024-10-24T10:58:17Z)
- Influence Functions for Scalable Data Attribution in Diffusion Models [52.92223039302037]
Diffusion models have led to significant advancements in generative modelling.
Yet their widespread adoption poses challenges regarding data attribution and interpretability.
In this paper, we aim to help address such challenges by developing an influence functions framework.
arXiv Detail & Related papers (2024-10-17T17:59:02Z)
- Self-Supervision Improves Diffusion Models for Tabular Data Imputation [20.871219616589986]
This paper introduces an advanced diffusion model named Self-supervised imputation Diffusion Model (SimpDM for brevity).
To mitigate sensitivity to noise, we introduce a self-supervised alignment mechanism that aims to regularize the model, ensuring consistent and stable imputation predictions.
We also introduce a carefully devised state-dependent data augmentation strategy within SimpDM, enhancing the robustness of the diffusion model when dealing with limited data.
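As a rough illustration of the alignment mechanism, the sketch below penalizes disagreement between imputations produced from two independent noise perturbations of the same row; the toy model and masking scheme are assumptions for illustration, not the SimpDM implementation.

```python
import torch
import torch.nn as nn

class TinyImputer(nn.Module):
    """Stand-in denoiser conditioned on the timestep (illustrative only)."""
    def __init__(self, d):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d + 1, 64), nn.ReLU(), nn.Linear(64, d))
    def forward(self, x, t):
        return self.net(torch.cat([x, t[:, None].float()], dim=-1))

def alignment_loss(model, x_obs, mask, t):
    """Consistency penalty: two independent noise draws on the missing
    entries (mask == 1 marks observed) should yield similar imputations."""
    noisy_a = x_obs + torch.randn_like(x_obs) * (1 - mask)
    noisy_b = x_obs + torch.randn_like(x_obs) * (1 - mask)
    pred_a, pred_b = model(noisy_a, t), model(noisy_b, t)
    missing = 1 - mask
    return ((pred_a - pred_b) ** 2 * missing).sum() / missing.sum().clamp(min=1.0)

d = 8
model = TinyImputer(d)
x = torch.randn(32, d)
mask = (torch.rand(32, d) > 0.3).float()  # ~70% of entries observed
t = torch.randint(0, 100, (32,))
print(alignment_loss(model, x * mask, mask, t))
```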
arXiv Detail & Related papers (2024-07-25T13:06:30Z)
- DetDiffusion: Synergizing Generative and Perceptive Models for Enhanced Data Generation and Perception [78.26734070960886]
Current perceptive models heavily depend on resource-intensive datasets.
We introduce a perception-aware loss (P.A. loss) via segmentation, improving both quality and controllability.
Our method customizes data augmentation by extracting and utilizing a perception-aware attribute (P.A. Attr) during generation.
arXiv Detail & Related papers (2024-03-20T04:58:03Z)
- Data Attribution for Diffusion Models: Timestep-induced Bias in Influence Estimation [53.27596811146316]
Diffusion models operate over a sequence of timesteps, rather than the instantaneous input-output relationships of previous settings.
We present Diffusion-TracIn, which incorporates these temporal dynamics, and observe that samples' loss gradient norms are highly dependent on the timestep.
We introduce Diffusion-ReTrac as a re-normalized adaptation that enables the retrieval of training samples more targeted to the test sample of interest.
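A toy numerical illustration of that re-normalization idea (not the paper's estimator): TracIn-style influence is a dot product of loss gradients, so a large-norm gradient, e.g. from a low-noise timestep, dominates unless its norm is divided out.

```python
import torch

def tracin_influence(grad_train, grad_test):
    """TracIn-style influence: dot product of flattened loss gradients."""
    return torch.dot(grad_train, grad_test)

def renormalized_influence(grad_train, grad_test, eps=1e-8):
    """Dividing out the training gradient's norm keeps timesteps with
    large loss-gradient norms from dominating the retrieval."""
    return torch.dot(grad_train, grad_test) / (grad_train.norm() + eps)

g_large = torch.randn(1000) * 50.0  # e.g. a gradient from a low-noise timestep
g_small = torch.randn(1000)
g_test = torch.randn(1000)
print(tracin_influence(g_large, g_test), renormalized_influence(g_large, g_test))
print(tracin_influence(g_small, g_test), renormalized_influence(g_small, g_test))
```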
arXiv Detail & Related papers (2024-01-17T07:58:18Z)
- Diffusion-Model-Assisted Supervised Learning of Generative Models for Density Estimation [10.793646707711442]
We present a framework for training generative models for density estimation.
We use the score-based diffusion model to generate labeled data.
Once the labeled data are generated, we can train a simple fully connected neural network to learn the generative model in a supervised manner.
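A minimal sketch of that two-stage recipe, with a placeholder standing in for the pretrained score-based sampler; every name and dimension below is an assumption for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn

# Stage 1: use a (pretrained) score-based sampler to create labeled pairs
# (z, x). `diffusion_sample` is a placeholder for running reverse diffusion.
W = torch.randn(4, 2)
def diffusion_sample(z):
    return torch.tanh(z @ W)  # stands in for the real reverse-diffusion sampler

z = torch.randn(1024, 4)         # latent noise ("inputs")
with torch.no_grad():
    x = diffusion_sample(z)      # generated samples ("labels")

# Stage 2: fit a simple fully connected network z -> x by regression.
net = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = ((net(z) - x) ** 2).mean()
    loss.backward()
    opt.step()
print(loss.item())
```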
arXiv Detail & Related papers (2023-10-22T23:56:19Z)
- Deep Networks as Denoising Algorithms: Sample-Efficient Learning of Diffusion Models in High-Dimensional Graphical Models [22.353510613540564]
We investigate the approximation efficiency of score functions by deep neural networks in generative modeling.
We observe that score functions can often be well approximated in graphical models through variational inference denoising algorithms.
We provide an efficient sample complexity bound for diffusion-based generative modeling when the score function is learned by deep neural networks.
arXiv Detail & Related papers (2023-09-20T15:51:10Z)
- MissDiff: Training Diffusion Models on Tabular Data with Missing Values [29.894691645801597]
This work presents a unified and principled diffusion-based framework for learning from data with missing values.
We first observe that the widely adopted "impute-then-generate" pipeline may lead to a biased learning objective.
We prove that the proposed method is consistent in learning the score of data distributions, and that the proposed training objective serves as an upper bound on the negative likelihood in certain cases.
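As a hedged sketch of the masking idea, a DDPM-style denoising loss can be restricted to observed entries so that no imputed values enter the objective; this illustrates the principle but is not the paper's exact training objective.

```python
import torch

def masked_denoising_loss(eps_model, x0, obs_mask, alpha_bar_t, t):
    """Epsilon-prediction loss computed only where obs_mask == 1; missing
    entries are zero-filled on input but excluded from the objective."""
    eps = torch.randn_like(x0)
    x_t = alpha_bar_t.sqrt() * (x0 * obs_mask) + (1 - alpha_bar_t).sqrt() * eps
    pred = eps_model(x_t, t)
    return ((pred - eps) ** 2 * obs_mask).sum() / obs_mask.sum().clamp(min=1.0)

d = 6
net = torch.nn.Sequential(torch.nn.Linear(d + 1, 32), torch.nn.ReLU(),
                          torch.nn.Linear(32, d))
eps_model = lambda x, t: net(torch.cat([x, t[:, None].float()], dim=-1))
x0 = torch.randn(16, d)
obs = (torch.rand(16, d) > 0.25).float()  # ~75% of entries observed
t = torch.randint(0, 1000, (16,))
alpha_bar = torch.rand(16, 1)             # stand-in for the real noise schedule
print(masked_denoising_loss(eps_model, x0, obs, alpha_bar, t))
```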
arXiv Detail & Related papers (2023-07-02T03:49:47Z)
- Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z)
- How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
- CSDI: Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation [107.63407690972139]
Conditional Score-based Diffusion models for Imputation (CSDI) is a novel time series imputation method that utilizes score-based diffusion models conditioned on observed data.
CSDI improves by 40-70% over existing probabilistic imputation methods on popular performance metrics.
In addition, CSDI reduces the error by 5-20% compared to state-of-the-art deterministic imputation methods.
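A minimal sketch of CSDI-style self-supervised conditional training on one batch: a random part of the observed entries is hidden as the imputation target, only that part is diffused, and the model predicts the injected noise given the remaining clean values as the condition. The toy network and masking ratio are assumptions, not the CSDI code.

```python
import torch
import torch.nn as nn

d = 8
net = nn.Sequential(nn.Linear(2 * d + 1, 64), nn.ReLU(), nn.Linear(64, d))

def csdi_step(x, obs_mask, alpha_bar_t, t):
    # Hide a random half of the observed entries as imputation targets.
    target_mask = obs_mask * (torch.rand_like(x) < 0.5).float()
    cond_mask = obs_mask - target_mask
    eps = torch.randn_like(x)
    # Diffuse only the target part; truly missing entries stay zero-filled.
    x_t = alpha_bar_t.sqrt() * (x * target_mask) + (1 - alpha_bar_t).sqrt() * eps
    # Condition on the clean observed part, predict the injected noise.
    inp = torch.cat([x * cond_mask, x_t * target_mask, t[:, None].float()], dim=-1)
    pred = net(inp)
    return ((pred - eps) ** 2 * target_mask).sum() / target_mask.sum().clamp(min=1.0)

x = torch.randn(32, d)                   # synthetic "table" for the demo
obs = (torch.rand(32, d) > 0.2).float()  # ~80% of entries observed
t = torch.randint(0, 50, (32,))
loss = csdi_step(x, obs, torch.rand(32, 1), t)
loss.backward()
print(loss.item())
```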
arXiv Detail & Related papers (2021-07-07T22:20:24Z)