Expanding Expressiveness of Diffusion Models with Limited Data via Self-Distillation based Fine-Tuning
- URL: http://arxiv.org/abs/2311.01018v1
- Date: Thu, 2 Nov 2023 06:24:06 GMT
- Title: Expanding Expressiveness of Diffusion Models with Limited Data via Self-Distillation based Fine-Tuning
- Authors: Jiwan Hur, Jaehyun Choi, Gyojin Han, Dong-Jae Lee, and Junmo Kim
- Abstract summary: Training diffusion models on limited datasets poses challenges in terms of limited generation capacity and expressiveness.
We propose Self-Distillation for Fine-Tuning diffusion models (SDFT) to address these challenges.
- Score: 24.791783885165923
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Training diffusion models on limited datasets poses challenges in terms of limited generation capacity and expressiveness, leading to unsatisfactory results in various downstream tasks that utilize pretrained diffusion models, such as domain translation and text-guided image manipulation. In this paper, we propose Self-Distillation for Fine-Tuning diffusion models (SDFT), a methodology to address these challenges by leveraging diverse features from diffusion models pretrained on large source datasets. SDFT distills more general features (shape, colors, etc.) and less domain-specific features (texture, fine details, etc.) from the source model, allowing successful knowledge transfer without disturbing the training process on target datasets. The proposed method is not constrained by the specific architecture of the model and can therefore be applied generally to existing frameworks. Experimental results demonstrate that SDFT enhances the expressiveness of diffusion models trained on limited datasets, resulting in improved generation capabilities across various downstream tasks.
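The abstract describes combining the usual denoising objective on the target data with a distillation term against a frozen source model, distilling global structure more than domain-specific detail. Below is a minimal PyTorch-style sketch of that idea; the linear noise schedule, the timestep-based weighting, and the `distill_weight` hyperparameter are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def sdft_style_step(student, source, x0, optimizer,
                    num_timesteps=1000, distill_weight=0.5):
    """One fine-tuning step: standard denoising loss on the target batch
    plus a distillation loss against a frozen source model.

    Sketch only. `student` and `source` are noise-prediction networks
    taking (x_t, t); `source` is pretrained on a large source dataset
    and kept frozen. Schedule and weighting are assumptions.
    """
    b = x0.shape[0]
    t = torch.randint(0, num_timesteps, (b,), device=x0.device)
    noise = torch.randn_like(x0)

    # Toy linear schedule for alpha_bar; real code would reuse the
    # schedule of the pretrained model.
    alpha_bar = (1.0 - (t.float() + 1) / num_timesteps).view(b, 1, 1, 1)
    x_t = alpha_bar.sqrt() * x0 + (1 - alpha_bar).sqrt() * noise

    pred = student(x_t, t)                 # student's noise prediction
    loss_target = F.mse_loss(pred, noise)  # standard DDPM objective

    with torch.no_grad():
        teacher_pred = source(x_t, t)      # frozen source model

    # Heuristic: emphasize distillation at high-noise timesteps, where
    # predictions carry global structure (shape, color) rather than
    # domain-specific texture and fine detail. This weighting is an
    # assumption made for illustration.
    w = (t.float() / num_timesteps).view(b, 1, 1, 1)
    loss_distill = (w * (pred - teacher_pred) ** 2).mean()

    loss = loss_target + distill_weight * loss_distill
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the source model is only queried in a frozen forward pass, a term of this shape is agnostic to the backbone architecture, consistent with the abstract's claim that the method can be adopted by existing frameworks.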
Related papers
- Constrained Diffusion Models via Dual Training [80.03953599062365]
We develop constrained diffusion models whose target distributions are informed by requirements.
We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and the constraints.
arXiv Detail & Related papers (2024-08-27T14:25:42Z)
- Learning Differentially Private Diffusion Models via Stochastic Adversarial Distillation [20.62325580203137]
We introduce DP-SAD, which trains a private diffusion model via an adversarial distillation method (see the sketch after this list).
For better generation quality, we introduce a discriminator to distinguish whether an image is from the teacher or the student.
arXiv Detail & Related papers (2024-08-27T02:29:29Z)
- Model-Based Diffusion for Trajectory Optimization [8.943418808959494]
We introduce Model-Based Diffusion (MBD), an optimization approach using the diffusion process to solve trajectory optimization (TO) problems without data.
Although MBD does not require external data, it can be naturally integrated with data of diverse qualities to steer the diffusion process.
MBD outperforms state-of-the-art reinforcement learning and sampling-based TO methods in challenging contact-rich tasks.
arXiv Detail & Related papers (2024-05-28T22:14:25Z)
- Neural Flow Diffusion Models: Learnable Forward Process for Improved Diffusion Modelling [2.1779479916071067]
We introduce Neural Flow Diffusion Models (NFDM), a novel framework that enhances diffusion models by supporting a broader range of forward processes.
We also propose a novel parameterization technique for learning the forward process.
Results underscore NFDM's versatility and its potential for a wide range of applications.
arXiv Detail & Related papers (2024-04-19T15:10:54Z)
- Distribution-Aware Data Expansion with Diffusion Models [55.979857976023695]
We propose DistDiff, a training-free data expansion framework based on a distribution-aware diffusion model.
DistDiff consistently enhances accuracy across a diverse range of datasets compared to models trained solely on original data.
arXiv Detail & Related papers (2024-03-11T14:07:53Z)
- MG-TSD: Multi-Granularity Time Series Diffusion Models with Guided Learning Process [26.661721555671626]
We introduce a novel Multi-Granularity Time Series Diffusion (MG-TSD) model, which achieves state-of-the-art predictive performance.
Our approach does not rely on additional external data, making it versatile and applicable across various domains.
arXiv Detail & Related papers (2024-03-09T01:15:03Z)
- Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z)
- Self-Play Fine-Tuning of Diffusion Models for Text-to-Image Generation [59.184980778643464]
Fine-tuning diffusion models remains an underexplored frontier in generative artificial intelligence (GenAI).
In this paper, we introduce an innovative technique called self-play fine-tuning for diffusion models (SPIN-Diffusion).
Our approach offers an alternative to conventional supervised fine-tuning and RL strategies, significantly improving both model performance and alignment.
arXiv Detail & Related papers (2024-02-15T18:59:18Z)
- Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models [77.83923746319498]
We propose a framework called Diff-Instruct to instruct the training of arbitrary generative models.
We show that Diff-Instruct results in state-of-the-art single-step diffusion-based models.
Experiments on refining GAN models show that Diff-Instruct can consistently improve the pre-trained generators of GAN models.
arXiv Detail & Related papers (2023-05-29T04:22:57Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
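The DP-SAD entry above mentions a discriminator that separates teacher images from student images; here is the minimal PyTorch-style sketch of that adversarial distillation loop referenced there. The sampling interfaces and the omission of the differential-privacy machinery (per-sample gradient clipping and noise) are assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def adversarial_distillation_step(student_sample, teacher_sample,
                                  discriminator, opt_student, opt_disc,
                                  batch_size=16):
    """One teacher-vs-student adversarial step, as an illustrative
    sketch. `student_sample`/`teacher_sample` are assumed callables
    that draw a batch of images from each generator; DP noise is
    deliberately omitted for brevity.
    """
    x_teacher = teacher_sample(batch_size).detach()  # frozen teacher
    x_student = student_sample(batch_size)

    # Discriminator learns to label teacher images 1, student images 0.
    d_teacher = discriminator(x_teacher)
    d_student = discriminator(x_student.detach())
    loss_disc = (
        F.binary_cross_entropy_with_logits(d_teacher, torch.ones_like(d_teacher))
        + F.binary_cross_entropy_with_logits(d_student, torch.zeros_like(d_student))
    )
    opt_disc.zero_grad()
    loss_disc.backward()
    opt_disc.step()

    # Student is updated to make its images indistinguishable from the
    # teacher's; in private training this gradient would additionally
    # be clipped and noised.
    d_student = discriminator(x_student)
    loss_student = F.binary_cross_entropy_with_logits(
        d_student, torch.ones_like(d_student))
    opt_student.zero_grad()
    loss_student.backward()
    opt_student.step()
    return loss_disc.item(), loss_student.item()
```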
This list is automatically generated from the titles and abstracts of the papers on this site.