DiffPattern-Flex: Efficient Layout Pattern Generation via Discrete Diffusion
- URL: http://arxiv.org/abs/2505.04173v1
- Date: Wed, 07 May 2025 07:04:11 GMT
- Title: DiffPattern-Flex: Efficient Layout Pattern Generation via Discrete Diffusion
- Authors: Zixiao Wang, Wenqian Zhao, Yunheng Shen, Yang Bai, Guojin Chen, Farzan Farnia, Bei Yu
- Abstract summary: We present DiffPattern-Flex, a novel approach to generate reliable layout patterns efficiently. DiffPattern-Flex incorporates a new method for generating diverse topologies using a discrete diffusion model. Fast sampling and efficient legalization technologies are employed to accelerate the generation process.
- Score: 19.004533761566197
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advancements in layout pattern generation have been dominated by deep generative models. However, relying solely on neural networks for legality guarantees raises concerns in many practical applications. In this paper, we present DiffPattern-Flex, a novel approach designed to generate reliable layout patterns efficiently. DiffPattern-Flex incorporates a new method for generating diverse topologies using a discrete diffusion model while maintaining a lossless and compute-efficient layout representation. To ensure legal pattern generation, we employ an optimization-based, white-box pattern assessment process based on specific design rules. Furthermore, fast sampling and efficient legalization technologies are employed to accelerate the generation process. Experimental results across various benchmarks demonstrate that DiffPattern-Flex significantly outperforms existing methods and excels at producing reliable layout patterns.
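The abstract describes a two-stage pipeline: a discrete diffusion model proposes layout topologies over a compact representation, and an optimization-based, white-box check against design rules legalizes them. The snippet below is a minimal sketch of that flow under stated assumptions: a binary topology grid, a stub in place of the trained denoiser, a handful of reverse steps standing in for the paper's fast-sampling scheme, and a single toy minimum-width rule rather than a real rule deck.

```python
# Minimal sketch of the pipeline described in the abstract: a discrete-diffusion
# reverse process proposes a binary layout topology, and a white-box design-rule
# check screens it. The denoiser stub, grid size, step count, and the single
# minimum-width rule are illustrative assumptions, not the paper's actual model
# or rule deck.
import numpy as np

NUM_CLASSES = 2   # 0 = empty, 1 = metal (assumed binary topology grid)
GRID = 32         # assumed token-grid resolution
STEPS = 8         # a few reverse steps, standing in for the paper's fast sampling
rng = np.random.default_rng(0)


def denoiser_logits(x_t, t):
    """Stand-in for the trained discrete-diffusion network.

    A real model would condition on the noisy grid x_t and timestep t; this stub
    just biases cells toward vertical stripes so the demo yields something
    structured."""
    stripe = (np.arange(GRID) % 4) < 2                       # which columns are metal
    bias = np.tile(np.where(stripe, 2.0, -2.0), (GRID, 1))   # (GRID, GRID)
    logits = np.stack([-bias, bias], axis=-1)                 # (GRID, GRID, NUM_CLASSES)
    return logits + 0.1 * rng.standard_normal(logits.shape)


def reverse_sample():
    """Draw one topology by iteratively re-sampling every cell's class."""
    x = rng.integers(0, NUM_CLASSES, size=(GRID, GRID))  # start from uniform noise
    for t in reversed(range(STEPS)):
        logits = denoiser_logits(x, t)
        probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
        probs /= probs.sum(axis=-1, keepdims=True)
        # Gumbel-max trick: per-cell categorical sample from the predicted distribution.
        x = (np.log(probs) + rng.gumbel(size=probs.shape)).argmax(axis=-1)
    return x


def violates_min_width(x, min_run=2):
    """Toy white-box rule: every horizontal metal run must span >= min_run cells."""
    for row in x:
        run = 0
        for v in list(row) + [0]:   # trailing 0 flushes the final run
            if v == 1:
                run += 1
            else:
                if 0 < run < min_run:
                    return True
                run = 0
    return False


pattern = reverse_sample()
print("topology legal under the toy rule:", not violates_min_width(pattern))
```

In the paper's pipeline the assessment is optimization-based and illegal patterns feed a legalization step rather than being discarded; the accept/reject check above only illustrates where a white-box rule sits in the loop.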
Related papers
- Reward-Guided Iterative Refinement in Diffusion Models at Test-Time with Applications to Protein and DNA Design [87.58981407469977]
We propose a novel framework for inference-time reward optimization with diffusion models inspired by evolutionary algorithms. Our approach employs an iterative refinement process consisting of two steps in each iteration: noising and reward-guided denoising.
arXiv Detail & Related papers (2025-02-20T17:48:45Z)
- Structured Pattern Expansion with Diffusion Models [6.726377308248659]
Recent advances in diffusion models have significantly improved the synthesis of materials, textures, and 3D shapes.
In this paper, we address the synthesis of structured, stationary patterns, where diffusion models are generally less reliable and, more importantly, less controllable.
The proposed method enables users to exercise direct control over the synthesis by expanding a partially hand-drawn pattern into a larger design while preserving the structure and details of the input.
arXiv Detail & Related papers (2024-11-12T18:39:23Z)
- DeFoG: Discrete Flow Matching for Graph Generation [45.037260759871124]
We introduce DeFoG, a graph generative framework that disentangles sampling from training. We propose novel sampling methods that significantly enhance performance and reduce the required number of refinement steps.
arXiv Detail & Related papers (2024-10-05T18:52:54Z)
- Variational Search Distributions [16.609027794680213]
We develop VSD, a method for conditioning a generative model of discrete, variational designs on a rare desired class. We empirically demonstrate that VSD can outperform existing baseline methods on a set of real sequence-design problems.
arXiv Detail & Related papers (2024-09-10T01:33:31Z)
- PatternPaint: Practical Layout Pattern Generation Using Diffusion-Based Inpainting [4.9540362281086265]
PatternPaint is a diffusion-based framework capable of generating legal patterns with limited design-rule-compliant training samples. Our model is the only one that can generate legal patterns in complex 2D metal interconnect design rule settings. As a result, we demonstrate a production-ready approach for layout pattern generation in developing new technology nodes.
arXiv Detail & Related papers (2024-09-02T16:02:26Z)
- Derivative-Free Guidance in Continuous and Discrete Diffusion Models with Soft Value-Based Decoding [84.3224556294803]
Diffusion models excel at capturing the natural design spaces of images, molecules, DNA, RNA, and protein sequences.
We aim to optimize downstream reward functions while preserving the naturalness of these design spaces.
Our algorithm integrates soft value functions, which look ahead to how intermediate noisy states lead to high rewards in the future.
arXiv Detail & Related papers (2024-08-15T16:47:59Z)
- Bridging Model-Based Optimization and Generative Modeling via Conservative Fine-Tuning of Diffusion Models [54.132297393662654]
We introduce a hybrid method that fine-tunes cutting-edge diffusion models by optimizing reward models through RL.
We demonstrate the capability of our approach to outperform the best designs in offline data, leveraging the extrapolation capabilities of reward models.
arXiv Detail & Related papers (2024-05-30T03:57:29Z)
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- DiffPattern: Layout Pattern Generation via Discrete Diffusion [16.148506119712735]
We propose DiffPattern to generate reliable layout patterns.
Our experiments on several benchmark settings show that DiffPattern significantly outperforms existing baselines.
arXiv Detail & Related papers (2023-03-23T06:16:14Z)
- Hierarchically branched diffusion models leverage dataset structure for class-conditional generation [0.6800113478497425]
Branched diffusion models rely on the same diffusion process as traditional models, but learn reverse diffusion separately for each branch of a hierarchy.
We extensively evaluate branched diffusion models on several benchmark and large real-world scientific datasets spanning many data modalities.
arXiv Detail & Related papers (2022-12-21T05:27:23Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
- Deblurring via Stochastic Refinement [85.42730934561101]
We present an alternative framework for blind deblurring based on conditional diffusion models.
Our method is competitive in terms of distortion metrics such as PSNR.
arXiv Detail & Related papers (2021-12-05T04:36:09Z)