A Survey of Circuit Foundation Model: Foundation AI Models for VLSI Circuit Design and EDA
- URL: http://arxiv.org/abs/2504.03711v1
- Date: Fri, 28 Mar 2025 07:27:27 GMT
- Title: A Survey of Circuit Foundation Model: Foundation AI Models for VLSI Circuit Design and EDA
- Authors: Wenji Fang, Jing Wang, Yao Lu, Shang Liu, Yuchao Wu, Yuzhe Ma, Zhiyao Xie
- Abstract summary: Foundation AI models for circuits have emerged as a new technology trend. New AI models are developed through two stages: 1) self-supervised pre-training on a large amount of unlabeled data to learn intrinsic circuit properties; and 2) efficient fine-tuning for specific downstream applications.
- Score: 13.026106814794696
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Artificial intelligence (AI)-driven electronic design automation (EDA) techniques have been extensively explored for VLSI circuit design applications. Most recently, foundation AI models for circuits have emerged as a new technology trend. Unlike traditional task-specific AI solutions, these new AI models are developed through two stages: 1) self-supervised pre-training on a large amount of unlabeled data to learn intrinsic circuit properties; and 2) efficient fine-tuning for specific downstream applications, such as early-stage design quality evaluation, circuit-related context generation, and functional verification. This new paradigm brings many advantages: model generalization, less reliance on labeled circuit data, efficient adaptation to new tasks, and unprecedented generative capability. In this paper, we propose referring to AI models developed with this new paradigm as circuit foundation models (CFMs). This paper provides a comprehensive survey of the latest progress in circuit foundation models, unprecedentedly covering over 130 relevant works. Over 90% of our introduced works were published in or after 2022, indicating that this emerging research trend has attracted wide attention in a short period. In this survey, we propose to categorize all existing circuit foundation models into two primary types: 1) encoder-based methods performing general circuit representation learning for predictive tasks; and 2) decoder-based methods leveraging large language models (LLMs) for generative tasks. For our introduced works, we cover their input modalities, model architecture, pre-training strategies, domain adaptation techniques, and downstream design applications. In addition, this paper discusses the unique properties of circuits from the data perspective. These circuit properties have motivated many works in this domain and differentiate them from general AI techniques.
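The two-stage paradigm in the abstract is easiest to see in code. Below is a minimal, hedged PyTorch sketch of the workflow the survey describes: masked self-supervised pre-training on unlabeled circuit tokens, then fine-tuning a lightweight head for a downstream predictive task. All names (CircuitEncoder, the vocabulary size, the PPA head) are illustrative assumptions, not details from any surveyed model.

```python
# Sketch of the two-stage CFM paradigm: self-supervised pre-training, then fine-tuning.
import torch
import torch.nn as nn

VOCAB, MASK_ID, DIM = 512, 0, 128

class CircuitEncoder(nn.Module):
    """Illustrative Transformer encoder over tokenized circuits (hypothetical)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, tokens):                    # tokens: (batch, seq)
        return self.encoder(self.embed(tokens))   # (batch, seq, DIM)

encoder = CircuitEncoder()
mlm_head = nn.Linear(DIM, VOCAB)                  # predicts the masked-out tokens

# Stage 1: masked-token pre-training on unlabeled circuits.
opt = torch.optim.Adam(list(encoder.parameters()) + list(mlm_head.parameters()), lr=1e-4)
tokens = torch.randint(1, VOCAB, (8, 64))         # stand-in for tokenized netlists
mask = torch.rand(tokens.shape) < 0.15            # mask 15% of positions
corrupted = tokens.masked_fill(mask, MASK_ID)
logits = mlm_head(encoder(corrupted))
loss = nn.functional.cross_entropy(logits[mask], tokens[mask])
loss.backward(); opt.step()

# Stage 2: fine-tune a lightweight head, e.g. early-stage design quality regression.
ppa_head = nn.Linear(DIM, 1)
pred = ppa_head(encoder(tokens).mean(dim=1))      # pooled circuit embedding -> scalar
```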
Related papers
- GenEDA: Unleashing Generative Reasoning on Netlist via Multimodal Encoder-Decoder Aligned Foundation Model [8.115489346573918]
GenEDA is a framework that aligns circuit encoders with decoders within a shared latent space.
Built on this architecture, GenEDA enables three unprecedented generative reasoning tasks over netlists.
arXiv Detail & Related papers (2025-04-13T08:56:22Z)
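GenEDA's core idea, aligning a circuit encoder's latent space with a decoder's, can be sketched as a learned projector trained with a symmetric contrastive loss over paired (netlist, text) embeddings. This is a generic alignment recipe under assumed dimensions, not GenEDA's actual implementation.

```python
# Sketch: project netlist-encoder embeddings into the decoder's latent space
# and train so paired (netlist, text) embeddings agree. Names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

enc_dim, dec_dim = 128, 256
projector = nn.Sequential(nn.Linear(enc_dim, dec_dim), nn.GELU(),
                          nn.Linear(dec_dim, dec_dim))

def alignment_loss(netlist_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired embeddings."""
    z = F.normalize(projector(netlist_emb), dim=-1)   # (B, dec_dim)
    t = F.normalize(text_emb, dim=-1)                 # (B, dec_dim)
    logits = z @ t.T / temperature
    labels = torch.arange(z.size(0))                  # matched pairs on the diagonal
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

loss = alignment_loss(torch.randn(16, enc_dim), torch.randn(16, dec_dim))
```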
- Architect of the Bits World: Masked Autoregressive Modeling for Circuit Generation Guided by Truth Table [5.300504429005315]
We propose a novel approach integrating conditional generative models with differentiable architecture search (DAS) for circuit generation.
Our approach first introduces CircuitVQ, a circuit tokenizer trained on top of our Circuit AutoEncoder.
We then develop CircuitAR, a masked autoregressive model leveraging CircuitVQ as the tokenizer.
arXiv Detail & Related papers (2025-02-18T11:13:03Z)
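The tokenizer step behind CircuitVQ-style models can be illustrated with a standard vector-quantization bottleneck: continuous circuit features snap to the nearest codebook entry, yielding discrete tokens that a masked autoregressive model such as CircuitAR can predict. The codebook size and straight-through trick below are generic VQ conventions, not details from the paper.

```python
# Sketch of a VQ bottleneck for circuit tokenization (illustrative only).
import torch
import torch.nn as nn

class VQBottleneck(nn.Module):
    def __init__(self, codes=1024, dim=64):
        super().__init__()
        self.codebook = nn.Embedding(codes, dim)

    def forward(self, z):                         # z: (batch, dim) continuous features
        d = torch.cdist(z, self.codebook.weight)  # distance to every codebook entry
        ids = d.argmin(dim=-1)                    # discrete circuit tokens
        q = self.codebook(ids)
        q = z + (q - z).detach()                  # straight-through gradient estimator
        return q, ids

quantized, tokens = VQBottleneck()(torch.randn(8, 64))
```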
- AI-Aided Kalman Filters [65.35350122917914]
The Kalman filter (KF) and its variants are among the most celebrated algorithms in signal processing.
Recent developments illustrate the possibility of fusing deep neural networks (DNNs) with classic Kalman-type filtering.
This article provides a tutorial-style overview of design approaches for incorporating AI to aid KF-type algorithms.
arXiv Detail & Related papers (2024-10-16T06:47:53Z)
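One common way to fuse a DNN with Kalman-type filtering, in the spirit of the work this tutorial surveys, is to keep the classic predict/update structure while a small network emits the gain. The dimensions and gain network below are illustrative assumptions, not a method from the article.

```python
# Toy sketch: classic KF predict/update, but with a learned Kalman gain.
import torch
import torch.nn as nn

nx, ny = 4, 2
F_mat = torch.eye(nx)                    # state-transition model (assumed known)
H = torch.randn(ny, nx)                  # observation model (assumed known)
gain_net = nn.Sequential(nn.Linear(ny, 32), nn.ReLU(), nn.Linear(32, nx * ny))

def step(x, y):
    x_pred = F_mat @ x                        # classic predict step
    innovation = y - H @ x_pred               # measurement residual
    K = gain_net(innovation).view(nx, ny)     # learned gain replaces P H^T S^-1
    return x_pred + K @ innovation            # classic update step

x = step(torch.zeros(nx), torch.randn(ny))
```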
- Vision Foundation Models in Remote Sensing: A Survey [6.036426846159163]
Foundation models are large-scale, pre-trained AI models capable of performing a wide array of tasks with unprecedented accuracy and efficiency.
This survey aims to serve as a resource for researchers and practitioners by providing a panorama of advances and promising pathways for continued development and application of foundation models in remote sensing.
arXiv Detail & Related papers (2024-08-06T22:39:34Z)
- Generative AI Models for Different Steps in Architectural Design: A Literature Review [14.910709576423576]
It is essential to comprehend the principles and advancements of generative AI models and analyze their relevance in architectural applications.
This paper first provides an overview of generative AI technologies, with a focus on denoising diffusion probabilistic models (DDPMs), 3D generative models, and foundation models.
We subdivide the architectural design process into six steps and review related research projects in each step from 2020 to the present.
arXiv Detail & Related papers (2024-03-30T13:25:11Z)
- Circuit as Set of Points [39.14967611962792]
We propose a novel perspective for circuit design by treating circuit components as point clouds.
This approach enables direct feature extraction from raw data without any preprocessing, allows for end-to-end training, and results in high performance.
arXiv Detail & Related papers (2023-10-26T14:22:43Z)
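The "circuit as point cloud" view maps naturally onto a PointNet-style encoder: each placed component becomes a point with coordinates and attributes, a shared MLP encodes the points independently, and a symmetric max-pool gives an order-invariant circuit feature. The 5-feature layout below is an assumption for illustration, not the paper's exact featurization.

```python
# PointNet-style sketch of encoding a circuit as a set of points.
import torch
import torch.nn as nn

point_mlp = nn.Sequential(nn.Linear(5, 64), nn.ReLU(), nn.Linear(64, 128))

def encode_circuit(points):           # points: (num_components, 5) = x, y, w, h, type
    feats = point_mlp(points)         # per-point features, straight from raw data
    return feats.max(dim=0).values    # permutation-invariant circuit embedding

embedding = encode_circuit(torch.rand(1000, 5))   # e.g. 1000 placed cells
```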
- CktGNN: Circuit Graph Neural Network for Electronic Design Automation [67.29634073660239]
This paper presents a Circuit Graph Neural Network (CktGNN) that simultaneously automates circuit topology generation and device sizing.
We introduce Open Circuit Benchmark (OCB), an open-sourced dataset that contains 10K distinct operational amplifiers.
Our work paves the way toward learning-based, open-sourced design automation for analog circuits.
arXiv Detail & Related papers (2023-08-31T02:20:25Z)
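A generic message-passing layer conveys the mechanism CktGNN builds on: device nodes exchange features along circuit edges, supporting both graph-level (topology) and per-node (sizing) predictions. This is a plain GNN layer under assumed shapes, not the paper's exact two-level architecture.

```python
# Sketch of one message-passing round over a circuit graph.
import torch
import torch.nn as nn

class MessagePassing(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.update = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, h, adj):                   # h: (nodes, dim), adj: (nodes, nodes)
        msgs = adj @ h                           # sum messages from neighboring devices
        return self.update(torch.cat([h, msgs], dim=-1))

h = torch.randn(12, 64)                          # e.g. 12 devices in an op-amp
adj = (torch.rand(12, 12) > 0.7).float()         # stand-in connectivity matrix
h = MessagePassing()(h, adj)
sizing_head = nn.Linear(64, 2)                   # per-device sizing prediction (W, L)
w_l = sizing_head(h)
```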
- Universal Domain Adaptation from Foundation Models: A Baseline Study [58.51162198585434]
We conduct empirical studies of state-of-the-art UniDA methods using foundation models.
We introduce CLIP distillation, a parameter-free method specifically designed to distill target knowledge from CLIP models.
Although simple, our method outperforms previous approaches in most benchmark tasks.
arXiv Detail & Related papers (2023-05-18T16:28:29Z)
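The "CLIP distillation" idea can be sketched as zero-shot CLIP predictions on unlabeled target data serving as soft teacher targets, with no learned parameters added for the distillation itself. The random tensors below stand in for CLIP image and text features (obtained in practice from e.g. clip.load("ViT-B/32")); this is a generic reading, not the paper's code.

```python
# Sketch: CLIP zero-shot soft labels as a parameter-free teacher signal.
import torch
import torch.nn.functional as F

def clip_pseudo_labels(image_feats, class_text_feats, temperature=0.01):
    img = F.normalize(image_feats, dim=-1)           # (N, D) target-domain images
    txt = F.normalize(class_text_feats, dim=-1)      # (C, D) class-name prompts
    return (img @ txt.T / temperature).softmax(dim=-1)  # soft labels, (N, C)

def distill_loss(student_logits, teacher_probs):
    return F.kl_div(F.log_softmax(student_logits, dim=-1), teacher_probs,
                    reduction="batchmean")

teacher = clip_pseudo_labels(torch.randn(32, 512), torch.randn(10, 512))
loss = distill_loss(torch.randn(32, 10), teacher)
```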
- FAIR AI Models in High Energy Physics [16.744801048170732]
We propose a practical definition of FAIR principles for AI models in experimental high energy physics.
We describe a template for the application of these principles.
We report on the robustness of this FAIR AI model, its portability across hardware architectures and software frameworks, and its interpretability.
arXiv Detail & Related papers (2022-12-09T19:00:18Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
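The pretraining recipe above reads directly as code: train a GNN to regress output node voltages across many topologies, then freeze the encoder and fit a fresh head for an unseen circuit-level property. The one-layer aggregation and both heads below are illustrative stand-ins, not the paper's architecture.

```python
# Sketch: supervised GNN pretraining on node voltages, then head-swap adaptation.
import torch
import torch.nn as nn

gnn = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 64))  # per-node encoder

def embed(x, adj):                     # x: (nodes, 8) device features
    return gnn(x) + adj @ gnn(x)       # one round of neighbor aggregation

voltage_head = nn.Linear(64, 1)        # pretraining target: output node voltages
x, adj = torch.randn(10, 8), (torch.rand(10, 10) > 0.6).float()
pretrain_loss = nn.functional.mse_loss(voltage_head(embed(x, adj)),
                                       torch.randn(10, 1))  # simulated voltages

# Adaptation: freeze the encoder, fit a new head for a circuit-level property.
for p in gnn.parameters():
    p.requires_grad = False
gain_head = nn.Linear(64, 1)
gain_pred = gain_head(embed(x, adj).mean(dim=0))   # pooled graph embedding
```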
- Data-Driven and SE-assisted AI Model Signal-Awareness Enhancement and Introspection [61.571331422347875]
We propose a data-driven approach to enhance models' signal-awareness.
We combine the SE concept of code complexity with the AI technique of curriculum learning.
We achieve up to 4.8x improvement in model signal awareness.
arXiv Detail & Related papers (2021-11-10T17:58:18Z)
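The complexity-driven curriculum above can be sketched by ordering training samples with a software-engineering complexity proxy and training on progressively harder subsets. The token-count metric and three-stage schedule below are placeholders, not the paper's actual setup.

```python
# Sketch: curriculum learning ordered by a code-complexity proxy.
def complexity(code: str) -> int:
    return len(code.split())           # placeholder for a real SE complexity metric

samples = ["x = 1",
           "if x:\n    y = x + 1",
           "for i in range(10):\n    if i % 2:\n        s += i"]
curriculum = sorted(samples, key=complexity)   # easiest first

stages = 3
for stage in range(1, stages + 1):
    subset = curriculum[: stage * len(curriculum) // stages]
    # train_one_epoch(model, subset)   # hypothetical training call
    print(f"stage {stage}: {len(subset)} samples")
```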
- AutoBERT-Zero: Evolving BERT Backbone from Scratch [94.89102524181986]
We propose an Operation-Priority Neural Architecture Search (OP-NAS) algorithm to automatically search for promising hybrid backbone architectures.
We optimize both the search algorithm and the evaluation of candidate models to boost the efficiency of our proposed OP-NAS.
Experiments show that the searched architecture (named AutoBERT-Zero) significantly outperforms BERT and its variants of different model capacities in various downstream tasks.
arXiv Detail & Related papers (2021-07-15T16:46:01Z)
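An evolutionary loop of the kind OP-NAS extends: sample architectures as operation sequences, keep the fittest, and bias mutations toward operations that have performed well (the "operation priority"). The fitness function and operation set are illustrative placeholders, not the paper's search space.

```python
# Sketch of priority-biased evolutionary architecture search.
import random

OPS = ["self_attn", "conv3", "conv5", "ffn", "identity"]

def fitness(arch):                       # placeholder for proxy training/evaluation
    return -arch.count("identity") + random.random()

priority = {op: 1.0 for op in OPS}       # running preference per operation

population = [[random.choice(OPS) for _ in range(6)] for _ in range(20)]
for generation in range(5):
    population.sort(key=fitness, reverse=True)
    for op in population[0]:             # reward ops appearing in the best arch
        priority[op] += 0.1
    parents = population[:10]
    weights = [priority[op] for op in OPS]
    children = []
    for p in parents:                    # mutate one position, biased by priority
        child = p[:]
        child[random.randrange(len(child))] = random.choices(OPS, weights)[0]
        children.append(child)
    population = parents + children

print("best architecture:", max(population, key=fitness))
```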