LLM-based AI Agent for Sizing of Analog and Mixed Signal Circuit
- URL: http://arxiv.org/abs/2504.11497v1
- Date: Mon, 14 Apr 2025 22:18:16 GMT
- Title: LLM-based AI Agent for Sizing of Analog and Mixed Signal Circuit
- Authors: Chang Liu, Emmanuel A. Olowe, Danial Chitnis
- Abstract summary: Large Language Models (LLMs) have demonstrated significant potential across various fields. In this work, we propose an LLM-based AI agent for AMS circuit design to assist in the sizing process.
- Score: 2.979579757819132
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The design of Analog and Mixed-Signal (AMS) integrated circuits (ICs) often involves significant manual effort, especially during the transistor sizing process. While Machine Learning techniques in Electronic Design Automation (EDA) have shown promise in reducing complexity and minimizing human intervention, they still face challenges such as numerous iterations and a lack of knowledge about AMS circuit design. Recently, Large Language Models (LLMs) have demonstrated significant potential across various fields, showing a certain level of knowledge in circuit design and indicating their potential to automate the transistor sizing process. In this work, we propose an LLM-based AI agent for AMS circuit design to assist in the sizing process. By integrating LLMs with external circuit simulation tools and data analysis functions and employing prompt engineering strategies, the agent successfully optimized multiple circuits to achieve target performance metrics. We evaluated the performance of different LLMs to assess their applicability and optimization effectiveness across seven basic circuits, and selected the best-performing model, Claude 3.5 Sonnet, for further exploration on an operational amplifier with a complementary input stage and a class-AB output stage. This circuit was evaluated against nine performance metrics, and we conducted experiments under three distinct performance requirement groups. A success rate of up to 60% was achieved for reaching the target requirements. Overall, this work demonstrates the potential of LLMs to improve AMS circuit design.
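The sizing loop the abstract describes (the LLM proposes device sizes, an external simulator measures performance, and the measured metrics are fed back through an engineered prompt) can be sketched as below. This is a minimal illustration rather than the paper's actual implementation: `call_llm` and `run_simulation` are hypothetical placeholders for a chat-completion API and a SPICE wrapper, and the target metrics are invented.

```python
import json

# Illustrative "at least" targets for an op-amp; the paper's nine metrics
# and three requirement groups are not reproduced here.
TARGETS = {"gain_db": 60.0, "ugbw_mhz": 10.0, "phase_margin_deg": 60.0}

def run_simulation(sizes: dict) -> dict:
    """Placeholder: write `sizes` into a netlist, run SPICE, parse metrics."""
    raise NotImplementedError("wire this to your simulator")

def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to an LLM and return its text reply."""
    raise NotImplementedError("wire this to your LLM API")

def meets_targets(metrics: dict) -> bool:
    # Simplifying assumption: every target is a lower bound.
    return all(metrics.get(k, float("-inf")) >= v for k, v in TARGETS.items())

def size_circuit(initial_sizes: dict, max_iters: int = 20) -> dict:
    """Iterate: simulate, check targets, ask the LLM for revised sizes."""
    sizes = initial_sizes
    for _ in range(max_iters):
        metrics = run_simulation(sizes)
        if meets_targets(metrics):
            return sizes
        # Prompt-engineering step: feed targets, current sizes, and measured
        # metrics back to the model and request an updated sizing as JSON.
        prompt = (
            "You are an analog designer optimizing an op-amp.\n"
            f"Targets: {json.dumps(TARGETS)}\n"
            f"Current sizes: {json.dumps(sizes)}\n"
            f"Simulated metrics: {json.dumps(metrics)}\n"
            "Return ONLY a JSON object with updated transistor sizes."
        )
        sizes = json.loads(call_llm(prompt))
    return sizes  # best effort after max_iters
```

Requesting structured JSON from the model keeps the parsing step trivial; a production agent would also validate the proposed sizes against device limits before simulating.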
Related papers
- ECM: A Unified Electronic Circuit Model for Explaining the Emergence of In-Context Learning and Chain-of-Thought in Large Language Model [64.22300168242221]
In-Context Learning (ICL) and Chain-of-Thought (CoT) are emerging capabilities in large language models. We propose the Electronic Circuit Model (ECM) to better understand ICL and CoT. We show that ECM effectively predicts and explains LLM performance across a variety of prompting strategies.
arXiv Detail & Related papers (2025-02-05T16:22:33Z)
- Supervised Learning for Analog and RF Circuit Design: Benchmarks and Comparative Insights [10.354863964933019]
This study explores supervised ML-based approaches for designing circuit parameters from performance specifications across various circuit types. Our results show that simpler circuits, such as low-noise amplifiers, achieve exceptional accuracy with mean relative errors as low as 0.3%. For heterogeneous circuits, our approach achieves an 88% reduction in errors with increased training data, with the receiver achieving a mean relative error as low as 0.23%. (A minimal sketch of this spec-to-parameter regression idea appears after this list.)
arXiv Detail & Related papers (2025-01-21T02:48:23Z)
- MALT: Improving Reasoning with Multi-Agent LLM Training [66.9481561915524]
MALT (Multi-Agent LLM Training) is a novel post-training strategy that divides the reasoning process into generation, verification, and refinement steps. On MATH, GSM8K, and CSQA, MALT surpasses the same baseline LLM with relative improvements of 15.66%, 7.42%, and 9.40%, respectively.
arXiv Detail & Related papers (2024-12-02T19:30:36Z)
- AMSnet-KG: A Netlist Dataset for LLM-based AMS Circuit Auto-Design Using Knowledge Graph RAG [15.61553255884534]
Large language models (LLMs) have emerged as powerful tools for Electronic Design Automation (EDA) applications.
This paper introduces AMSnet-KG, a dataset encompassing various AMS circuit schematics and netlists.
We propose an automated AMS circuit generation framework that utilizes the comprehensive knowledge embedded in LLMs.
arXiv Detail & Related papers (2024-11-07T02:49:53Z)
- Logic Synthesis Optimization with Predictive Self-Supervision via Causal Transformers [19.13500546022262]
We introduce LSOformer, a novel approach harnessing autoregressive transformer models and predictive SSL to predict the trajectory of Quality of Results (QoR). LSOformer integrates cross-attention modules to merge insights from circuit graphs and optimization sequences, thereby enhancing prediction accuracy for QoR metrics.
arXiv Detail & Related papers (2024-09-16T18:45:07Z)
- CoMMIT: Coordinated Instruction Tuning for Multimodal Large Language Models [68.64605538559312]
In this paper, we analyze MLLM instruction tuning from both theoretical and empirical perspectives.
Inspired by our findings, we propose a measurement to quantitatively evaluate the learning balance.
In addition, we introduce an auxiliary loss regularization method to promote updating of the generation distribution of MLLMs.
arXiv Detail & Related papers (2024-07-29T23:18:55Z)
- New Solutions on LLM Acceleration, Optimization, and Application [14.995654657013741]
Large Language Models (LLMs) have become extremely potent instruments with exceptional capacities for comprehending and producing human-like text in a range of applications.
However, the increasing size and complexity of LLMs present significant challenges in both training and deployment.
We provide a review of recent advancements and research directions aimed at addressing these challenges.
arXiv Detail & Related papers (2024-06-16T11:56:50Z)
- Efficient Prompting for LLM-based Generative Internet of Things [88.84327500311464]
Large language models (LLMs) have demonstrated remarkable capacities on various tasks, and integrating the capacities of LLMs into the Internet of Things (IoT) applications has drawn much research attention recently.
Due to security concerns, many institutions avoid accessing state-of-the-art commercial LLM services, requiring the deployment and utilization of open-source LLMs in a local network setting.
In this study, we propose an LLM-based Generative IoT (GIoT) system deployed in a local network setting.
arXiv Detail & Related papers (2024-06-14T19:24:00Z)
- Characterization of Large Language Model Development in the Datacenter [55.9909258342639]
Large Language Models (LLMs) have presented impressive performance across several transformative tasks.
However, it is non-trivial to efficiently utilize large-scale cluster resources to develop LLMs.
We present an in-depth characterization study of a six-month LLM development workload trace collected from our GPU datacenter Acme.
arXiv Detail & Related papers (2024-03-12T13:31:14Z)
- LLM4EDA: Emerging Progress in Large Language Models for Electronic Design Automation [74.7163199054881]
Large Language Models (LLMs) have demonstrated their capability in context understanding, logic reasoning and answer generation.
We present a systematic study on the application of LLMs in the EDA field.
We highlight future research directions, focusing on applying LLMs in logic synthesis, physical design, multi-modal feature extraction and alignment of circuits.
arXiv Detail & Related papers (2023-12-28T15:09:14Z)
- Adaptive Planning Search Algorithm for Analog Circuit Verification [53.97809573610992]
We propose a machine learning (ML) approach that uses fewer simulations.
We show that the proposed approach is able to provide OCCs closer to the specifications for all circuits.
arXiv Detail & Related papers (2023-06-23T12:57:46Z)
- Analog/Mixed-Signal Circuit Synthesis Enabled by the Advancements of Circuit Architectures and Machine Learning Algorithms [0.0]
We will focus on using neural-network-based surrogate models to expedite the circuit design parameter search and layout iterations.
Lastly, we will demonstrate the rapid synthesis of several AMS circuit examples from specification to silicon prototype, with significantly reduced human intervention.
arXiv Detail & Related papers (2021-12-15T01:47:08Z)
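Two entries above, the supervised-learning benchmark and the surrogate-model synthesis paper, both train networks that map between performance specifications and circuit parameters. Below is a minimal, self-contained sketch of the inverse-design variant (specs in, parameters out); the data is synthetic stand-in noise and the MLP size is an arbitrary choice, not either paper's setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data; in practice X and y come from a pre-simulated corpus.
# X: performance specs (gain [dB], bandwidth [MHz], power [mW])
# y: circuit parameters (two transistor widths [um], bias current [uA])
rng = np.random.default_rng(0)
X = rng.uniform([40.0, 1.0, 0.1], [80.0, 100.0, 5.0], size=(2000, 3))
y = rng.uniform([1.0, 1.0, 10.0], [50.0, 50.0, 200.0], size=(2000, 3))

# Scale inputs, then fit a small multi-output MLP regressor.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0),
)
model.fit(X, y)

# Inverse design: query with a target spec vector to get candidate parameters.
target_spec = np.array([[60.0, 10.0, 1.0]])  # 60 dB, 10 MHz, 1 mW
print(model.predict(target_spec))
```

The surrogate-model work typically trains the forward direction instead (parameters in, predicted performance out) and uses the cheap surrogate inside an optimizer in place of full SPICE runs; the code structure is the same with X and y swapped.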
This list is automatically generated from the titles and abstracts of the papers on this site.