MOTIF-RF: Multi-template On-chip Transformer Synthesis Incorporating Frequency-domain Self-transfer Learning for RFIC Design Automation
- URL: http://arxiv.org/abs/2511.21970v1
- Date: Wed, 26 Nov 2025 23:03:34 GMT
- Title: MOTIF-RF: Multi-template On-chip Transformer Synthesis Incorporating Frequency-domain Self-transfer Learning for RFIC Design Automation
- Authors: Houbo He, Yizhou Xu, Lei Xia, Yaolong Hu, Fan Cai, Taiyun Chi
- Abstract summary: We develop multi-template machine learning (ML) surrogate models and apply them to the inverse design of transformers (XFMRs) in radio-frequency integrated circuits (RFICs). A new frequency-domain self-transfer learning technique exploits correlations between adjacent frequency bands, yielding around 30%-50% accuracy improvement in S-parameter prediction. The inverse design framework is validated on multiple impedance-matching tasks, all demonstrating fast convergence and trustworthy performance.
- Score: 6.037841678034353
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a systematic study on developing multi-template machine learning (ML) surrogate models and applying them to the inverse design of transformers (XFMRs) in radio-frequency integrated circuits (RFICs). Our study starts with benchmarking four widely used ML architectures, including MLP-, CNN-, UNet-, and GT-based models, using the same datasets across different XFMR topologies. To improve modeling accuracy beyond these baselines, we then propose a new frequency-domain self-transfer learning technique that exploits correlations between adjacent frequency bands, leading to around 30%-50% accuracy improvement in the S-parameters prediction. Building on these models, we further develop an inverse design framework based on the covariance matrix adaptation evolutionary strategy (CMA-ES) algorithm. This framework is validated using multiple impedance-matching tasks, all demonstrating fast convergence and trustworthy performance. These results advance the goal of AI-assisted specs-to-GDS automation for RFICs and provide RFIC designers with actionable tools for integrating AI into their workflows.
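The abstract describes a loop in which an evolutionary optimizer queries a trained ML surrogate instead of a full EM solver until a geometry meeting the impedance-matching spec is found. As a minimal sketch of that control flow, the snippet below uses an illustrative analytic stand-in for the surrogate (`surrogate_s11`, with a hypothetical optimal geometry) and a plain (mu, lambda) evolution strategy standing in for CMA-ES; the paper's actual surrogate models and the full CMA-ES covariance update are not shown in this listing.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_s11(geom):
    """Stand-in surrogate: maps geometry parameters to a matching cost.
    In the paper this role is played by the trained MLP/CNN/UNet/GT model
    predicting S-parameters; here a simple quadratic bowl around a
    hypothetical optimal geometry keeps the sketch self-contained."""
    target = np.array([1.2, 0.4, 2.0])  # illustrative optimum, not from the paper
    return float(np.sum((geom - target) ** 2))  # lower = better match

def evolve(dim=3, pop=32, elite=8, sigma=0.5, iters=60):
    """Simplified (mu, lambda) evolution strategy: sample candidates around
    the current mean, score them with the surrogate, recombine the elites,
    and shrink the search radius. CMA-ES additionally adapts a full
    covariance matrix and step size instead of the fixed decay used here."""
    mean = np.zeros(dim)
    for _ in range(iters):
        cand = mean + sigma * rng.standard_normal((pop, dim))
        scores = np.array([surrogate_s11(c) for c in cand])
        best = cand[np.argsort(scores)[:elite]]  # keep the elite candidates
        mean = best.mean(axis=0)                 # recombine elites
        sigma *= 0.95                            # shrink search radius
    return mean

geom = evolve()
print("recovered geometry:", np.round(geom, 2))
```

Because every candidate is scored by the cheap surrogate rather than an EM simulation, thousands of evaluations per second are feasible, which is what makes the specs-to-geometry search in the paper fast.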
Related papers
- Fast and Accurate RFIC Performance Prediction via Pin Level Graph Neural Networks and Probabilistic Flow [0.5599792629509228]
This work proposes a lightweight, data-efficient, and topology-aware graph neural network (GNN) model for predicting key performance metrics of active RF circuits. Circuits are modeled at the device-terminal level, enabling scalable message passing while reducing data requirements. Experiments on several datasets demonstrate high prediction accuracy, with symmetric mean absolute percentage error (sMAPE) and mean relative error (MRE) averaging 2.40% and 2.91%, respectively.
arXiv Detail & Related papers (2025-08-22T14:06:21Z) - GENIAL: Generative Design Space Exploration via Network Inversion for Low Power Algorithmic Logic Units [4.148469311862123]
We introduce a machine learning-based framework for the automatic generation and optimization of arithmetic units. At the core of GENIAL is a Transformer-based surrogate model trained in two stages. Experiments on large datasets demonstrate that GENIAL is consistently more sample-efficient than other methods.
arXiv Detail & Related papers (2025-07-25T06:34:59Z) - Dynamic Acoustic Model Architecture Optimization in Training for ASR [51.21112094223223]
DMAO is an architecture optimization framework that employs a grow-and-drop strategy to automatically reallocate parameters during training. We evaluate DMAO through experiments with CTC on the LibriSpeech, TED-LIUM-v2, and Switchboard datasets.
arXiv Detail & Related papers (2025-06-16T07:47:34Z) - A Multi-Step Comparative Framework for Anomaly Detection in IoT Data Streams [0.9208007322096533]
Internet of Things (IoT) devices have introduced critical security challenges, underscoring the need for accurate anomaly detection. This paper presents a multi-step evaluation framework assessing the combined impact of preprocessing choices on three machine learning algorithms. Experiments on the IoTID20 dataset show that GBoosting consistently delivers superior accuracy across preprocessing configurations.
arXiv Detail & Related papers (2025-05-22T16:28:22Z) - AI/ML-Based Automatic Modulation Recognition: Recent Trends and Future Possibilities [0.0]
We present a review of high-performance automatic modulation recognition (AMR) models proposed in the literature to classify various Radio Frequency (RF) modulation schemes. We replicated these models and compared their performance in terms of accuracy across a range of signal-to-noise ratios.
arXiv Detail & Related papers (2025-02-07T20:34:04Z) - Enhancing Automatic Modulation Recognition for IoT Applications Using Transformers [2.258538713779673]
This paper presents an innovative approach that leverages Transformer networks, initially designed for natural language processing.
Four tokenization techniques are proposed and explored for creating proper embeddings of RF signals.
Our model achieves an accuracy of 65.75% on the RML2016 dataset and 65.80% on the CSPB.ML.2018+ dataset.
arXiv Detail & Related papers (2024-03-08T21:33:03Z) - Machine Learning Insides OptVerse AI Solver: Design Principles and Applications [74.67495900436728]
We present a comprehensive study on the integration of machine learning (ML) techniques into Huawei Cloud's OptVerse AI solver.
We showcase our methods for generating complex SAT and MILP instances utilizing generative models that mirror the multifaceted structures of real-world problems.
We detail the incorporation of state-of-the-art parameter tuning algorithms which markedly elevate solver performance.
arXiv Detail & Related papers (2024-01-11T15:02:15Z) - Transformers as Statisticians: Provable In-Context Learning with In-Context Algorithm Selection [88.23337313766353]
This work first provides a comprehensive statistical theory for transformers to perform ICL.
We show that transformers can implement a broad class of standard machine learning algorithms in context.
A single transformer can adaptively select different base ICL algorithms.
arXiv Detail & Related papers (2023-06-07T17:59:31Z) - End-to-End Meta-Bayesian Optimisation with Transformer Neural Processes [52.818579746354665]
This paper proposes the first end-to-end differentiable meta-BO framework that generalises neural processes to learn acquisition functions via transformer architectures.
We enable this end-to-end framework with reinforcement learning (RL) to tackle the lack of labelled acquisition data.
arXiv Detail & Related papers (2023-05-25T10:58:46Z) - Model-based Deep Learning Receiver Design for Rate-Splitting Multiple Access [65.21117658030235]
This work proposes a novel design for a practical RSMA receiver based on model-based deep learning (MBDL) methods.
The MBDL receiver is evaluated in terms of uncoded Symbol Error Rate (SER), throughput performance through Link-Level Simulations (LLS) and average training overhead.
Results reveal that the MBDL receiver outperforms the SIC receiver by a significant margin under imperfect CSIR.
arXiv Detail & Related papers (2022-05-02T12:23:55Z) - Optimization-driven Machine Learning for Intelligent Reflecting Surfaces Assisted Wireless Networks [82.33619654835348]
Intelligent reflecting surface (IRS) has been employed to reshape wireless channels by controlling the phase shifts of individual scattering elements.
Due to the large size of scattering elements, the passive beamforming is typically challenged by the high computational complexity.
In this article, we focus on machine learning (ML) approaches for performance maximization in IRS-assisted wireless networks.
arXiv Detail & Related papers (2020-08-29T08:39:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.