Optimizing Photonic Structures with Large Language Model Driven Algorithm Discovery
- URL: http://arxiv.org/abs/2503.19742v1
- Date: Tue, 25 Mar 2025 15:05:25 GMT
- Title: Optimizing Photonic Structures with Large Language Model Driven Algorithm Discovery
- Authors: Haoran Yin, Anna V. Kononova, Thomas Bäck, Niki van Stein
- Abstract summary: We introduce structured prompt engineering tailored to multilayer photonic problems such as Bragg mirror design, ellipsometry inverse analysis, and solar cell antireflection coatings. We explore multiple evolutionary strategies, including (1+1), (1+5), (2+10), and others, to balance exploration and exploitation. Our experiments show that LLM-generated algorithms, developed on small-scale problem instances, can match or surpass established methods.
- Score: 2.2485774453793037
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study how large language models can be used in combination with evolutionary computation techniques to automatically discover optimization algorithms for the design of photonic structures. Building on the Large Language Model Evolutionary Algorithm (LLaMEA) framework, we introduce structured prompt engineering tailored to multilayer photonic problems such as Bragg mirror design, ellipsometry inverse analysis, and solar cell antireflection coatings. We systematically explore multiple evolutionary strategies, including (1+1), (1+5), (2+10), and others, to balance exploration and exploitation. Our experiments show that LLM-generated algorithms, developed on small-scale problem instances, can match or surpass established methods such as quasi-oppositional differential evolution on large-scale, realistic problem instances. Notably, LLaMEA's self-debugging mutation loop, augmented by automatically extracted problem-specific insights, achieves strong anytime performance and reliable convergence across diverse problem scales. This work demonstrates the feasibility of domain-focused LLM prompts and evolutionary approaches in solving optical design tasks, paving the way for rapid, automated photonic inverse design.
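To make the search loop concrete, the sketch below illustrates a (1+1)-style iteration in which an LLM acts as the mutation operator over candidate optimizer code and elitist selection keeps the better candidate. It is a minimal illustration under stated assumptions, not the authors' implementation: the LLM call is mocked, the benchmark is a toy objective rather than a photonic instance, and the names (`mock_llm_generate`, `evaluate`, `toy_objective`) are hypothetical placeholders.

```python
import random

# Minimal (1+1)-style sketch of LLM-driven algorithm discovery (hypothetical, not LLaMEA's API).
# A mocked "LLM" emits optimizer source code; the outer loop mutates it and keeps the better variant.

def toy_objective(x):
    """Stand-in for a small-scale photonic benchmark (lower is better)."""
    return sum((xi - 0.5) ** 2 for xi in x)

def mock_llm_generate(step: float) -> str:
    """Mocked LLM call: returns the source of a simple random-search optimizer.
    In LLaMEA the prompt would also carry problem-specific insights and any error traceback."""
    return (
        "def optimize(f, dim, budget):\n"
        "    import random\n"
        "    best = [random.random() for _ in range(dim)]\n"
        "    best_f = f(best)\n"
        "    for _ in range(budget):\n"
        f"        cand = [b + random.gauss(0, {step}) for b in best]\n"
        "        cf = f(cand)\n"
        "        if cf < best_f:\n"
        "            best, best_f = cand, cf\n"
        "    return best_f\n"
    )

def evaluate(algorithm_code: str) -> float:
    """Compile the generated optimizer and score it on the toy benchmark."""
    namespace: dict = {}
    exec(algorithm_code, namespace)  # runtime errors here would be fed back to the LLM
    return namespace["optimize"](toy_objective, dim=5, budget=200)

# (1+1) outer loop: mutate the algorithm itself; elitist selection keeps the better one.
random.seed(0)
parent_step = 0.5
parent_score = evaluate(mock_llm_generate(parent_step))
for _ in range(10):
    child_step = parent_step * random.choice([0.5, 2.0])  # "mutation" of the algorithm
    child_score = evaluate(mock_llm_generate(child_step))
    if child_score <= parent_score:
        parent_step, parent_score = child_step, child_score
print(f"best step size {parent_step:.3f}, benchmark score {parent_score:.5f}")
```

The (1+5) and (2+10) strategies mentioned above differ only in how many children are sampled per generation and how many parents survive; the selection step is otherwise the same.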
Related papers
- From Understanding to Excelling: Template-Free Algorithm Design through Structural-Functional Co-Evolution [39.42526347710991]
Large language models (LLMs) have greatly accelerated the automation of algorithm generation and optimization. We introduce an end-to-end algorithm generation and optimization framework based on LLMs. Our approach utilizes the deep semantic understanding of LLMs to convert natural language requirements or human-authored papers into code solutions.
arXiv Detail & Related papers (2025-03-13T08:26:18Z) - Bridging Visualization and Optimization: Multimodal Large Language Models on Graph-Structured Combinatorial Optimization [56.17811386955609]
Graph-structured challenges are inherently difficult due to their nonlinear and intricate nature. In this study, we propose transforming graphs into images to preserve their higher-order structural features accurately. By combining this paradigm, powered by multimodal large language models, with simple search techniques, we aim to develop a novel and effective framework.
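As a rough illustration of the graph-to-image step described above, the snippet below renders a toy graph instance to a PNG that could then be shown to a multimodal LLM alongside a textual objective. The layout, styling, and file name are illustrative assumptions, not the paper's pipeline.

```python
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical graph-to-image step: draw a combinatorial instance as a picture
# that a multimodal LLM could reason over (styling choices are illustrative).
G = nx.erdos_renyi_graph(n=12, p=0.3, seed=42)      # toy problem instance
pos = nx.spring_layout(G, seed=42)                  # 2-D embedding for drawing
nx.draw(G, pos, with_labels=True, node_color="lightsteelblue", edge_color="gray")
plt.savefig("graph_instance.png", dpi=200, bbox_inches="tight")
plt.close()
# The saved image would be passed to the multimodal model together with a prompt
# describing the optimization objective (e.g. find a large independent set).
```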
arXiv Detail & Related papers (2025-01-21T08:28:10Z) - Deep Insights into Automated Optimization with Large Language Models and Evolutionary Algorithms [3.833708891059351]
Large Language Models (LLMs) and Evolutionary Algorithms (EAs) offer a promising new approach to overcoming limitations and making optimization more automated.
LLMs act as dynamic agents that can generate, refine, and interpret optimization strategies.
EAs efficiently explore complex solution spaces through evolutionary operators.
arXiv Detail & Related papers (2024-10-28T09:04:49Z) - Cliqueformer: Model-Based Optimization with Structured Transformers [102.55764949282906]
Large neural networks excel at prediction tasks, but their application to design problems, such as protein engineering or materials discovery, requires solving offline model-based optimization (MBO) problems. We present Cliqueformer, a transformer-based architecture that learns the black-box function's structure through functional graphical models (FGM). Across various domains, including chemical and genetic design tasks, Cliqueformer demonstrates superior performance compared to existing methods.
arXiv Detail & Related papers (2024-10-17T00:35:47Z) - Large Language Model Aided Multi-objective Evolutionary Algorithm: a Low-cost Adaptive Approach [4.442101733807905]
This study proposes a new framework that combines a large language model (LLM) with traditional evolutionary algorithms to enhance the algorithm's search capability and generalization performance.
We leverage an auxiliary evaluation function and automated prompt construction within the adaptive mechanism to flexibly adjust the utilization of the LLM.
arXiv Detail & Related papers (2024-10-03T08:37:02Z) - When Large Language Models Meet Evolutionary Algorithms: Potential Enhancements and Challenges [50.280704114978384]
Pre-trained large language models (LLMs) exhibit powerful capabilities for generating natural text. Evolutionary algorithms (EAs) can discover diverse solutions to complex real-world problems.
arXiv Detail & Related papers (2024-01-19T05:58:30Z) - Algorithm Evolution Using Large Language Model [18.03090066194074]
We propose a novel approach called Algorithm Evolution using Large Language Model (AEL).
AEL performs algorithm-level evolution without model training.
Human effort and requirements for domain knowledge can be significantly reduced.
arXiv Detail & Related papers (2023-11-26T09:38:44Z) - Learning Controllable Adaptive Simulation for Multi-resolution Physics [86.8993558124143]
We introduce Learning controllable Adaptive simulation for Multi-resolution Physics (LAMP) as the first full deep learning-based surrogate model.
LAMP consists of a Graph Neural Network (GNN) for learning the forward evolution, and a GNN-based actor-critic for learning the policy of spatial refinement and coarsening.
We demonstrate that LAMP outperforms state-of-the-art deep learning surrogate models and can adaptively trade off computation to reduce long-term prediction error.
arXiv Detail & Related papers (2023-05-01T23:20:27Z) - TMM-Fast: A Transfer Matrix Computation Package for Multilayer Thin-Film Optimization [62.997667081978825]
An advanced thin-film structure can consist of multiple materials with different thicknesses and numerous layers.
Design and optimization of complex thin-film structures with multiple variables is a computationally heavy problem that is still under active research.
We propose the Python package TMM-Fast, which enables parallelized computation of the reflection and transmission of light through multilayer thin films at different angles of incidence and wavelengths.
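For context, the reflectance computation that such packages vectorize is the textbook transfer-matrix method. The sketch below is a plain normal-incidence implementation reconstructed from standard formulas; it is not the tmm-fast API, and the example stack (indices, thicknesses, wavelength grid) is an assumption for illustration.

```python
import numpy as np

# Textbook transfer-matrix method (TMM) at normal incidence; not the tmm-fast API.
def multilayer_reflectance(n_layers, d_layers, n_ambient, n_substrate, wavelengths):
    """Reflectance of a thin-film stack for each vacuum wavelength (d in the same units)."""
    R = np.empty(len(wavelengths))
    for k, lam in enumerate(wavelengths):
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            delta = 2 * np.pi * n * d / lam          # phase thickness of the layer
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_substrate])      # boundary fields at the front interface
        r = (n_ambient * B - C) / (n_ambient * B + C)  # amplitude reflection coefficient
        R[k] = abs(r) ** 2
    return R

# Illustrative quarter-wave Bragg mirror centered at 600 nm (e.g. TiO2/SiO2 pairs on glass).
lams = np.linspace(400e-9, 800e-9, 201)
n_stack = [2.3, 1.45] * 8
d_stack = [600e-9 / (4 * n) for n in n_stack]        # quarter-wave thicknesses
print(multilayer_reflectance(n_stack, d_stack, 1.0, 1.52, lams).max())
```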
arXiv Detail & Related papers (2021-11-24T14:47:37Z) - A Framework for Discovering Optimal Solutions in Photonic Inverse Design [0.0]
Photonic inverse design has emerged as an indispensable engineering tool for complex optical systems.
Finding solutions approaching the global optimum may present a computationally intractable task.
We develop a framework that allows expediting the search for solutions close to the global optimum in complex optimization spaces.
arXiv Detail & Related papers (2021-06-03T22:11:03Z) - Investigating Bi-Level Optimization for Learning and Vision from a Unified Perspective: A Survey and Beyond [114.39616146985001]
In machine learning and computer vision, despite different motivations and mechanisms, many complex problems contain a series of closely related subproblems.
In this paper, we first uniformly express these complex learning and vision problems from the perspective of Bi-Level Optimization (BLO).
Then we construct a value-function-based single-level reformulation and establish a unified algorithmic framework to understand and formulate mainstream gradient-based BLO methodologies.
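As a concrete toy of the gradient-based BLO setting this entry surveys, the snippet below solves a one-dimensional bilevel problem whose inner solution has a closed form, so the hypergradient of the outer objective can be written down directly. The specific objective, constants, and step size are illustrative assumptions, not the survey's framework.

```python
# Toy gradient-based bilevel optimization (illustrative only).
# Inner problem:  w*(lam) = argmin_w (w - a)^2 + lam * w^2   ->   w*(lam) = a / (1 + lam)
# Outer problem:  minimize_lam  F(lam) = (w*(lam) - target)^2
a, target = 2.0, 0.5
lam = 1.0
for _ in range(500):
    w_star = a / (1 + lam)                       # solve the inner problem in closed form
    dw_dlam = -a / (1 + lam) ** 2                # derivative of the inner solution
    grad_F = 2 * (w_star - target) * dw_dlam     # chain rule: hypergradient w.r.t. lam
    lam = max(lam - 0.5 * grad_F, 0.0)           # projected gradient step (lam >= 0)
print(f"lam = {lam:.2f}, w*(lam) = {a / (1 + lam):.3f}")  # w*(lam) approaches the target 0.5
```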
arXiv Detail & Related papers (2021-01-27T16:20:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.