Generative Pre-Trained Transformers for Biologically Inspired Design
- URL: http://arxiv.org/abs/2204.09714v1
- Date: Thu, 31 Mar 2022 11:13:22 GMT
- Title: Generative Pre-Trained Transformers for Biologically Inspired Design
- Authors: Qihao Zhu, Xinyu Zhang, Jianxi Luo
- Abstract summary: This paper proposes a generative design approach based on a pre-trained language model (PLM).
Three types of design concept generators are identified and fine-tuned from the PLM according to the looseness of the problem space representation.
The approach is then tested via a case study in which the fine-tuned models are applied to generate and evaluate lightweight flying car concepts inspired by nature.
- Score: 13.852758740799452
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Biological systems in nature have evolved over millions of years to adapt to and survive in their environments. Many of the features they have developed can inspire and benefit solutions to technical problems in modern industries. This leads to a novel form of design-by-analogy called bio-inspired design (BID). Although BID has been proven beneficial as a design method, the gap between biology and engineering continues to hinder designers from applying it effectively. We therefore explore recent advances in artificial intelligence (AI) for a computational approach that bridges this gap. This paper proposes a generative design approach based on a pre-trained language model (PLM) to automatically retrieve and map biological analogies and generate BID concepts in natural language. The latest generative pre-trained transformer, GPT-3, is used as the base PLM. Three types of design concept generators are identified and fine-tuned from the PLM according to the looseness of the problem space representation. Machine evaluators are also fine-tuned to assess the correlation between the domains within the generated BID concepts. The approach is then tested via a case study in which the fine-tuned models are applied to generate and evaluate lightweight flying car concepts inspired by nature. The results show that our approach can generate BID concepts with good performance.
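To make the generation pipeline concrete, the sketch below shows one way fine-tuning data for the three generator types could be laid out. The looseness labels, prompt templates, file name, and example record are illustrative assumptions rather than the authors' released format; the only grounded detail is that GPT-3 fine-tuning at the time consumed JSONL records with prompt and completion fields.

```python
# Minimal, hypothetical sketch of fine-tuning data preparation for the three
# BID concept generators. All templates, labels, and the example record are
# assumptions for illustration, not the paper's released data.
import json

SEPARATOR = "\n###\n"  # assumed marker ending each prompt

# Three looseness levels of the problem-space representation (names paraphrased
# from the abstract, not official identifiers):
#   "open-ended"  -- only a broad design goal is given
#   "problem-led" -- the technical problem is stated explicitly
#   "analogy-led" -- the problem and a biological analogue are both given
def make_record(problem: str, biology: str, concept: str, looseness: str) -> dict:
    """Build one prompt/completion training example for a generator."""
    if looseness == "open-ended":
        prompt = f"Design goal: {problem}{SEPARATOR}"
    elif looseness == "problem-led":
        prompt = f"Technical problem: {problem}\nGenerate a bio-inspired design concept.{SEPARATOR}"
    elif looseness == "analogy-led":
        prompt = (f"Technical problem: {problem}\n"
                  f"Biological system: {biology}{SEPARATOR}")
    else:
        raise ValueError(f"unknown looseness level: {looseness}")
    # Legacy GPT-3 fine-tuning expected a leading space and a stop sequence
    # in the completion text.
    return {"prompt": prompt, "completion": " " + concept + " END"}

# One invented training pair in the spirit of the flying-car case study.
example = make_record(
    problem="reduce the structural weight of a flying car",
    biology="the hollow, strut-reinforced bones of birds",
    concept=("A flying-car frame built from hollow struts with internal "
             "lattice reinforcement, mimicking avian bone structure to cut "
             "weight while retaining stiffness."),
    looseness="analogy-led",
)

with open("bid_generator_analogy_led.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")

# Each JSONL file would be submitted as a separate GPT-3 fine-tuning job,
# yielding one generator per looseness level. A further fine-tuned model,
# prompted with a generated concept and asked to rate the biology-to-engineering
# correlation, would play the machine-evaluator role described in the abstract.
```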
Related papers
- Cliqueformer: Model-Based Optimization with Structured Transformers [102.55764949282906]
We develop a model that learns the structure of an MBO task and empirically leads to improved designs.
We evaluate Cliqueformer on various tasks, ranging from high-dimensional black-box functions to real-world tasks of chemical and genetic design.
arXiv Detail & Related papers (2024-10-17T00:35:47Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Generative Structural Design Integrating BIM and Diffusion Model [4.619347136761891]
This study introduces building information modeling (BIM) into intelligent structural design and establishes a structural design pipeline integrating BIM and generative AI.
In terms of generation framework, inspired by the process of human drawing, a novel 2-stage generation framework is proposed to reduce the generation difficulty for AI models.
In terms of generative AI tools adopted, diffusion models (DMs) are introduced to replace widely used generative adversarial network (GAN)-based models, and a novel physics-based conditional diffusion model (PCDM) is proposed to consider different design prerequisites.
arXiv Detail & Related papers (2023-11-07T15:05:19Z)
- A Neuro-mimetic Realization of the Common Model of Cognition via Hebbian Learning and Free Energy Minimization [55.11642177631929]
Large neural generative models are capable of synthesizing semantically rich passages of text or producing complex images.
We discuss the COGnitive Neural GENerative system, an architecture that realizes the Common Model of Cognition via Hebbian learning and free energy minimization.
arXiv Detail & Related papers (2023-10-14T23:28:48Z)
- Learning Transferable Conceptual Prototypes for Interpretable Unsupervised Domain Adaptation [79.22678026708134]
In this paper, we propose an inherently interpretable method named Transferable Conceptual Prototype Learning (TCPL).
To achieve this goal, we design a hierarchically prototypical module that transfers categorical basic concepts from the source domain to the target domain and learns domain-shared prototypes for explaining the underlying reasoning process.
Comprehensive experiments show that the proposed method can not only provide effective and intuitive explanations but also outperform previous state-of-the-art methods.
arXiv Detail & Related papers (2023-10-12T06:36:41Z)
- A Systematic Survey in Geometric Deep Learning for Structure-based Drug Design [63.30166298698985]
Structure-based drug design (SBDD) utilizes the three-dimensional geometry of proteins to identify potential drug candidates.
Recent developments in geometric deep learning, focusing on the integration and processing of 3D geometric data, have greatly advanced the field of structure-based drug design.
arXiv Detail & Related papers (2023-06-20T14:21:58Z)
- Online simulator-based experimental design for cognitive model selection [74.76661199843284]
We propose BOSMOS: an approach to experimental design that can select between computational models without tractable likelihoods.
In simulated experiments, we demonstrate that the proposed BOSMOS technique can accurately select models in up to two orders of magnitude less time than existing likelihood-free inference (LFI) alternatives.
arXiv Detail & Related papers (2023-03-03T21:41:01Z)
- Biologically Inspired Design Concept Generation Using Generative Pre-Trained Transformers [13.852758740799452]
This paper proposes a generative design approach based on a generative pre-trained language model (PLM).
Three types of design concept generators are identified and fine-tuned from the PLM according to the looseness of the problem space representation.
The approach is evaluated and then employed in a real-world project of designing lightweight flying cars.
arXiv Detail & Related papers (2022-12-26T16:06:04Z)
- Generative Transformers for Design Concept Generation [7.807713821263175]
This study explores recent advances in natural language generation (NLG) techniques in the artificial intelligence (AI) field.
A novel approach utilizing the generative pre-trained transformer (GPT) is proposed to leverage the knowledge and reasoning from textual data.
Three concept generation tasks are defined to leverage different knowledge and reasoning: domain knowledge synthesis, problem-driven synthesis, and analogy-driven synthesis.
arXiv Detail & Related papers (2022-11-07T11:29:10Z)
- Generative Design Ideation: A Natural Language Generation Approach [7.807713821263175]
This paper aims to explore a generative approach for knowledge-based design ideation by applying the latest pre-trained language models in artificial intelligence (AI).
The AI-generated ideas are not only expressed in concise and understandable language but also synthesize the target design with external knowledge sources at a controllable knowledge distance.
arXiv Detail & Related papers (2022-03-28T08:11:29Z)
- Energy Decay Network (EDeN) [0.0]
The framework attempts to develop a genetic transfer of experience through potential structural expressions.
Successful routes are defined by the stability of the spike distribution per epoch.
arXiv Detail & Related papers (2021-03-10T23:17:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.