Geotechnical Parrot Tales (GPT): Harnessing Large Language Models in
geotechnical engineering
- URL: http://arxiv.org/abs/2304.02138v3
- Date: Wed, 21 Jun 2023 15:55:24 GMT
- Title: Geotechnical Parrot Tales (GPT): Harnessing Large Language Models in
geotechnical engineering
- Authors: Krishna Kumar
- Abstract summary: GPT models can generate plausible-sounding but false outputs, known as hallucinations.
By integrating GPT into geotechnical engineering, professionals can streamline their work and develop sustainable and resilient infrastructure systems.
- Score: 2.132096006921048
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The widespread adoption of large language models (LLMs), such as OpenAI's
ChatGPT, could revolutionize various industries, including geotechnical
engineering. However, GPT models can sometimes generate plausible-sounding but
false outputs, known as hallucinations. In this article, we discuss the
importance of prompt engineering in mitigating these risks and harnessing the
full potential of GPT for geotechnical applications. We explore the challenges
and pitfalls associated with LLMs and highlight the role of context in ensuring
accurate and valuable responses. Furthermore, we examine the development of
context-specific search engines and the potential of LLMs to become a natural
interface for complex tasks, such as data analysis and design. We also develop
a unified interface using natural language to handle complex geotechnical
engineering tasks and data analysis. By integrating GPT into geotechnical
engineering workflows, professionals can streamline their work and develop
sustainable and resilient infrastructure systems for the future.
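
As a minimal sketch of the prompt-engineering idea described in the abstract (not code from the paper), the snippet below supplies geotechnical context alongside a query so the model is anchored to the provided site data rather than guessing. It assumes the `openai` Python client (v1 or later) with an API key in the OPENAI_API_KEY environment variable; the model name, system prompt, and site data are illustrative placeholders.

```python
# Minimal illustration (not from the paper): prompt engineering for a
# geotechnical query, anchoring the model to supplied site data.
# Assumes the `openai` package (v1+) and OPENAI_API_KEY in the environment;
# the model name and site data below are placeholders.
from openai import OpenAI

client = OpenAI()

system_prompt = (
    "You are a geotechnical engineering assistant. Base every answer only on "
    "the site data provided by the user. If the data are insufficient, say so "
    "explicitly instead of guessing, and state the governing equations you use."
)

site_data = (
    "Site investigation summary:\n"
    "- Soil: medium-dense sand, unit weight 18 kN/m^3\n"
    "- SPT N60 = 22 at the founding depth of 1.5 m\n"
    "- Groundwater table 4.0 m below ground surface\n"
)

question = "Estimate the allowable bearing pressure for a 2 m square footing."

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": site_data + "\n" + question},
    ],
    temperature=0.2,  # low temperature favours conservative, repeatable output
)

print(response.choices[0].message.content)
```

Grounding the request in explicit site data and a low temperature is one simple way to reduce the hallucination risk the abstract highlights; a production workflow would still validate the returned calculation against established geotechnical design procedures.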
Related papers
- WorldGPT: Empowering LLM as Multimodal World Model [51.243464216500975]
We introduce WorldGPT, a generalist world model built upon a Multimodal Large Language Model (MLLM).
WorldGPT acquires an understanding of world dynamics through analyzing millions of videos across various domains.
We conduct evaluations on WorldNet, a multimodal state transition prediction benchmark.
arXiv Detail & Related papers (2024-04-28T14:42:02Z)
- GeoGalactica: A Scientific Large Language Model in Geoscience [95.15911521220052]
Large language models (LLMs) have achieved huge success for their general knowledge and ability to solve a wide spectrum of tasks in natural language processing (NLP).
We specialize an LLM into geoscience, by further pre-training the model with a vast amount of texts in geoscience, as well as supervised fine-tuning (SFT) the resulting model with our custom collected instruction tuning dataset.
We train GeoGalactica on a geoscience-related text corpus containing 65 billion tokens, the largest geoscience-specific text corpus to date.
Then we fine-tune the model with 1 million pairs of instruction-tuning data.
arXiv Detail & Related papers (2023-12-31T09:22:54Z)
- Future-proofing geotechnics workflows: accelerating problem-solving with large language models [2.8414492326907577]
This paper delves into the innovative application of Large Language Models in geotechnical engineering, as explored in a hands-on workshop held in Tokyo, Japan.
The paper discusses the potential of LLMs to transform geotechnical engineering practices, highlighting their proficiency in handling a range of tasks from basic data analysis to complex problem-solving.
arXiv Detail & Related papers (2023-12-14T05:17:27Z)
- Core Building Blocks: Next Gen Geo Spatial GPT Application [0.0]
This paper introduces MapGPT, which aims to bridge the gap between natural language understanding and spatial data analysis.
MapGPT enables more accurate and contextually aware responses to location-based queries.
arXiv Detail & Related papers (2023-10-17T06:59:31Z)
- SoTaNa: The Open-Source Software Development Assistant [81.86136560157266]
SoTaNa is an open-source software development assistant.
It generates high-quality instruction-based data for the domain of software engineering.
It employs a parameter-efficient fine-tuning approach to enhance the open-source foundation model, LLaMA.
arXiv Detail & Related papers (2023-08-25T14:56:21Z)
- GeoGPT: Understanding and Processing Geospatial Tasks through An Autonomous GPT [6.618846295332767]
Decision-makers in GIS need to combine a series of spatial algorithms and operations to solve geospatial tasks.
We develop a new framework called GeoGPT that can conduct geospatial data collection, processing, and analysis in an autonomous manner.
arXiv Detail & Related papers (2023-07-16T03:03:59Z)
- GPT4GEO: How a Language Model Sees the World's Geography [31.215906518290883]
We investigate the degree to which GPT-4 has acquired factual geographic knowledge.
This knowledge is especially important for applications that involve geographic data.
We provide a broad characterisation of what GPT-4 knows about the world, highlighting both potentially surprising capabilities and limitations.
arXiv Detail & Related papers (2023-05-30T18:28:04Z)
- Generative Pre-trained Transformer: A Comprehensive Review on Enabling Technologies, Potential Applications, Emerging Challenges, and Future Directions [11.959434388955787]
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the domain of natural language processing.
GPT is based on the transformer architecture, a deep neural network designed for natural language processing tasks.
arXiv Detail & Related papers (2023-05-11T19:20:38Z)
- Just Tell Me: Prompt Engineering in Business Process Management [63.08166397142146]
GPT-3 and other language models (LMs) can effectively address various natural language processing (NLP) tasks.
We argue that prompt engineering can help bring the capabilities of LMs to BPM research.
arXiv Detail & Related papers (2023-04-14T14:55:19Z)
- A General Purpose Neural Architecture for Geospatial Systems [142.43454584836812]
We present a roadmap towards the construction of a general-purpose neural architecture (GPNA) with a geospatial inductive bias.
We envision how such a model may facilitate cooperation between members of the community.
arXiv Detail & Related papers (2022-11-04T09:58:57Z)
- Robust Conversational AI with Grounded Text Generation [77.56950706340767]
GTG is a hybrid model which uses a large-scale Transformer neural network as its backbone.
It generates responses grounded in dialog belief state and real-world knowledge for task completion.
arXiv Detail & Related papers (2020-09-07T23:49:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.