From English to Signal Temporal Logic
- URL: http://arxiv.org/abs/2109.10294v1
- Date: Tue, 21 Sep 2021 16:13:29 GMT
- Title: From English to Signal Temporal Logic
- Authors: Jie He, Ezio Bartocci, Dejan Ničković, Haris Isakovic and Radu Grosu
- Abstract summary: We propose DeepSTL, a tool and technique for the translation of informal requirements, given as free English sentences, into Signal Temporal Logic (STL), a formal specification language for cyber-physical systems.
A major challenge in devising such a translator is the lack of publicly available informal requirements and formal specifications.
In the second step, we use a state-of-the-art transformer-based neural translation technique to train an accurate attentional translator from English to STL.
- Score: 7.6398837478968025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Formal methods provide very powerful tools and techniques for the design and
analysis of complex systems. However, their practical application remains
limited, due to the widely held belief that formal methods require
extensive expertise and a steep learning curve. Writing correct formal
specifications in the form of logical formulas is still considered a
difficult and error-prone task.
In this paper we propose DeepSTL, a tool and technique for the translation of
informal requirements, given as free English sentences, into Signal Temporal
Logic (STL), a formal specification language for cyber-physical systems, used
both by academia and advanced research labs in industry. A major challenge in
devising such a translator is the lack of publicly available informal
requirements and formal specifications. We propose a two-step workflow to
address this challenge. We first design a grammar-based generation technique of
synthetic data, where each output is a random STL formula and its associated
set of possible English translations. In the second step, we use a
state-of-the-art transformer-based neural translation technique to train an
accurate attentional translator from English to STL. The experimental results
show high translation quality for the patterns of English requirements on which
the model was trained, making this workflow a promising starting point for
more complex translation tasks.
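As a rough illustration of the first step, the sketch below samples a random STL formula together with one possible English paraphrase. It is a minimal, hypothetical example in Python: the signal names, operators, time bounds, and phrase templates are assumptions made for illustration, not the grammar or template set actually used by DeepSTL.

```python
import random

# Hypothetical sketch of grammar-based synthetic data generation:
# sample a random STL formula and pair it with a templated English
# paraphrase. Grammar, signals, and wording are illustrative only.

SIGNALS = ["speed", "temperature", "pressure"]

def atom():
    """Sample an atomic predicate such as '(speed > 3.0)' plus a paraphrase."""
    sig = random.choice(SIGNALS)
    op, word = random.choice([(">", "exceeds"), ("<", "stays below")])
    val = round(random.uniform(0, 10), 1)
    return f"({sig} {op} {val})", f"{sig} {word} {val}"

def formula(depth=2):
    """Recursively sample an STL formula and one possible English rendering."""
    if depth == 0 or random.random() < 0.3:
        return atom()
    a, b = random.randint(0, 5), random.randint(6, 10)
    sub, sub_en = formula(depth - 1)
    kind = random.choice(["G", "F", "and"])
    if kind == "G":   # bounded 'always'
        return f"G[{a},{b}] {sub}", f"at all times between {a} and {b}, {sub_en}"
    if kind == "F":   # bounded 'eventually'
        return f"F[{a},{b}] {sub}", f"at some point between {a} and {b}, {sub_en}"
    other, other_en = formula(depth - 1)
    return f"({sub} and {other})", f"{sub_en}, and {other_en}"

if __name__ == "__main__":
    for _ in range(3):
        stl, english = formula()
        print(f"STL:     {stl}\nEnglish: {english}\n")
```

Pairs generated this way would then serve as parallel training data for the second step, with the English sentence as the source sequence and the STL formula as the target of a transformer-based sequence-to-sequence model.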
Related papers
- An Agile Formal Specification Language Design Based on K Framework [11.042686307366818]
This paper introduces an Agile Formal Specification Language (ASL)
The design of ASL incorporates agile design principles, making the writing of formal specifications simpler, more efficient, and scalable.
A specification translation algorithm is developed, capable of converting ASL into K formal specification language that can be executed for verification.
arXiv Detail & Related papers (2024-04-29T09:00:50Z) - On Conditional and Compositional Language Model Differentiable Prompting [75.76546041094436]
Prompts have been shown to be an effective method to adapt a frozen Pretrained Language Model (PLM) to perform well on downstream tasks.
We propose a new model, Prompt Production System (PRopS), which learns to transform task instructions or input metadata, into continuous prompts.
arXiv Detail & Related papers (2023-07-04T02:47:42Z) - Data-Efficient Learning of Natural Language to Linear Temporal Logic
Translators for Robot Task Specification [6.091096843566857]
We present a learning-based approach for translating natural language commands to LTL specifications with very limited human-labeled training data.
This is in stark contrast to existing natural-language-to-LTL translators, which require large human-labeled datasets.
We show that we can translate natural language commands at 75% accuracy with far less human data.
arXiv Detail & Related papers (2023-03-09T00:09:58Z) - nl2spec: Interactively Translating Unstructured Natural Language to
Temporal Logics with Large Language Models [3.1143846686797314]
We present nl2spec, a framework for applying Large Language Models (LLMs) to derive formal specifications from unstructured natural language.
We introduce a new methodology to detect and resolve the inherent ambiguity of system requirements in natural language.
Users iteratively add, delete, and edit these sub-translations to amend erroneous formalizations, which is easier than manually redrafting the entire formalization.
arXiv Detail & Related papers (2023-03-08T20:08:53Z) - The Whole Truth and Nothing But the Truth: Faithful and Controllable
Dialogue Response Generation with Dataflow Transduction and Constrained
Decoding [65.34601470417967]
We describe a hybrid architecture for dialogue response generation that combines the strengths of neural language modeling and rule-based generation.
Our experiments show that this system outperforms both rule-based and learned approaches in human evaluations of fluency, relevance, and truthfulness.
arXiv Detail & Related papers (2022-09-16T09:00:49Z) - Interactive Learning from Natural Language and Demonstrations using
Signal Temporal Logic [5.88797764615148]
Natural language (NL) is ambiguous, yet real-world tasks and their safety requirements need to be communicated unambiguously.
Signal Temporal Logic (STL) is a formal logic that can serve as a versatile, expressive, and unambiguous formal language to describe robotic tasks.
We propose DIALOGUESTL, an interactive approach for learning correct and concise STL formulas from (often) ambiguous NL descriptions.
arXiv Detail & Related papers (2022-07-01T19:08:43Z) - A Template-based Method for Constrained Neural Machine Translation [100.02590022551718]
We propose a template-based method that can yield results with high translation quality and match accuracy while maintaining decoding speed.
The generation and derivation of the template can be learned through one sequence-to-sequence training framework.
Experimental results show that the proposed template-based methods can outperform several representative baselines in lexically and structurally constrained translation tasks.
arXiv Detail & Related papers (2022-05-23T12:24:34Z) - Detecting Text Formality: A Study of Text Classification Approaches [78.11745751651708]
This work presents, to our knowledge, the first systematic study of formality detection methods based on statistical, neural, and Transformer-based machine learning methods.
We conducted three types of experiments -- monolingual, multilingual, and cross-lingual.
The study shows that the Char BiLSTM model outperforms Transformer-based ones for the monolingual and multilingual formality classification tasks.
arXiv Detail & Related papers (2022-04-19T16:23:07Z) - Modeling Target-Side Morphology in Neural Machine Translation: A
Comparison of Strategies [72.56158036639707]
Morphologically rich languages pose difficulties to machine translation.
A large number of differently inflected word surface forms entails a larger vocabulary.
Some inflected forms of infrequent terms typically do not appear in the training corpus.
Linguistic agreement requires the system to correctly match the grammatical categories between inflected word forms in the output sentence.
arXiv Detail & Related papers (2022-03-25T10:13:20Z) - SML: a new Semantic Embedding Alignment Transformer for efficient
cross-lingual Natural Language Inference [71.57324258813674]
The ability of Transformers to perform a variety of tasks with precision, such as question answering, Natural Language Inference (NLI) or summarisation, has enabled them to be ranked among the best paradigms for addressing these kinds of tasks at present.
NLI is one of the best scenarios in which to test these architectures, due to the knowledge required to understand complex sentences and establish a relation between a hypothesis and a premise.
In this paper, we propose a new architecture, siamese multilingual transformer, to efficiently align multilingual embeddings for Natural Language Inference.
arXiv Detail & Related papers (2021-03-17T13:23:53Z)