A Simple Trace Semantics for Asynchronous Sequence Diagrams
- URL: http://arxiv.org/abs/2501.10981v1
- Date: Sun, 19 Jan 2025 08:09:38 GMT
- Title: A Simple Trace Semantics for Asynchronous Sequence Diagrams
- Authors: David Faitelson, Shmuel Tyszberowicz
- Abstract summary: Sequence diagrams are a popular technique for describing interactions between software entities. It is impossible to deduce a single interpretation for the notation's semantics. In this work we describe a simple semantics based on the theory of regular languages.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Sequence diagrams are a popular technique for describing interactions between software entities. However, because the OMG's UML standard is not based on a rigorous mathematical structure, it is impossible to deduce a single interpretation for the notation's semantics, nor to understand precisely how its different fragments interact. While many semantics have been suggested in the literature, they are too mathematically demanding for most software engineers, and they are often incomplete, especially in dealing with the semantics of lifeline creation and deletion. In this work we describe a simple semantics based on the theory of regular languages, a mathematical theory that is a standard part of every undergraduate computer science curriculum. Our semantics covers all the major compositional fragments as well as the creation and deletion of lifelines.
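The regular-language idea in the abstract can be illustrated with a minimal sketch. This is our own illustration, not the paper's actual construction: each message is treated as an alphabet symbol, a fragment denotes a set of traces, and UML combined fragments map to regular operations (sequencing to concatenation, `alt` to union, `loop` to bounded repetition; the bound is an illustrative simplification).

```python
# Minimal sketch (assumed construction, not the paper's definitions):
# a trace is a tuple of message labels; a fragment denotes a finite
# set of traces, composed with regular-language operations.

def seq(l1, l2):
    """Sequential composition: concatenate every trace of l1 with every trace of l2."""
    return {a + b for a in l1 for b in l2}

def alt(l1, l2):
    """Alternative (alt) fragment: union of the two trace languages."""
    return l1 | l2

def loop(l, n):
    """Loop fragment, unrolled to 0..n iterations (bounded for illustration)."""
    result = {()}            # the empty trace covers zero iterations
    current = {()}
    for _ in range(n):
        current = seq(current, l)
        result |= current
    return result

def msg(name):
    """A single message denotes the one-trace language {(name,)}."""
    return {(name,)}

# A diagram sending m1, then either m2 or m3, then m4 at most twice:
traces = seq(seq(msg("m1"), alt(msg("m2"), msg("m3"))), loop(msg("m4"), 2))
```

Here `traces` contains six traces, e.g. `("m1", "m2")` and `("m1", "m3", "m4", "m4")`, matching the intuitive reading of the diagram.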
Related papers
- Towards Universal Semantics With Large Language Models [4.873927154453253]
We present the first study of using large language models (LLMs) to generate Natural Semantic Metalanguage explications. Our 1B and 8B models outperform GPT-4o in producing accurate, cross-translatable explications.
arXiv Detail & Related papers (2025-05-17T00:11:58Z)
- Fundamental Principles of Linguistic Structure are Not Represented by o3 [3.335047764053173]
The o3 model fails to generalize basic phrase structure rules.
It fails to correctly rate and explain acceptability dynamics.
It fails to distinguish between instructions to generate semantically vs. syntactically unacceptable outputs.
arXiv Detail & Related papers (2025-02-15T23:53:31Z)
- Syntax and Semantics Meet in the "Middle": Probing the Syntax-Semantics Interface of LMs Through Agentivity [68.8204255655161]
We present the semantic notion of agentivity as a case study for probing such interactions.
This suggests LMs may potentially serve as more useful tools for linguistic annotation, theory testing, and discovery.
arXiv Detail & Related papers (2023-05-29T16:24:01Z)
- Laziness Is a Virtue When It Comes to Compositionality in Neural Semantic Parsing [20.856601758389544]
We introduce a neural semantic parsing generation method that constructs logical forms from the bottom up, beginning from the logical form's leaves.
We show that our novel bottom-up semantic parsing technique outperforms general-purpose approaches while also being competitive with comparable neural methods.
arXiv Detail & Related papers (2023-05-07T17:53:08Z)
- How Do Transformers Learn Topic Structure: Towards a Mechanistic Understanding [56.222097640468306]
We provide a mechanistic understanding of how transformers learn "semantic structure".
We show, through a combination of mathematical analysis and experiments on Wikipedia data, that the embedding layer and the self-attention layer encode the topical structure.
arXiv Detail & Related papers (2023-03-07T21:42:17Z)
- Variational Cross-Graph Reasoning and Adaptive Structured Semantics Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z)
- Semantic-aware Contrastive Learning for More Accurate Semantic Parsing [32.74456368167872]
We propose a semantic-aware contrastive learning algorithm, which can learn to distinguish fine-grained meaning representations.
Experiments on two standard datasets show that our approach achieves significant improvements over MLE baselines.
arXiv Detail & Related papers (2023-01-19T07:04:32Z)
- Compositional Temporal Grounding with Structured Variational Cross-Graph Correspondence Learning [92.07643510310766]
Temporal grounding in videos aims to localize one target video segment that semantically corresponds to a given query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We empirically find that they fail to generalize to queries with novel combinations of seen words.
We propose a variational cross-graph reasoning framework that explicitly decomposes video and language into multiple structured hierarchies.
arXiv Detail & Related papers (2022-03-24T12:55:23Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
Key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
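The "homomorphism between a latent syntactic algebra and a semantic algebra" can be illustrated in miniature. This is a generic sketch of the algebraic idea only, with hypothetical operator names; it is not LeAR's actual neural architecture:

```python
# Sketch of a homomorphism h from syntax trees to a semantic algebra
# (here, integer arithmetic): h(op(a, b)) = SEMANTIC_OPS[op](h(a), h(b)).
# Operator names are illustrative assumptions.

SEMANTIC_OPS = {            # each syntactic operator maps to one semantic op
    "plus": lambda x, y: x + y,
    "times": lambda x, y: x * y,
}

def interpret(tree):
    """Structure-preserving evaluation: leaves denote themselves."""
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return SEMANTIC_OPS[op](interpret(left), interpret(right))

# The tree ("plus", 2, ("times", 3, 4)) denotes 2 + 3 * 4
value = interpret(("plus", 2, ("times", 3, 4)))
```

The key property is that meaning is computed compositionally, one syntactic operation at a time, which is what enables generalization to novel combinations.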
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Sequential composition of answer set programs [0.0]
This paper contributes to the mathematical foundations of logic programming by introducing and studying the sequential composition of answer set programs.
In a broader sense, this paper is a first step towards an algebra of answer set programs and in the future we plan to lift the methods of this paper to wider classes of programs.
arXiv Detail & Related papers (2021-04-25T13:27:22Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Recursive Rules with Aggregation: A Simple Unified Semantics [0.6662800021628273]
This paper describes a unified semantics for recursion with aggregation.
We present a formal definition of the semantics, prove important properties of the semantics, and compare with prior semantics.
We show that our semantics is simple and matches the desired results in all cases.
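What "recursion with aggregation" means can be illustrated with a classic instance: shortest-path distances defined as the least fixpoint of a rule that aggregates with `min`. This is our own illustration of the general idea, not the paper's formal semantics:

```python
# Fixpoint illustration (assumed example, not the paper's definitions):
# dist(y) is defined recursively with min-aggregation over all edges
# (x, y, w):  dist(y) = min(dist(x) + w).

import math

edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5)]
dist = {"a": 0, "b": math.inf, "c": math.inf}   # start above the fixpoint

changed = True
while changed:                      # iterate the rule until nothing changes
    changed = False
    for x, y, w in edges:
        if dist[x] + w < dist[y]:   # min-aggregation over derivations
            dist[y] = dist[x] + w
            changed = True
```

The subtlety such semantics must resolve is that naive recursion through an aggregate is not monotone in the usual sense; the fixpoint iteration above converges here because distances only decrease.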
arXiv Detail & Related papers (2020-07-26T04:42:44Z)
- Software Language Comprehension using a Program-Derived Semantics Graph [29.098303489400394]
We present the program-derived semantics graph (PSG), a new structure to capture the semantics of code.
The PSG is designed to provide a single structure for capturing program semantics at multiple levels of abstraction.
Although our exploration into the PSG is in its infancy, our early results and architectural analysis indicate it is a promising new research direction to automatically extract program semantics.
arXiv Detail & Related papers (2020-04-02T01:37:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.