piHyFlow Operational Semantics
- URL: http://arxiv.org/abs/2310.19818v1
- Date: Fri, 20 Oct 2023 17:37:39 GMT
- Title: piHyFlow Operational Semantics
- Authors: Fernando J. Barros
- Abstract summary: piHyFlow is a formalism for representing hybrid models using a set of communicating processes.
Processes are encapsulated into piHyFlow base models and communicate through shared memory.
piHyFlow can guarantee modularity by enforcing that models can only communicate by input and output interfaces.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simulation models have been described using different perspectives, or
worldviews. In the process interaction world view (PI), every entity is modeled
by a sequence of actions describing its life cycle, offering a comprehensive
model that groups the events involving each entity. In this paper we describe
piHyFlow, a formalism for representing hybrid models using a set of
communicating processes. This set is dynamic, enabling processes to be created
and destroyed at runtime. Processes are encapsulated into piHyFlow base models
and communicate through shared memory. piHyFlow, however, can guarantee
modularity by enforcing that models can only communicate by input and output
interfaces. piHyFlow extends current PI approaches by providing support for
HyFlow concepts of sampling and dense (continuous) outputs, in addition to the
more traditional event-based communication. In this paper we present piHyFlow
operational semantics using the concepts of simulator and component.
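The process-interaction world view described in the abstract, with life cycles as sequences of actions, dynamic process creation and destruction, and communication confined to input/output interfaces, can be illustrated with a toy event-based simulator. This is a minimal sketch under assumed names (`Component`, `spawn`, `emit`, `clock`), not piHyFlow's actual API, and it covers only event scheduling, not HyFlow-style sampling or dense outputs:

```python
import heapq

class Component:
    """Encapsulates processes; external access only via the output interface."""
    def __init__(self):
        self.now = 0.0
        self._queue = []          # (time, seq, process) pending-event list
        self._seq = 0
        self.output = []          # output interface: (time, value) records

    def spawn(self, process):
        # Dynamic process creation: schedule the process's first action.
        self._advance(process)

    def emit(self, value):
        # All observable behavior flows through this output interface.
        self.output.append((self.now, value))

    def _advance(self, process, send=None):
        try:
            # Resume the process; it yields the delay until its next action.
            delay = process.send(send) if send is not None else next(process)
            self._seq += 1
            heapq.heappush(self._queue, (self.now + delay, self._seq, process))
        except StopIteration:
            pass                  # process destroyed at runtime

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, proc = heapq.heappop(self._queue)
            self._advance(proc, send=self.now)

def clock(component, period):
    """Life cycle of one entity: emit a tick, then wait `period`."""
    while True:
        component.emit("tick")
        yield period              # resumes when the simulator reaches that time

comp = Component()
comp.spawn(clock(comp, 2.0))
comp.run(until=5.0)
print(comp.output)                # ticks at t = 0.0, 2.0, 4.0
```

Generators stand in here for piHyFlow processes: each `yield` suspends the entity's life cycle until the simulator advances time, which mirrors how the PI world view groups all events involving one entity into a single sequential description.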
Related papers
- Leveraging Machine Learning and Enhanced Parallelism Detection for BPMN Model Generation from Text [75.77648333476776]
This paper introduces an automated pipeline for extracting BPMN models from text. A key contribution of this work is the introduction of a newly annotated dataset. We augment the dataset with 15 newly annotated documents containing 32 parallel gateways for model training.
arXiv Detail & Related papers (2025-07-11T07:25:55Z) - JAM-Flow: Joint Audio-Motion Synthesis with Flow Matching [30.02208748898321]
JAM-Flow is a unified framework to simultaneously synthesize and condition on both facial motion and speech. It supports a wide array of conditioning inputs (including text, reference audio, and reference motion), facilitating a range of tasks.
arXiv Detail & Related papers (2025-06-30T06:51:40Z) - EventFlow: Forecasting Continuous-Time Event Data with Flow Matching [12.976042923229466]
We propose EventFlow, a non-autoregressive generative model for temporal point processes.
Our model builds on the flow matching framework to directly learn joint distributions over event times, sidestepping the autoregressive process.
arXiv Detail & Related papers (2024-10-09T20:57:00Z) - CaLMFlow: Volterra Flow Matching using Causal Language Models [14.035963716966787]
CaLMFlow is a framework that casts flow matching as a Volterra integral equation (VIE).
Our method implements tokenization across space and time, thereby solving a VIE over these domains.
We demonstrate CaLMFlow's effectiveness on synthetic and real-world data, including single-cell perturbation response prediction.
arXiv Detail & Related papers (2024-10-03T05:07:41Z) - ActionFlow: Equivariant, Accurate, and Efficient Policies with Spatially Symmetric Flow Matching [20.20511152176522]
ActionFlow is a policy class that integrates spatial symmetry inductive biases.
On the representation level, ActionFlow introduces an SE(3) Invariant Transformer architecture.
For action generation, ActionFlow leverages Flow Matching, a state-of-the-art deep generative model.
arXiv Detail & Related papers (2024-09-06T19:30:36Z) - PeRFlow: Piecewise Rectified Flow as Universal Plug-and-Play Accelerator [73.80050807279461]
Piecewise Rectified Flow (PeRFlow) is a flow-based method for accelerating diffusion models.
PeRFlow achieves superior performance in few-step generation.
arXiv Detail & Related papers (2024-05-13T07:10:53Z) - A Partial Replication of MaskFormer in TensorFlow on TPUs for the TensorFlow Model Garden [3.259700715934023]
This paper undertakes the task of replicating the MaskFormer model, originally developed using the PyTorch framework, within the COCO ecosystem.
We address key challenges encountered during the replication, non-convergence issues, slow training, adaptation of loss functions, and the integration of TPU-specific functionalities.
arXiv Detail & Related papers (2024-04-29T15:40:40Z) - Generative Flow Networks for Discrete Probabilistic Modeling [118.81967600750428]
We present energy-based generative flow networks (EB-GFN), a novel probabilistic modeling algorithm for high-dimensional discrete data.
We show how GFlowNets can approximately perform large-block Gibbs sampling to mix between modes.
arXiv Detail & Related papers (2022-02-03T01:27:11Z) - OneFlow: Redesign the Distributed Deep Learning Framework from Scratch [17.798586916628174]
OneFlow is a novel distributed training framework based on an SBP (split, broadcast and partial-value) abstraction and the actor model.
SBP enables much easier programming of data parallelism and model parallelism than existing frameworks.
OneFlow outperforms many well-known customized libraries built on top of the state-of-the-art frameworks.
arXiv Detail & Related papers (2021-10-28T11:32:14Z) - TeraPipe: Token-Level Pipeline Parallelism for Training Large-Scale
Language Models [60.23234205219347]
TeraPipe is a high-performance token-level pipeline parallel algorithm for synchronous model-parallel training of Transformer-based language models.
We show that TeraPipe can speed up the training by 5.0x for the largest GPT-3 model with 175 billion parameters on an AWS cluster.
arXiv Detail & Related papers (2021-02-16T07:34:32Z) - Task-Oriented Dialogue as Dataflow Synthesis [158.77123205487334]
We describe an approach to task-oriented dialogue in which dialogue state is represented as a dataflow graph.
A dialogue agent maps each user utterance to a program that extends this graph.
We introduce a new dataset, SMCalFlow, featuring complex dialogues about events, weather, places, and people.
arXiv Detail & Related papers (2020-09-24T00:35:26Z) - Normalizing Flows with Multi-Scale Autoregressive Priors [131.895570212956]
We introduce channel-wise dependencies in the latent space of normalizing flows through multi-scale autoregressive priors (mAR).
Our mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data.
We show that mAR-SCF allows for improved image generation quality, with gains in FID and Inception scores compared to state-of-the-art flow-based models.
arXiv Detail & Related papers (2020-04-08T09:07:11Z)
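Several entries above (EventFlow, CaLMFlow, ActionFlow, PeRFlow) build on flow matching. A minimal 1-D sketch of the generic conditional flow-matching idea, not the specific method of any of these papers: a velocity model is trained by regression onto the velocity `x1 - x0` of the straight path `x_t = (1 - t) * x0 + t * x1`, and samples are drawn by integrating the learned field from noise. For brevity, the integration step below uses the closed-form optimal field for a point-mass target at 3.0 instead of a trained network:

```python
import random

random.seed(0)

def training_example(x0, x1):
    """One regression pair ((x_t, t), target velocity) for the FM objective."""
    t = random.random()
    x_t = (1 - t) * x0 + t * x1        # point on the interpolation path
    target_v = x1 - x0                 # velocity of the straight path
    return (x_t, t), target_v

def sample(x0, steps=100):
    """Euler-integrate the optimal field v(x, t) = (3 - x) / (1 - t)
    for a point-mass target at 3.0; transports any x0 onto the target."""
    x, dt = x0, 1.0 / steps
    for k in range(steps):
        t = k * dt
        x += dt * (3.0 - x) / (1.0 - t)
    return x

x0 = random.gauss(0.0, 1.0)            # noise sample
print(training_example(x0, 3.0))       # a regression example for the objective
print(sample(x0))                      # ~3.0: the flow carries noise to the target
```

In the papers above, `sample` would instead integrate a trained network fitted to many `training_example` pairs; the straight-path construction is what makes the regression target simple and simulation-free.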
This list is automatically generated from the titles and abstracts of the papers on this site.