ProcessTransformer: Predictive Business Process Monitoring with
Transformer Network
- URL: http://arxiv.org/abs/2104.00721v1
- Date: Thu, 1 Apr 2021 18:58:46 GMT
- Title: ProcessTransformer: Predictive Business Process Monitoring with
Transformer Network
- Authors: Zaharah A. Bukhsh, Aaqib Saeed, Remco M. Dijkman
- Abstract summary: We propose ProcessTransformer, an approach for learning high-level representations from event logs with an attention-based network.
Our model incorporates long-range memory and relies on a self-attention mechanism to establish dependencies between a multitude of event sequences and corresponding outputs.
- Score: 0.06445605125467573
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Predictive business process monitoring focuses on predicting future
characteristics of a running process using event logs. The foresight into
process execution promises great potential for efficient operations, better
resource management, and effective customer services. Deep learning-based
approaches have been widely adopted in process mining to address the
limitations of classical algorithms for solving multiple problems, especially
the next event and remaining-time prediction tasks. Nevertheless, designing a
deep neural architecture that performs competitively across various tasks is
challenging as existing methods fail to capture long-range dependencies in the
input sequences and perform poorly for lengthy process traces. In this paper,
we propose ProcessTransformer, an approach for learning high-level
representations from event logs with an attention-based network. Our model
incorporates long-range memory and relies on a self-attention mechanism to
establish dependencies between a multitude of event sequences and corresponding
outputs. We evaluate the applicability of our technique on nine real event
logs. We demonstrate that the transformer-based model outperforms several
baselines of prior techniques by obtaining on average above 80% accuracy for
the task of predicting the next activity. Our method also performs
competitively, compared to baselines, on the tasks of predicting event time
and the remaining time of a running case.
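The core mechanism the abstract describes, self-attention relating every event in a trace to every other, can be illustrated with a minimal single-head sketch. This is an illustration of scaled dot-product attention in general, not the paper's actual architecture; the random projections stand in for learned weights, and all sizes are toy values.

```python
# Minimal single-head scaled dot-product self-attention over a toy
# encoded event trace. Random projections stand in for learned weights.
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """x has shape (seq_len, d_model); returns context-mixed vectors."""
    d = x.shape[-1]
    rng = np.random.default_rng(0)
    w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(d)                   # pairwise event affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ v                              # every event attends to all others

# A trace of 5 events, each embedded into 8 dimensions.
trace = np.random.default_rng(1).standard_normal((5, 8))
out = self_attention(trace)
print(out.shape)  # (5, 8)
```

Because the attention weights span the whole sequence, the dependency between the first and last event of a long trace is one step away, which is what lets such models handle the long-range dependencies the abstract highlights.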
Related papers
- Switchable Decision: Dynamic Neural Generation Networks [98.61113699324429]
We propose a switchable decision to accelerate inference by dynamically assigning resources for each data instance.
Our method benefits from less cost during inference while keeping the same accuracy.
arXiv Detail & Related papers (2024-05-07T17:44:54Z) - PGTNet: A Process Graph Transformer Network for Remaining Time Prediction of Business Process Instances [7.724546575875487]
We present PGTNet, an approach that transforms event logs into graph datasets.
We leverage graph-oriented data for training Process Graph Transformer Networks to predict the remaining time of business process instances.
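A first step toward the graph datasets PGTNet builds is extracting directly-follows relations from traces. The sketch below is a simplified stand-in for the paper's richer graph encoding, using only activity-succession counts.

```python
# Sketch: derive a weighted directly-follows graph from an event log.
# A simplified stand-in for PGTNet's actual graph construction.
from collections import Counter

def directly_follows_graph(traces):
    """traces: list of activity-name sequences -> Counter of weighted edges."""
    edges = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):   # consecutive activity pairs
            edges[(a, b)] += 1
    return edges

log = [["register", "check", "decide", "pay"],
       ["register", "check", "check", "decide"]]
graph = directly_follows_graph(log)
print(graph[("register", "check")])  # 2
```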
arXiv Detail & Related papers (2024-04-09T12:45:17Z) - Learning Logic Specifications for Policy Guidance in POMDPs: an
Inductive Logic Programming Approach [57.788675205519986]
We learn high-quality traces from POMDP executions generated by any solver.
We exploit data- and time-efficient Inductive Logic Programming (ILP) to generate interpretable belief-based policy specifications.
We show that learned specifications expressed in Answer Set Programming (ASP) yield performance superior to neural networks and comparable to optimal handcrafted task-specific policies, at lower computational cost.
arXiv Detail & Related papers (2024-02-29T15:36:01Z) - Detecting Anomalous Events in Object-centric Business Processes via
Graph Neural Networks [55.583478485027]
This study proposes a novel framework for anomaly detection in business processes.
We first reconstruct the process dependencies of the object-centric event logs as attributed graphs.
We then employ a graph convolutional autoencoder architecture to detect anomalous events.
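The principle behind autoencoder-based detection is to reconstruct inputs and flag high reconstruction error. As a hedged sketch, a linear PCA autoencoder stands in here for the paper's graph convolutional architecture; the feature matrix and the basis dimension `k` are toy assumptions.

```python
# Sketch: anomaly scoring via reconstruction error. A linear PCA
# "autoencoder" stands in for the paper's graph convolutional model.
import numpy as np

def fit_basis(train: np.ndarray, k: int = 2):
    """Learn a k-dimensional linear code from normal data."""
    mu = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mu, full_matrices=False)
    return mu, vt[:k]

def reconstruction_scores(x: np.ndarray, mu, basis) -> np.ndarray:
    centered = x - mu
    recon = centered @ basis.T @ basis   # encode, then decode
    return np.linalg.norm(centered - recon, axis=1)

rng = np.random.default_rng(0)
normal = rng.standard_normal((50, 4))    # stand-in event features
outlier = np.full((1, 4), 50.0)          # clearly anomalous event
mu, basis = fit_basis(normal)
scores = reconstruction_scores(np.vstack([normal, outlier]), mu, basis)
```

Events whose score exceeds a threshold calibrated on normal data would be reported as anomalous; the injected outlier scores far above the typical point.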
arXiv Detail & Related papers (2024-02-14T14:17:56Z) - Discovering Hierarchical Process Models: an Approach Based on Events
Clustering [0.0]
We present an algorithm for discovering hierarchical process models represented as two-level workflow nets.
Unlike existing solutions, our algorithm does not impose restrictions on the process control flow and allows for iteration.
arXiv Detail & Related papers (2023-03-12T11:05:40Z) - Performance-Preserving Event Log Sampling for Predictive Monitoring [0.3425341633647624]
We propose an instance selection procedure that allows sampling training process instances for prediction models.
We show that our instance selection procedure allows for a significant increase of training speed for next activity and remaining time prediction methods.
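The idea of training on a sampled subset of process instances can be sketched with plain random sampling over traces. The paper's actual selection procedure is more deliberate; this hypothetical `sample_instances` helper only illustrates the interface of such a step.

```python
# Sketch: sample a fraction of process instances (traces) for training.
# A plain random sample stands in for the paper's selection procedure.
import random

def sample_instances(traces, fraction=0.5, seed=42):
    rng = random.Random(seed)                 # fixed seed for reproducibility
    k = max(1, int(len(traces) * fraction))
    return rng.sample(traces, k)

log = [["a", "b", "c"], ["a", "c"], ["a", "b", "b", "c"], ["a", "d"]]
train = sample_instances(log, fraction=0.5)
print(len(train))  # 2
```

Training a next-activity or remaining-time model on `train` instead of `log` trades a little data for a proportional cut in training time, which is the effect the paper measures.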
arXiv Detail & Related papers (2023-01-18T16:07:56Z) - Avoiding Post-Processing with Event-Based Detection in Biomedical
Signals [69.34035527763916]
We propose an event-based modeling framework that directly works with events as learning targets.
We show that event-based modeling (without post-processing) performs on par with or better than epoch-based modeling with extensive post-processing.
arXiv Detail & Related papers (2022-09-22T13:44:13Z) - What Averages Do Not Tell -- Predicting Real Life Processes with
Sequential Deep Learning [0.1376408511310322]
Process Mining concerns discovering insights on business processes from their execution data that are logged by systems.
Many Deep Learning techniques have been successfully adapted for predictive Process Mining that aims to predict process outcomes.
Traces in Process Mining are multimodal sequences and very differently structured than natural language sentences or images.
arXiv Detail & Related papers (2021-10-19T19:45:05Z) - Predictive Process Model Monitoring using Recurrent Neural Networks [2.4029798593292706]
This paper introduces Processes-As-Movies (PAM), a technique that provides a middle ground between process model discovery and predictive monitoring.
It does so by capturing declarative process constraints between activities in various windows of a process execution trace.
Various recurrent neural network topologies tailored to high-dimensional input are used to model the process model evolution with windows as time steps.
arXiv Detail & Related papers (2020-11-05T13:57:33Z) - Train No Evil: Selective Masking for Task-Guided Pre-Training [97.03615486457065]
We propose a three-stage framework by adding a task-guided pre-training stage with selective masking between general pre-training and fine-tuning.
We show that our method can achieve comparable or even better performance with less than 50% of cost.
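Selective masking, as opposed to masking positions uniformly at random, can be shown in a few lines. The importance set below is a toy stand-in for the paper's learned selection criterion; token names and the `[MASK]` placeholder are illustrative assumptions.

```python
# Sketch: selective masking -- mask only tokens deemed task-relevant
# instead of masking uniformly at random. The `important` set is a toy
# stand-in for a learned importance criterion.
def selectively_mask(tokens, important, mask_token="[MASK]"):
    return [mask_token if t in important else t for t in tokens]

tokens = ["the", "loan", "was", "approved", "quickly"]
masked = selectively_mask(tokens, important={"loan", "approved"})
print(masked)  # ['the', '[MASK]', 'was', '[MASK]', 'quickly']
```

Pre-training on such selectively masked sequences focuses the model's capacity on recovering task-relevant tokens, which is how the framework cuts pre-training cost.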
arXiv Detail & Related papers (2020-04-21T03:14:22Z) - Subset Sampling For Progressive Neural Network Learning [106.12874293597754]
Progressive Neural Network Learning is a class of algorithms that incrementally construct the network's topology and optimize its parameters based on the training data.
We propose to speed up this process by exploiting subsets of training data at each incremental training step.
Experimental results in object, scene and face recognition problems demonstrate that the proposed approach speeds up the optimization procedure considerably.
arXiv Detail & Related papers (2020-02-17T18:57:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.