Knowledge-Driven Modulation of Neural Networks with Attention Mechanism
for Next Activity Prediction
- URL: http://arxiv.org/abs/2312.08847v1
- Date: Thu, 14 Dec 2023 12:02:35 GMT
- Title: Knowledge-Driven Modulation of Neural Networks with Attention Mechanism
for Next Activity Prediction
- Authors: Ivan Donadello, Jonghyeon Ko, Fabrizio Maria Maggi, Jan Mendling,
Francesco Riva and Matthias Weidlich
- Abstract summary: We present a Symbolic[Neuro] system that leverages background knowledge expressed in terms of a procedural process model to offset the under-sampling in the training data.
More specifically, we make predictions using NNs with an attention mechanism, an emerging technology in the NN field.
The system has been tested on several real-life logs showing an improvement in the performance of the prediction task.
- Score: 8.552757384215813
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Predictive Process Monitoring (PPM) aims at leveraging historic process
execution data to predict how ongoing executions will continue up to their
completion. In recent years, PPM techniques for the prediction of the next
activities have matured significantly, mainly thanks to the use of Neural
Networks (NNs) as a predictor. While their performance is difficult to beat in
the general case, there are specific situations where background process
knowledge can be helpful. Such knowledge can be leveraged for improving the
quality of predictions for exceptional process executions or when the process
changes due to a concept drift. In this paper, we present a Symbolic[Neuro]
system that leverages background knowledge expressed in terms of a procedural
process model to offset the under-sampling in the training data. More
specifically, we make predictions using NNs with an attention mechanism, an
emerging technology in the NN field. The system has been tested on several
real-life logs showing an improvement in the performance of the prediction
task.
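The attention-based prediction step described in the abstract can be sketched in a few lines. The following is a minimal, illustrative NumPy example (not the authors' implementation): scaled dot-product attention over the embeddings of an activity prefix, followed by a softmax over the activity vocabulary to score the next activity. The vocabulary, embeddings, and dimensions are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["register", "check", "approve", "notify", "close"]  # toy activity set
D = 8  # embedding dimension (illustrative)

E = rng.normal(size=(len(VOCAB), D))      # activity embeddings
W_out = rng.normal(size=(D, len(VOCAB)))  # output projection

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def next_activity_scores(prefix_ids):
    """Score each activity as the continuation of a running trace prefix."""
    X = E[prefix_ids]                       # (T, D) embedded prefix
    q = X[-1]                               # query: the most recent event
    attn = softmax(X @ q / np.sqrt(D))      # attention weights over the prefix
    context = attn @ X                      # (D,) attention-weighted context
    return softmax(context @ W_out)         # distribution over the next activity

probs = next_activity_scores([0, 1, 2])     # prefix: register -> check -> approve
print(VOCAB[int(np.argmax(probs))])
```

In a trained model the embeddings and projection would be learned from the event log; here they are random, so only the mechanics (not the prediction) are meaningful.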
Related papers
- Learning-Augmented Algorithms with Explicit Predictors [67.02156211760415]
Recent advances in algorithmic design show how to utilize predictions obtained by machine learning models from past and present data.
Prior research in this context was focused on a paradigm where the predictor is pre-trained on past data and then used as a black box.
In this work, we unpack the predictor and integrate the learning problem it gives rise to within the algorithmic challenge.
arXiv Detail & Related papers (2024-03-12T08:40:21Z)
- Deep Neural Networks Tend To Extrapolate Predictably [51.303814412294514]
Neural network predictions tend to be unpredictable and overconfident when faced with out-of-distribution (OOD) inputs.
We observe that neural network predictions often tend towards a constant value as input data becomes increasingly OOD.
We show how one can leverage our insights in practice to enable risk-sensitive decision-making in the presence of OOD inputs.
arXiv Detail & Related papers (2023-10-02T03:25:32Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- MARS: Meta-Learning as Score Matching in the Function Space [79.73213540203389]
We present a novel approach to extracting inductive biases from a set of related datasets.
We use functional Bayesian neural network inference, which views the prior as a process and performs inference in the function space.
Our approach can seamlessly acquire and represent complex prior knowledge by meta-learning the score function of the data-generating process.
arXiv Detail & Related papers (2022-10-24T15:14:26Z)
- Preference Enhanced Social Influence Modeling for Network-Aware Cascade Prediction [59.221668173521884]
We propose a novel framework to promote cascade size prediction by enhancing the user preference modeling.
Our end-to-end method makes the user-activation process of information diffusion more adaptive and accurate.
arXiv Detail & Related papers (2022-04-18T09:25:06Z)
- Multi-head Temporal Attention-Augmented Bilinear Network for Financial time series prediction [77.57991021445959]
We propose a neural layer based on the ideas of temporal attention and multi-head attention to extend the capability of the underlying neural network.
The effectiveness of our approach is validated using large-scale limit-order book market data.
arXiv Detail & Related papers (2022-01-14T14:02:19Z)
- Embedding Graph Convolutional Networks in Recurrent Neural Networks for Predictive Monitoring [0.0]
This paper proposes an approach based on graph convolutional networks and recurrent neural networks.
An experimental evaluation on real-life event logs shows that our approach is more consistent and outperforms the current state-of-the-art approaches.
arXiv Detail & Related papers (2021-12-17T17:30:30Z)
- A systematic literature review on state-of-the-art deep learning methods for process prediction [0.0]
In recent years, multiple process prediction approaches have been proposed, applying different data processing schemes and prediction algorithms.
This study focuses on deep learning algorithms since they seem to outperform their machine learning alternatives consistently.
The set of log-data, evaluation metrics and baselines used by the authors diverge, making the results hard to compare.
arXiv Detail & Related papers (2021-01-22T20:26:40Z)
- Predictive Process Model Monitoring using Recurrent Neural Networks [2.4029798593292706]
This paper introduces Processes-As-Movies (PAM), a technique that provides a middle ground between predictive monitoring and process model forecasting.
It does so by capturing declarative process constraints between activities in various windows of a process execution trace.
Various recurrent neural network topologies tailored to high-dimensional input are used to model the process model evolution with windows as time steps.
arXiv Detail & Related papers (2020-11-05T13:57:33Z)
- XNAP: Making LSTM-based Next Activity Predictions Explainable by Using LRP [0.415623340386296]
Predictive business process monitoring (PBPM) is a class of techniques designed to predict behaviour, such as next activities, in running traces.
With the use of deep neural networks (DNNs), the predictive quality of such techniques could be improved for tasks like next activity prediction.
In this paper, we propose XNAP, the first explainable, DNN-based PBPM technique for the next activity prediction.
arXiv Detail & Related papers (2020-08-18T15:40:07Z)
- Predictive Business Process Monitoring via Generative Adversarial Nets: The Case of Next Event Prediction [0.026249027950824504]
This paper proposes a novel adversarial training framework to address the problem of next event prediction.
It works by pitting one neural network against another in a two-player game, which leads to predictions that are indistinguishable from the ground truth.
It systematically outperforms all baselines both in terms of accuracy and earliness of the prediction, despite using a simple network architecture and a naive feature encoding.
arXiv Detail & Related papers (2020-03-25T08:31:28Z)
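The two-player game described in the last entry above can be illustrated with a single loss computation. This is a toy NumPy sketch of the GAN-style objective only (not the paper's architecture): a generator proposes a next-event distribution from an encoded trace prefix, and a discriminator scores whether a (prefix, next event) pair looks real. All weights, shapes, and the linear models are assumptions; in practice both players are neural networks trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)
D, V = 6, 4                                   # prefix feature dim, event vocabulary size (toy)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

G = rng.normal(size=(D, V))                   # generator weights (untrained)
Dw = rng.normal(size=(D + V,))                # discriminator weights (untrained)

prefix = rng.normal(size=D)                   # encoded running-trace prefix
real_next = np.eye(V)[2]                      # ground-truth next event (one-hot)

fake_next = softmax(prefix @ G)               # generator's proposed next-event distribution

# Discriminator scores real and generated (prefix, next-event) pairs.
p_real = sigmoid(Dw @ np.concatenate([prefix, real_next]))
p_fake = sigmoid(Dw @ np.concatenate([prefix, fake_next]))

d_loss = -(np.log(p_real) + np.log(1 - p_fake))  # discriminator: tell real from fake
g_loss = -np.log(p_fake)                          # generator: fool the discriminator
print(float(d_loss), float(g_loss))
```

Training alternates gradient steps that decrease `d_loss` and `g_loss` respectively; at equilibrium the generator's proposals become indistinguishable from the ground-truth continuations.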
This list is automatically generated from the titles and abstracts of the papers in this site.