Auto-ML Graph Neural Network Hypermodels for Outcome Prediction in Event-Sequence Data
- URL: http://arxiv.org/abs/2511.18835v1
- Date: Mon, 24 Nov 2025 07:13:34 GMT
- Title: Auto-ML Graph Neural Network Hypermodels for Outcome Prediction in Event-Sequence Data
- Authors: Fang Wang, Lance Kosca, Adrienne Kosca, Marko Gacesa, Ernesto Damiani,
- Abstract summary: HGNN(O) is an AutoML GNN hypermodel framework for outcome prediction on event-sequence data. We show that HGNN(O) achieves accuracy exceeding 0.98 on the Traffic Fines dataset and weighted F1 scores up to 0.86 on the Patients dataset.
- Score: 2.879694041689511
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces HGNN(O), an AutoML GNN hypermodel framework for outcome prediction on event-sequence data. Building on our earlier work on graph convolutional network hypermodels, HGNN(O) extends four architectures (One Level, Two Level, Two Level Pseudo Embedding, and Two Level Embedding) across six canonical GNN operators. A self-tuning mechanism based on Bayesian optimization with pruning and early stopping enables efficient adaptation over architectures and hyperparameters without manual configuration. Empirical evaluation on both balanced and imbalanced event logs shows that HGNN(O) achieves accuracy exceeding 0.98 on the Traffic Fines dataset and weighted F1 scores up to 0.86 on the Patients dataset without explicit imbalance handling. These results demonstrate that the proposed AutoML-GNN approach provides a robust and generalizable benchmark for outcome prediction in complex event-sequence data.
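The self-tuning mechanism described in the abstract (search over architectures and hyperparameters, with pruning of unpromising trials and early stopping during training) can be sketched generically. Everything below is an illustrative stand-in, not the authors' implementation: `toy_train_epoch`, the search-space values, and the median-pruning margin are hypothetical; only the architecture and operator names come from the abstract.

```python
import random

# Hypothetical sketch of an HGNN(O)-style self-tuning loop: sample an
# architecture, operator, and hyperparameters; train with early stopping;
# prune trials whose score falls well below the median of finished trials.

ARCHITECTURES = ["one_level", "two_level", "two_level_pseudo_emb", "two_level_emb"]
OPERATORS = ["gcn", "gat", "sage", "gin", "graph_conv", "cheb"]

def toy_train_epoch(lr, hidden, epoch):
    """Stand-in for one training epoch; returns a validation score (<= 1)."""
    peak = 1.0 - abs(lr - 0.01) * 20 - abs(hidden - 64) / 256
    return min(peak, peak * (epoch + 1) / 10)

def run_trial(finished_scores, n_epochs=10, patience=3):
    cfg = {
        "arch": random.choice(ARCHITECTURES),
        "op": random.choice(OPERATORS),
        "lr": 10 ** random.uniform(-4, -1),
        "hidden": random.choice([32, 64, 128]),
    }
    best, stale = float("-inf"), 0
    for epoch in range(n_epochs):
        score = toy_train_epoch(cfg["lr"], cfg["hidden"], epoch)
        if finished_scores:  # median pruning against completed trials
            median = sorted(finished_scores)[len(finished_scores) // 2]
            if score < median - 0.5:
                return cfg, None  # trial pruned
        if score > best:
            best, stale = score, 0
        else:  # early stopping once validation stops improving
            stale += 1
            if stale >= patience:
                break
    return cfg, best

random.seed(0)
finished, best_cfg, best_score = [], None, float("-inf")
for _ in range(20):
    cfg, score = run_trial(finished)
    if score is not None:  # keep only trials that survived pruning
        finished.append(score)
        if score > best_score:
            best_cfg, best_score = cfg, score
print(best_cfg, round(best_score, 3))
```

In practice this role is usually filled by a Bayesian-optimization library rather than random sampling; the point of the sketch is the control flow of tuning, pruning, and early stopping, which needs no manual configuration from the user.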
Related papers
- GNN-Suite: a Graph Neural Network Benchmarking Framework for Biomedical Informatics [0.0]
We present GNN-Suite, a framework for constructing and benchmarking Graph Neural Network (GNN) architectures in computational biology. We demonstrate its utility in identifying cancer-driver genes by constructing molecular networks from protein-protein interaction (PPI) data. Our results show that a common framework for implementing and evaluating GNN architectures aids in identifying not only the best model but also the most effective means of incorporating complementary data.
arXiv Detail & Related papers (2025-05-15T21:14:30Z)
- Towards Generalizable Trajectory Prediction Using Dual-Level Representation Learning And Adaptive Prompting [107.4034346788744]
Existing vehicle trajectory prediction models struggle with generalizability, prediction uncertainties, and handling complex interactions. We propose Perceiver with Register queries (PerReg+), a novel trajectory prediction framework that introduces: (1) Dual-Level Representation Learning via Self-Distillation (SD) and Masked Reconstruction (MR), capturing global context and fine-grained details; (2) Enhanced Multimodality using register-based queries and pretraining, eliminating the need for clustering and suppression; and (3) Adaptive Prompt Tuning during fine-tuning, freezing the main architecture and optimizing a small number of prompts for efficient adaptation.
arXiv Detail & Related papers (2025-01-08T20:11:09Z)
- Jacobian-Enforced Neural Networks (JENN) for Improved Data Assimilation Consistency in Dynamical Models [0.0]
Machine learning-based weather models have shown great promise in producing accurate forecasts but have struggled when applied to data assimilation tasks. This study introduces the Jacobian-Enforced Neural Network (JENN) framework, designed to enhance DA consistency in neural network (NN)-emulated dynamical systems.
arXiv Detail & Related papers (2024-12-02T00:12:51Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Global Minima, Recoverability Thresholds, and Higher-Order Structure in GNNs [0.0]
We analyze the performance of graph neural network (GNN) architectures from the perspective of random graph theory.
We show how both specific higher-order structures in synthetic data and the mix of empirical structures in real data have dramatic effects on GNN performance.
arXiv Detail & Related papers (2023-10-11T17:16:33Z)
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
- GNN-SL: Sequence Labeling Based on Nearest Examples via GNN [50.55076156520809]
We introduce graph neural networks sequence labeling (GNN-SL).
GNN-SL augments vanilla sequence labeling model output with similar tagging examples retrieved from the whole training set.
We conduct a variety of experiments on three typical sequence labeling tasks.
GNN-SL achieves results of 96.9 (+0.2) on PKU, 98.3 (+0.4) on CITYU, 98.5 (+0.2) on MSR, and 96.9 (+0.2) on AS for the CWS task.
arXiv Detail & Related papers (2022-12-05T04:22:00Z)
- Batch-Ensemble Stochastic Neural Networks for Out-of-Distribution Detection [55.028065567756066]
Out-of-distribution (OOD) detection has recently received much attention from the machine learning community due to its importance in deploying machine learning models in real-world applications.
In this paper we propose an uncertainty quantification approach by modelling the distribution of features.
We incorporate an efficient ensemble mechanism, namely batch-ensemble, to construct the batch-ensemble neural networks (BE-SNNs) and overcome the feature collapse problem.
We show that BE-SNNs yield superior performance on several OOD benchmarks, such as the Two-Moons dataset, the FashionMNIST vs MNIST dataset, FashionM
arXiv Detail & Related papers (2022-06-26T16:00:22Z)
- AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020 [29.511523832243046]
We present AutoHEnsGNN, a framework to build effective and robust models for graph tasks without any human intervention.
AutoHEnsGNN won first place in the AutoGraph Challenge for KDD Cup 2020.
arXiv Detail & Related papers (2021-11-25T07:23:44Z)
- ASFGNN: Automated Separated-Federated Graph Neural Network [17.817867271722093]
We propose an automated Separated-Federated Graph Neural Network (ASFGNN) learning paradigm.
We conduct experiments on benchmark datasets and the results demonstrate that ASFGNN significantly outperforms the naive federated GNN.
arXiv Detail & Related papers (2020-11-06T09:21:34Z)
- Bayesian Graph Neural Networks with Adaptive Connection Sampling [62.51689735630133]
We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs).
The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs.
arXiv Detail & Related papers (2020-06-07T07:06:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.