IntLevPy: A Python library to classify and model intermittent and Lévy processes
- URL: http://arxiv.org/abs/2506.03729v1
- Date: Wed, 04 Jun 2025 09:03:58 GMT
- Title: IntLevPy: A Python library to classify and model intermittent and Lévy processes
- Authors: Shailendra Bhandari, Pedro Lencastre, Sergiy Denysov, Yurii Bystryk, Pedro G. Lind,
- Abstract summary: IntLevPy is a Python library designed for simulating and analyzing intermittent and Lévy processes. The package includes functionalities for process simulation, including full parameter estimation and fitting optimization. This paper provides an in-depth user guide covering the IntLevPy software architecture, installation, validation, and usage examples.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper provides a comprehensive description of the IntLevPy package, a Python library designed for simulating and analyzing intermittent and Lévy processes. The package includes functionalities for process simulation, including full parameter estimation and fitting optimization for both families of processes, moment calculation, and classification methods. The classification methodology utilizes the adjusted $R^2$ and a novel performance measure $\Gamma$, enabling the distinction between intermittent and Lévy processes. IntLevPy integrates iterative parameter optimization with simulation-based validation. This paper provides an in-depth user guide covering the IntLevPy software architecture, installation, validation workflows, and usage examples. In this way, IntLevPy facilitates systematic exploration of these two broad classes of stochastic processes, bridging theoretical models and practical applications.
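IntLevPy's actual API is not reproduced here; as a hedged, stdlib-only sketch of the distinction the library is built around, the toy functions below (illustrative names, not IntLevPy functions) contrast a heavy-tailed Lévy-flight increment stream with a two-state intermittent one. The rare, enormous excursions of the Lévy sample are exactly what moment-based classification picks up.

```python
import random

def levy_steps(n, alpha=1.5, seed=0):
    """Heavy-tailed (Pareto-type) step lengths with tail exponent
    alpha, assigned random signs -- a toy Levy-flight increment stream."""
    rng = random.Random(seed)
    steps = []
    for _ in range(n):
        u = 1.0 - rng.random()                    # u in (0, 1]
        length = u ** (-1.0 / alpha)              # inverse-CDF Pareto draw
        sign = 1.0 if rng.random() < 0.5 else -1.0
        steps.append(sign * length)
    return steps

def intermittent_steps(n, p_switch=0.05, slow=0.1, fast=3.0, seed=0):
    """Toy two-state intermittent process: Gaussian increments whose
    scale flips between a quiet and a bursty regime."""
    rng = random.Random(seed)
    scale = slow
    steps = []
    for _ in range(n):
        if rng.random() < p_switch:               # occasional regime switch
            scale = fast if scale == slow else slow
        steps.append(rng.gauss(0.0, scale))
    return steps

levy = levy_steps(10_000)
inter = intermittent_steps(10_000)

# The defining signature of the Levy sample: single steps far larger
# than anything the bounded-variance intermittent process produces.
print(max(abs(s) for s in levy) > max(abs(s) for s in inter))  # True
```

The two-state switching process stands in for intermittency only schematically; real intermittent dynamics (and IntLevPy's estimators) are richer, but the moment-inflating tails are the feature the classification exploits.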
Related papers
- Information-theoretic Bayesian Optimization: Survey and Tutorial [2.3931689873603603]
This paper surveys information-theoretic acquisition functions, whose performance typically outperforms that of other acquisition functions. We also cover how information-theoretic acquisition functions can be adapted to complex optimization scenarios such as multi-objective, constrained, non-myopic, multi-fidelity, parallel, and asynchronous settings.
arXiv Detail & Related papers (2025-01-22T10:54:15Z)
- Efficient Learning of POMDPs with Known Observation Model in Average-Reward Setting [56.92178753201331]
We propose the Observation-Aware Spectral (OAS) estimation technique, which enables the POMDP parameters to be learned from samples collected using a belief-based policy.
We show the consistency of the OAS procedure, and we prove a regret guarantee of order $\mathcal{O}(\sqrt{T \log(T)})$ for the proposed OAS-UCRL algorithm.
arXiv Detail & Related papers (2024-10-02T08:46:34Z) - Process Mining Embeddings: Learning Vector Representations for Petri Nets [0.09999629695552192]
We introduce PetriNet2Vec, an unsupervised methodology inspired by Doc2Vec.
This approach converts Petri nets into embedding vectors, facilitating the comparison, clustering, and classification of process models.
The results demonstrate the potential of PetriNet2Vec to significantly enhance process mining capabilities.
arXiv Detail & Related papers (2024-04-26T03:07:32Z) - Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model built on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - Sparse Variational Student-t Processes [8.46450148172407]
Student-t Processes are used to model heavy-tailed distributions and datasets with outliers.
We propose a sparse representation framework to allow Student-t Processes to be more flexible for real-world datasets.
We evaluate two proposed approaches on various synthetic and real-world datasets from UCI and Kaggle.
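As a quick illustration of why heavy-tailed models matter here (plain NumPy, not the paper's sparse variational construction): at the same sample size, Student-t draws with few degrees of freedom produce far more extreme outliers than Gaussian draws.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

gauss = rng.standard_normal(n)        # light-tailed baseline
heavy = rng.standard_t(df=3, size=n)  # Student-t, 3 degrees of freedom

# Heavy tails show up as far larger extreme values at the same sample size.
print(np.abs(heavy).max() > np.abs(gauss).max())  # True
```

This outlier behavior is what makes Student-t process priors more robust than Gaussian ones on contaminated real-world datasets.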
arXiv Detail & Related papers (2023-12-09T12:55:20Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Likelihood-based inference and forecasting for trawl processes: a stochastic optimization approach [0.0]
We develop the first likelihood-based methodology for the inference of real-valued trawl processes.
We introduce novel deterministic and probabilistic forecasting methods.
We release a Python library which can be used to fit a large class of trawl processes.
arXiv Detail & Related papers (2023-08-30T15:37:48Z)
- Low-Rank Multitask Learning based on Tensorized SVMs and LSSVMs [65.42104819071444]
Multitask learning (MTL) leverages task-relatedness to enhance performance.
We employ high-order tensors, with each mode corresponding to a task index, to naturally represent tasks referenced by multiple indices.
We propose a general framework of low-rank MTL methods with tensorized support vector machines (SVMs) and least-squares support vector machines (LSSVMs).
arXiv Detail & Related papers (2023-08-30T14:28:26Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Safe Real-Time Optimization using Multi-Fidelity Gaussian Processes [0.0]
This paper proposes a new class of real-time optimization schemes to overcome model mismatch in uncertain processes.
The proposed scheme uses two Gaussian processes for the system: one emulating the known process model, and the other the true system through measurements.
arXiv Detail & Related papers (2021-11-10T09:31:10Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper sub-solvers for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.