Deep Set Neural Networks for forecasting asynchronous bioprocess
timeseries
- URL: http://arxiv.org/abs/2312.02079v2
- Date: Tue, 5 Dec 2023 22:20:50 GMT
- Title: Deep Set Neural Networks for forecasting asynchronous bioprocess
timeseries
- Authors: Maxim Borisyak, Stefan Born, Peter Neubauer and Mariano Nicolas
Cruz-Bournazou
- Abstract summary: Cultivation experiments often produce sparse and irregular time series.
Most statistical and Machine Learning tools are not designed for handling sparse data out-of-the-box.
We show that Deep Set Neural Networks equipped with triplet encoding of the input data can successfully handle bio-process data without any need for imputation or alignment procedures.
- Score: 0.28675177318965045
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cultivation experiments often produce sparse and irregular time series.
Classical approaches based on mechanistic models, like Maximum Likelihood
fitting or Markov chain Monte Carlo sampling, can easily account for sparsity
and time-grid irregularities, but most statistical and Machine Learning tools
are not designed for handling sparse data out-of-the-box. Among popular
approaches there are various schemes for filling missing values (imputation)
and interpolation into a regular grid (alignment). However, such methods
transfer the biases of the interpolation or imputation models to the target
model. We show that Deep Set Neural Networks equipped with triplet encoding of
the input data can successfully handle bio-process data without any need for
imputation or alignment procedures. The method is agnostic to the particular
nature of the time series and can be adapted for any task, for example, online
monitoring, predictive control, design of experiments, etc. In this work, we
focus on forecasting. We argue that such an approach is especially suitable for
typical cultivation processes, demonstrate the performance of the method on
several forecasting tasks using data generated from macrokinetic growth models
under realistic conditions, and compare the method to a conventional fitting
procedure and methods based on imputation and alignment.
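For illustration only, below is a minimal sketch (not the authors' implementation) of how irregular, asynchronous observations can be triplet-encoded and fed to a Deep Set style forecaster in PyTorch. All names and sizes (TripletDeepSet, phi, rho, hidden width, padding scheme) are assumptions introduced for this example, not details taken from the paper.

```python
# Hypothetical sketch of a Deep Set forecaster over (time, variable, value) triplets.
# Not the authors' code; architecture, sizes, and names are illustrative assumptions.
import torch
import torch.nn as nn


class TripletDeepSet(nn.Module):
    def __init__(self, n_variables: int, hidden: int = 64, n_targets: int = 1):
        super().__init__()
        # phi embeds each observation triplet (time, variable id, value) independently.
        self.var_embedding = nn.Embedding(n_variables, hidden)
        self.phi = nn.Sequential(
            nn.Linear(2 + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # rho maps the pooled set representation (plus the query time) to a forecast.
        self.rho = nn.Sequential(
            nn.Linear(hidden + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, n_targets),
        )

    def forward(self, times, var_ids, values, mask, query_time):
        # times, values: (batch, n_obs); var_ids: (batch, n_obs) integer codes;
        # mask: (batch, n_obs), 1 for real observations, 0 for padding;
        # query_time: (batch, 1), the time at which to forecast.
        x = torch.cat(
            [times.unsqueeze(-1), values.unsqueeze(-1), self.var_embedding(var_ids)],
            dim=-1,
        )
        h = self.phi(x) * mask.unsqueeze(-1)  # zero out padded triplets
        pooled = h.sum(dim=1) / mask.sum(dim=1, keepdim=True).clamp(min=1.0)
        return self.rho(torch.cat([pooled, query_time], dim=-1))


# Usage: two cultivation runs with different numbers of observations, padded to length 4.
model = TripletDeepSet(n_variables=3)
times = torch.tensor([[0.0, 1.5, 2.0, 0.0], [0.0, 3.0, 0.0, 0.0]])
var_ids = torch.tensor([[0, 1, 2, 0], [0, 2, 0, 0]])
values = torch.tensor([[0.1, 5.0, 2.3, 0.0], [0.2, 1.7, 0.0, 0.0]])
mask = torch.tensor([[1.0, 1.0, 1.0, 0.0], [1.0, 1.0, 0.0, 0.0]])
forecast = model(times, var_ids, values, mask, query_time=torch.tensor([[4.0], [5.0]]))
```

Because the pooled representation is a masked sum over per-observation embeddings, the prediction is invariant to the order and number of observations, so no interpolation onto a common time grid is needed.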
Related papers
- StreamEnsemble: Predictive Queries over Spatiotemporal Streaming Data [0.8437187555622164]
We propose StreamEnsemble, a novel approach to predictive queries over spatiotemporal (ST) data distributions.
Our experimental evaluation reveals that this method markedly outperforms traditional ensemble methods and single-model approaches in terms of accuracy and time.
arXiv Detail & Related papers (2024-09-30T23:50:16Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on-the-fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z) - HyperImpute: Generalized Iterative Imputation with Automatic Model
Selection [77.86861638371926]
We propose a generalized iterative imputation framework for adaptively and automatically configuring column-wise models.
We provide a concrete implementation with out-of-the-box learners, simulators, and interfaces.
arXiv Detail & Related papers (2022-06-15T19:10:35Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Monte Carlo EM for Deep Time Series Anomaly Detection [6.312089019297173]
Time series data are often corrupted by outliers or other kinds of anomalies.
Recent approaches to anomaly detection and forecasting assume that the proportion of anomalies in the training data is small enough to ignore.
We present a technique for augmenting existing time series models so that they explicitly account for anomalies in the training data.
arXiv Detail & Related papers (2021-12-29T07:52:36Z) - A Meta-learning Approach to Reservoir Computing: Time Series Prediction
with Limited Data [0.0]
We present a data-driven approach to automatically extract an appropriate model structure from experimentally observed processes.
We demonstrate our approach on a simple benchmark problem, where it beats state-of-the-art meta-learning techniques.
arXiv Detail & Related papers (2021-10-07T18:23:14Z) - Randomized Neural Networks for Forecasting Time Series with Multiple
Seasonality [0.0]
This work contributes to the development of neural forecasting models with novel randomization-based learning methods.
A pattern-based representation of time series makes the proposed approach useful for forecasting time series with multiple seasonality.
arXiv Detail & Related papers (2021-07-04T18:39:27Z) - Scalable Marginal Likelihood Estimation for Model Selection in Deep
Learning [78.83598532168256]
Marginal-likelihood based model-selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z) - Nonparametric Estimation in the Dynamic Bradley-Terry Model [69.70604365861121]
We develop a novel estimator that relies on kernel smoothing to pre-process the pairwise comparisons over time.
We derive time-varying oracle bounds for both the estimation error and the excess risk in the model-agnostic setting.
arXiv Detail & Related papers (2020-02-28T21:52:49Z)