Universal time-series forecasting with mixture predictors
- URL: http://arxiv.org/abs/2010.00297v1
- Date: Thu, 1 Oct 2020 10:56:23 GMT
- Title: Universal time-series forecasting with mixture predictors
- Authors: Daniil Ryabko
- Abstract summary: This book is devoted to the problem of sequential probability forecasting, that is, predicting the probabilities of the next outcome of a growing sequence of observations given the past.
The main subject is mixture predictors, which are formed as a combination of a finite or infinite set of other predictors.
Results demonstrate the universality of this method in a very general probabilistic setting, but also show some of its limitations.
- Score: 10.812772606528172
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This book is devoted to the problem of sequential probability forecasting,
that is, predicting the probabilities of the next outcome of a growing sequence
of observations given the past. This problem is considered in a very general
setting that unifies commonly used probabilistic and non-probabilistic
settings, trying to make as few as possible assumptions on the mechanism
generating the observations. A common form that arises in various formulations
of this problem is that of mixture predictors, which are formed as a
combination of a finite or infinite set of other predictors attempting to
combine their predictive powers. The main subject of this book is such mixture
predictors, and the main results demonstrate the universality of this method in
a very general probabilistic setting, but also show some of its limitations.
While the problems considered are motivated by practical applications,
involving, for example, financial, biological or behavioural data, this
motivation is left implicit and all the results presented are theoretical.
The book targets graduate students and researchers interested in the problem
of sequential prediction, and, more generally, in theoretical analysis of
problems in machine learning and non-parametric statistics, as well as
mathematical and philosophical foundations of these fields.
The material in this volume is presented in a way that presumes familiarity
with basic concepts of probability and statistics, up to and including
probability distributions over spaces of infinite sequences. Familiarity with
the literature on learning or stochastic processes is not required.
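The core construction the book studies, a mixture of base predictors whose weights are updated by their likelihood of each observed outcome, can be sketched for binary sequences as follows. This is an illustrative toy of my own (the class name, the Bernoulli "experts", and the Bayesian weight update are assumptions for the sketch), not code from the book:

```python
import random

class MixturePredictor:
    """Sequential probability forecaster built as a Bayesian mixture.

    Each base predictor maps the observed history to P(next outcome = 1).
    The mixture predicts the weight-averaged probability, then updates the
    weights multiplicatively by each predictor's likelihood of what was
    actually observed.
    """

    def __init__(self, predictors):
        self.predictors = predictors                     # history -> P(next = 1)
        self.weights = [1.0 / len(predictors)] * len(predictors)
        self.history = []

    def predict(self):
        """Mixture probability that the next outcome is 1."""
        return sum(w * p(self.history)
                   for w, p in zip(self.weights, self.predictors))

    def observe(self, outcome):
        """Multiply each weight by that predictor's likelihood, renormalize."""
        likelihoods = []
        for p in self.predictors:
            q = p(self.history)
            likelihoods.append(q if outcome == 1 else 1.0 - q)
        self.weights = [w * l for w, l in zip(self.weights, likelihoods)]
        total = sum(self.weights)
        self.weights = [w / total for w in self.weights]
        self.history.append(outcome)

# Toy base predictors: constant Bernoulli forecasters with different biases.
experts = [lambda h, q=q: q for q in (0.1, 0.5, 0.9)]
mix = MixturePredictor(experts)

random.seed(0)
for _ in range(200):                 # observations drawn i.i.d. Bernoulli(0.9)
    mix.observe(1 if random.random() < 0.9 else 0)
```

After enough observations, the posterior weight concentrates on the best-matching expert, so the mixture inherits its predictive power without knowing in advance which base predictor is right.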
Related papers
- Theoretical Foundations of Conformal Prediction [15.884682750072399]
Conformal prediction and related inferential techniques are useful in a diverse array of tasks.
Conformal prediction's main appeal is its ability to provide formal, finite-sample guarantees.
The goal of this book is to teach the reader about the fundamental technical arguments that arise when researching conformal prediction.
arXiv Detail & Related papers (2024-11-18T18:44:00Z) - Prediction Instability in Machine Learning Ensembles [0.0]
We prove a theorem that shows that any ensemble will exhibit at least one of the following forms of prediction instability.
It will either ignore agreement among all underlying models, change its mind when none of the underlying models have done so, or be manipulable through inclusion or exclusion of options it would never actually predict.
This analysis also sheds light on what specific forms of prediction instability to expect from particular ensemble algorithms.
arXiv Detail & Related papers (2024-07-03T15:26:02Z) - Seeing Unseen: Discover Novel Biomedical Concepts via Geometry-Constrained Probabilistic Modeling [53.7117640028211]
We present a geometry-constrained probabilistic modeling treatment to resolve the identified issues.
We incorporate a suite of critical geometric properties to impose proper constraints on the layout of constructed embedding space.
A spectral graph-theoretic method is devised to estimate the number of potential novel classes.
arXiv Detail & Related papers (2024-03-02T00:56:05Z) - Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z) - Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or data with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate the underlying tessellation and approximate the multiple-hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z) - A Mathematical Framework for Learning Probability Distributions [0.0]
Generative modeling and density estimation have become immensely popular subjects in recent years.
This paper provides a mathematical framework such that all the well-known models can be derived based on simple principles.
In particular, we prove that these models enjoy implicit regularization during training, so that the generalization error at early-stopping avoids the curse of dimensionality.
arXiv Detail & Related papers (2022-12-22T04:41:45Z) - Risk Measures and Upper Probabilities: Coherence and Stratification [7.88657961743755]
We look at richer alternatives to classical probability theory as a mathematical foundation for machine learning.
We examine a powerful and rich class of alternative aggregation functionals, known variously as spectral risk measures, Choquet integrals or Lorentz norms.
We empirically demonstrate how this new approach to uncertainty helps tackle practical machine learning problems.
arXiv Detail & Related papers (2022-06-07T11:08:16Z) - A Top-Down Approach to Hierarchically Coherent Probabilistic Forecasting [21.023456590248827]
We use a novel attention-based RNN model to learn the distribution of the proportions according to which each parent prediction is split among its child nodes at any point in time.
The resulting forecasts are computed in a top-down fashion and are naturally coherent.
arXiv Detail & Related papers (2022-04-21T21:32:28Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - Video Prediction via Example Guidance [156.08546987158616]
In video prediction tasks, one major challenge is to capture the multi-modal nature of future contents and dynamics.
In this work, we propose a simple yet effective framework that can efficiently predict plausible future states.
arXiv Detail & Related papers (2020-07-03T14:57:24Z) - Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.