GP-ConvCNP: Better Generalization for Convolutional Conditional Neural
Processes on Time Series Data
- URL: http://arxiv.org/abs/2106.04967v2
- Date: Fri, 11 Jun 2021 13:46:13 GMT
- Title: GP-ConvCNP: Better Generalization for Convolutional Conditional Neural
Processes on Time Series Data
- Authors: Jens Petersen, Gregor Köhler, David Zimmerer, Fabian Isensee, Paul
F. Jäger, Klaus H. Maier-Hein
- Abstract summary: Convolutional Conditional Neural Processes (ConvCNP) have shown remarkable improvement in performance over prior art.
We find that they sometimes struggle to generalize when applied to time series data.
In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future.
- Score: 4.141867179461668
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Processes (NPs) are a family of conditional generative models that are
able to model a distribution over functions, in a way that allows them to
perform predictions at test time conditioned on a number of context points. A
recent addition to this family, Convolutional Conditional Neural Processes
(ConvCNP), have shown remarkable improvement in performance over prior art, but
we find that they sometimes struggle to generalize when applied to time series
data. In particular, they are not robust to distribution shifts and fail to
extrapolate observed patterns into the future. By incorporating a Gaussian
Process into the model, we are able to remedy this and at the same time improve
performance within distribution. As an added benefit, the Gaussian Process
reintroduces the possibility to sample from the model, a key feature of other
members in the NP family.
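As a rough illustration of the central mechanism described in the abstract, conditioning a Gaussian Process on the context points and reading it out on a regular grid that a convolutional decoder can consume, the following minimal NumPy sketch shows GP conditioning and posterior sampling. The RBF kernel, noise level, and grid are assumptions of this sketch, not details of the authors' implementation.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.5, variance=1.0):
    # Squared-exponential kernel on 1D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_ctx, y_ctx, x_grid, noise=1e-2):
    # Exact GP posterior mean and covariance on a target grid, given context points.
    K_cc = rbf_kernel(x_ctx, x_ctx) + noise * np.eye(len(x_ctx))
    K_gc = rbf_kernel(x_grid, x_ctx)
    K_gg = rbf_kernel(x_grid, x_grid)
    mean = K_gc @ np.linalg.solve(K_cc, y_ctx)
    cov = K_gg - K_gc @ np.linalg.solve(K_cc, K_gc.T)
    return mean, cov

# Context points observed from a time series; in a ConvCNP-style model the GP
# read-out on a regular grid would feed a CNN decoder, and drawing posterior
# samples is what restores the ability to sample functions from the model.
x_ctx = np.array([0.1, 0.4, 0.9, 1.5])
y_ctx = np.sin(2 * np.pi * x_ctx)
x_grid = np.linspace(0.0, 2.0, 64)
mean, cov = gp_posterior(x_ctx, y_ctx, x_grid)
samples = np.random.multivariate_normal(mean, cov + 1e-8 * np.eye(len(x_grid)), size=3)
```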
Related papers
- Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting [43.951394031702016]
We introduce TSFlow, a conditional flow matching (CFM) model for time series.
By incorporating (conditional) Gaussian processes, TSFlow aligns the prior distribution more closely with the temporal structure of the data.
We show that both conditionally and unconditionally trained models achieve competitive results in forecasting benchmarks.
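As a hedged sketch of how a GP prior can serve as the base distribution in conditional flow matching (the linear interpolation path, kernel, and hyperparameters below are assumptions of this sketch, not TSFlow's exact design; the vector-field network and training loop are omitted):

```python
import numpy as np

def gp_prior_sample(t_grid, lengthscale=0.25, rng=None):
    # Draw one function from a zero-mean GP prior with an RBF kernel.
    rng = np.random.default_rng() if rng is None else rng
    d = t_grid[:, None] - t_grid[None, :]
    K = np.exp(-0.5 * (d / lengthscale) ** 2) + 1e-6 * np.eye(len(t_grid))
    return rng.multivariate_normal(np.zeros(len(t_grid)), K)

t_grid = np.linspace(0.0, 1.0, 32)
x1 = np.sin(4 * np.pi * t_grid)        # one "data" series
x0 = gp_prior_sample(t_grid)           # base sample from the GP prior instead of white noise
tau = np.random.rand()                 # flow time in [0, 1]
x_tau = (1.0 - tau) * x0 + tau * x1    # linear interpolant between prior and data samples
v_target = x1 - x0                     # regression target for the learned vector field
```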
arXiv Detail & Related papers (2024-10-03T22:12:50Z) - Spectral Convolutional Conditional Neural Processes [4.52069311861025]
Conditional Neural Processes (CNPs) constitute a family of probabilistic models that harness the flexibility of neural networks to parameterize processes.
We propose Spectral Convolutional Conditional Neural Processes (SConvCNPs), a new addition to the NPs family that allows for more efficient representation of functions in the frequency domain.
arXiv Detail & Related papers (2024-04-19T21:13:18Z) - Conditional Neural Processes for Molecules [0.0]
Neural processes (NPs) are models for transfer learning with properties reminiscent of Gaussian Processes (GPs).
This paper applies the conditional neural process (CNP) to DOCKSTRING, a dataset of docking scores for benchmarking ML models.
CNPs show competitive performance in few-shot learning tasks relative to supervised learning baselines common in QSAR modelling, as well as an alternative model for transfer learning based on pre-training and refining neural network regressors.
arXiv Detail & Related papers (2022-10-17T16:10:12Z) - Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - The Gaussian Neural Process [39.81327564209865]
We provide a rigorous analysis of the standard maximum-likelihood objective used to train conditional NPs.
We propose a new member of the Neural Process family called the Gaussian Neural Process (GNP), which models predictive correlations, incorporates translation equivariance, provides universal approximation guarantees, and demonstrates encouraging performance.
arXiv Detail & Related papers (2021-01-10T19:15:27Z) - Probabilistic Numeric Convolutional Neural Networks [80.42120128330411]
Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods.
We propose Probabilistic Numeric Convolutional Neural Networks (PNCNNs), which represent features as Gaussian processes (GPs).
We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity.
In experiments we show that our approach yields a $3\times$ reduction of error from the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the medical time series dataset PhysioNet2012.
arXiv Detail & Related papers (2020-10-21T10:08:21Z) - Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
NPs still rely on an assumption that uncertainty in processes is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z) - Meta-Learning Stationary Stochastic Process Prediction with
Convolutional Neural Processes [32.02612871707347]
We propose ConvNP, which endows Neural Processes (NPs) with translation equivariance and extends convolutional conditional NPs to allow for dependencies in the predictive distribution.
We demonstrate the strong performance and generalization capabilities of ConvNPs on 1D regression, image completion, and various tasks with real-world spatio-temporal data.
arXiv Detail & Related papers (2020-07-02T18:25:27Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
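The decoupled-sampling idea, a prior function draw (here approximated with random Fourier features) corrected by an exact pathwise update in the spirit of Matheron's rule, can be sketched as below; the kernel, feature count, and noise level are assumptions of this sketch rather than the paper's configuration.

```python
import numpy as np

def rbf(x1, x2, ls=0.5):
    # Squared-exponential kernel on 1D inputs.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def rff_prior_draw(x, n_features=512, ls=0.5, rng=None):
    # Approximate sample from an RBF-kernel GP prior via random Fourier features.
    rng = np.random.default_rng() if rng is None else rng
    omega = rng.normal(scale=1.0 / ls, size=n_features)       # spectral frequencies
    phase = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    w = rng.normal(size=n_features)
    feats = np.sqrt(2.0 / n_features) * np.cos(x[:, None] * omega[None, :] + phase[None, :])
    return feats @ w

def decoupled_posterior_sample(x_train, y_train, x_test, noise=1e-2, ls=0.5, rng=None):
    # Posterior function draw = prior function draw + exact data-dependent update.
    rng = np.random.default_rng() if rng is None else rng
    x_all = np.concatenate([x_train, x_test])
    f_all = rff_prior_draw(x_all, ls=ls, rng=rng)              # one consistent prior function
    f_train, f_test = f_all[:len(x_train)], f_all[len(x_train):]
    eps = rng.normal(scale=np.sqrt(noise), size=len(x_train))
    K = rbf(x_train, x_train, ls) + noise * np.eye(len(x_train))
    update = rbf(x_test, x_train, ls) @ np.linalg.solve(K, y_train - f_train - eps)
    return f_test + update

x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.05 * np.random.randn(10)
x_test = np.linspace(0.0, 1.5, 50)
sample = decoupled_posterior_sample(x_train, y_train, x_test)
```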
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.