A Non-linear Function-on-Function Model for Regression with Time Series
Data
- URL: http://arxiv.org/abs/2011.12378v1
- Date: Tue, 24 Nov 2020 20:51:27 GMT
- Title: A Non-linear Function-on-Function Model for Regression with Time Series
Data
- Authors: Qiyao Wang, Haiyan Wang, Chetan Gupta, Aniruddha Rajendra Rao, Hamed
Khorasgani
- Abstract summary: We propose a general functional mapping that embraces the function-on-function linear model as a special case.
We then propose a non-linear function-on-function model using the fully connected neural network to learn the mapping from data.
The effectiveness of the proposed model is demonstrated through the application to two real-world problems.
- Score: 11.738565299608721
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the last few decades, building regression models for non-scalar variables,
including time series, text, image, and video, has attracted increasing
interest from researchers in the data analytics community. In this paper, we
focus on a multivariate time series regression problem. Specifically, we aim to
learn mathematical mappings from multiple chronologically measured numerical
variables within a certain time interval S to multiple numerical variables of
interest over time interval T. Existing approaches, including the multivariate
regression model, the Seq2Seq model, and the functional linear models, suffer
from several limitations. The first two types of models can only handle
regularly observed time series. Moreover, conventional multivariate regression
models tend to be biased and inefficient, as they are incapable of encoding the
temporal dependencies among observations from the same time series. Sequential
learning models explicitly share the same set of parameters across time, which
negatively impacts accuracy. The function-on-function linear model in
functional data analysis (a branch of statistics) is insufficient to capture
complex correlations among the considered time series and easily suffers from
underfitting. In this paper, we
propose a general functional mapping that embraces the function-on-function
linear model as a special case. We then propose a non-linear
function-on-function model using the fully connected neural network to learn
the mapping from data, which addresses the aforementioned concerns in the
existing approaches. For the proposed model, we describe in detail the
corresponding numerical implementation procedures. The effectiveness of the
proposed model is demonstrated through the application to two real-world
problems.
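Background note (not part of the paper's abstract): in functional data analysis, the function-on-function linear model that the proposed mapping embraces as a special case is usually written as follows, for a response curve Y(t) on T and predictor curves X_1(s), ..., X_p(s) on S.

```latex
% Function-on-function linear model (the special case referenced above):
% each response curve is a linear integral functional of the predictor curves.
\[
  Y(t) = \beta_0(t) + \sum_{j=1}^{p} \int_{S} X_j(s)\,\beta_j(s,t)\,ds + \varepsilon(t),
  \qquad t \in T .
\]
% The general mapping proposed in the paper replaces the linear integral
% operator with an arbitrary, possibly non-linear functional F:
\[
  Y(t) = F\bigl(X_1, \dots, X_p\bigr)(t) + \varepsilon(t), \qquad t \in T .
\]
```

Below is a minimal sketch of one way such a non-linear mapping could be implemented with a fully connected network, in the spirit of the abstract: project each (possibly irregularly) observed input curve onto a fixed basis, map the stacked coefficients through the network, and reconstruct the response curve over T from the output coefficients. All names are illustrative assumptions rather than the authors' code, and the Fourier-basis projection is only one of several reasonable numerical implementations.

```python
# Illustrative sketch (not the authors' implementation) of a non-linear
# function-on-function model: each irregularly observed input curve on S is
# projected onto a Fourier basis, a fully connected network maps input basis
# coefficients to output basis coefficients, and the response curve on T is
# reconstructed from those output coefficients.
import numpy as np
import torch
import torch.nn as nn


def fourier_basis(t, n_basis):
    """Evaluate a Fourier basis (1, sin, cos, sin, cos, ...) at points t in [0, 1]."""
    cols = [np.ones_like(t)]
    for k in range(1, n_basis // 2 + 1):
        cols += [np.sin(2 * np.pi * k * t), np.cos(2 * np.pi * k * t)]
    return np.stack(cols[:n_basis], axis=1)          # shape: (len(t), n_basis)


def curve_to_coefficients(times, values, n_basis):
    """Least-squares projection of one observed curve onto the basis."""
    coef, *_ = np.linalg.lstsq(fourier_basis(times, n_basis), values, rcond=None)
    return coef                                      # shape: (n_basis,)


class NonlinearFoF(nn.Module):
    """Fully connected network from input basis coefficients to output ones."""

    def __init__(self, n_inputs, n_basis_in, n_basis_out, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs * n_basis_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_basis_out),
        )

    def forward(self, coef_in):                      # (batch, n_inputs * n_basis_in)
        return self.net(coef_in)                     # (batch, n_basis_out)


def predict_curves(model, coef_in, t_grid, n_basis_out):
    """Reconstruct predicted response curves on a grid over T."""
    coef_out = model(coef_in).detach().numpy()       # (batch, n_basis_out)
    return coef_out @ fourier_basis(t_grid, n_basis_out).T   # (batch, len(t_grid))
```

Training would typically minimize a discretized integrated squared error between reconstructed and observed response curves; the basis type, the number of basis functions, and the network width and depth are tuning choices that this sketch leaves open.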
Related papers
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Analysis of Interpolating Regression Models and the Double Descent Phenomenon [3.883460584034765]
It is commonly assumed that models which interpolate noisy training data generalize poorly.
The best models obtained are overparametrized and the testing error exhibits the double descent behavior as the model order increases.
We derive a result based on the behavior of the smallest singular value of the regression matrix that explains the peak location and the double descent shape of the testing error as a function of model order. (A generic numerical illustration of this behavior is sketched after the related-papers list.)
arXiv Detail & Related papers (2023-04-17T09:44:33Z)
- A Pattern Discovery Approach to Multivariate Time Series Forecasting [27.130141538089152]
State-of-the-art deep learning methods fail to construct models for full time series because model complexity grows exponentially with time series length.
We propose a novel pattern discovery method that can automatically capture diverse and complex time series patterns.
We also propose a learnable correlation matrix, that enables the model to capture distinct correlations among multiple time series.
arXiv Detail & Related papers (2022-12-20T14:54:04Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Deep Time Series Models for Scarce Data [8.673181404172963]
Time series data have grown at an explosive rate in numerous domains and have stimulated a surge of time series modeling research.
Data scarcity is a universal issue that occurs in a vast range of data analytics problems.
arXiv Detail & Related papers (2021-03-16T22:16:54Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Adjusting for Autocorrelated Errors in Neural Networks for Time Series Regression and Forecasting [10.659189276058948]
We learn the autocorrelation coefficient jointly with the model parameters in order to adjust for autocorrelated errors.
For time series regression, large-scale experiments indicate that our method outperforms the Prais-Winsten method.
Results across a wide range of real-world datasets show that our method enhances performance in almost all cases. (A schematic sketch of this joint-estimation idea appears after the related-papers list.)
arXiv Detail & Related papers (2021-01-28T04:25:51Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance across a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
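The interpolating-regression entry above notes that the peak and double-descent shape of the test error can be traced to the smallest singular value of the regression matrix. The script below is a generic numerical illustration of that phenomenon using minimum-norm least squares on random cosine features; the feature design, noise level, and model orders are arbitrary illustrative choices and are not taken from the cited paper.

```python
# Generic double-descent illustration (not from the cited paper): test error of
# minimum-norm least-squares regression on random cosine features as the model
# order grows.  The error typically peaks near the interpolation threshold
# (order == n_train), where the smallest singular value of the design matrix
# becomes tiny, and descends again in the overparameterized regime.
import numpy as np

rng = np.random.default_rng(0)
n_train, noise = 30, 0.5
x_train = rng.uniform(-1, 1, n_train)
x_test = np.linspace(-1, 1, 500)


def truth(x):
    return np.sin(2 * np.pi * x)                      # ground-truth signal


y_train = truth(x_train) + noise * rng.normal(size=n_train)
freqs = rng.uniform(0, 10, 200)                       # fixed random frequencies
phases = rng.uniform(0, 2 * np.pi, 200)               # fixed random phases


def features(x, order):
    """Design matrix with `order` random cosine features."""
    return np.cos(np.outer(x, freqs[:order]) + phases[:order]) / np.sqrt(order)


for order in (5, 15, 30, 60, 200):                    # below, at, and above n_train
    A = features(x_train, order)
    w = np.linalg.pinv(A) @ y_train                   # minimum-norm solution
    test_mse = np.mean((features(x_test, order) @ w - truth(x_test)) ** 2)
    s_min = np.linalg.svd(A, compute_uv=False).min()  # smallest singular value
    print(f"order={order:4d}  s_min={s_min:.2e}  test MSE={test_mse:.3f}")
```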
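For the autocorrelated-errors entry above, the following is a schematic sketch of the general idea of learning an AR(1) error coefficient jointly with the regression network through a Cochrane-Orcutt-style transformed loss. The class name, the architecture, and the exact loss are assumptions for illustration; the cited paper's formulation may differ in its details.

```python
# Schematic sketch (assumed formulation; may differ from the cited paper):
# if regression errors follow e_t = rho * e_{t-1} + v_t with white noise v_t,
# then y_t - rho * y_{t-1} = f(x_t) - rho * f(x_{t-1}) + v_t, so we minimize
# the squared transformed residual and learn rho jointly with the network f.
import torch
import torch.nn as nn


class ARAdjustedRegressor(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU(),
                               nn.Linear(hidden, 1))
        self.raw_rho = nn.Parameter(torch.zeros(1))    # learned jointly with f

    def transformed_loss(self, x, y):
        pred = self.f(x).squeeze(-1)                   # one prediction per time step
        rho = torch.tanh(self.raw_rho)                 # keep |rho| < 1
        resid = (y[1:] - rho * y[:-1]) - (pred[1:] - rho * pred[:-1])
        return (resid ** 2).mean()


# Toy usage: the network weights and rho receive gradients from the same loss.
model = ARAdjustedRegressor(n_features=4)
x, y = torch.randn(100, 4), torch.randn(100)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = model.transformed_loss(x, y)
    loss.backward()
    opt.step()
```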