Learning K-U-Net with constant complexity: An Application to time series forecasting
- URL: http://arxiv.org/abs/2410.02438v1
- Date: Thu, 3 Oct 2024 12:35:17 GMT
- Title: Learning K-U-Net with constant complexity: An Application to time series forecasting
- Authors: Jiang You, Arben Cela, René Natowicz, Jacob Ouanounou, Patrick Siarry
- Abstract summary: Training deep models for time series forecasting is a critical task with an inherent challenge of time complexity.
We introduce a new exponentially weighted stochastic gradient descent algorithm designed to achieve constant time complexity in deep learning models.
- Score: 1.8816077341295625
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Training deep models for time series forecasting is a critical task with an inherent challenge of time complexity. While current methods generally ensure linear time complexity, our observations on temporal redundancy show that high-level features are learned 98.44% slower than low-level features. To address this issue, we introduce a new exponentially weighted stochastic gradient descent algorithm designed to achieve constant time complexity in deep learning models. We prove that the theoretical complexity of this learning method is constant. Evaluating this method with Kernel U-Net (K-U-Net) on synthetic datasets shows a significant reduction in complexity while improving test-set accuracy.
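The abstract names the mechanism (exponentially weighted stochastic gradient descent over a hierarchical model) but not the update rule itself. Below is a minimal PyTorch sketch of one plausible reading, assuming level k of a U-Net-style encoder is updated only every 2^k mini-batches with an exponentially decaying per-level weight beta^k; the amortized number of level updates per step is then the geometric series sum_k 2^{-k} < 2, a constant independent of depth. The toy model, the 2^k update period, and the beta^k weighting are illustrative assumptions, not the authors' exact K-U-Net algorithm.

```python
import torch
import torch.nn as nn

# Toy stand-in for a hierarchical (U-Net-style) encoder. Module layout,
# the 2**k update period, and the beta**k weighting are illustrative
# assumptions, not the paper's exact K-U-Net update rule.
class ToyHierarchicalNet(nn.Module):
    def __init__(self, num_levels: int = 4, width: int = 32):
        super().__init__()
        self.levels = nn.ModuleList(
            nn.Conv1d(width if k > 0 else 1, width, kernel_size=3, padding=1)
            for k in range(num_levels)
        )
        self.head = nn.Linear(width, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, 1, T)
        for conv in self.levels:
            x = torch.relu(conv(x))
            x = torch.max_pool1d(x, 2)  # halve temporal resolution per level
        return self.head(x.mean(dim=-1))  # pool over time, predict one value

model = ToyHierarchicalNet()
# One optimizer per level; the exponentially decaying learning rate plays
# the role of the "exponential weighting" (assumed form: lr * beta**k).
beta = 0.5
optims = [torch.optim.SGD(level.parameters(), lr=1e-2 * beta ** k)
          for k, level in enumerate(model.levels)]
head_opt = torch.optim.SGD(model.head.parameters(), lr=1e-2)

for step in range(64):
    x, y = torch.randn(8, 1, 64), torch.randn(8, 1)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()  # gradients of skipped levels keep accumulating
    for k, opt in enumerate(optims):
        if step % 2 ** k == 0:  # level k steps every 2**k iterations, so
            opt.step()          # amortized level updates per step stay < 2
            opt.zero_grad()
    head_opt.step()
    head_opt.zero_grad()
```

Note that this sketch still runs a full backward pass every step; realizing the constant-complexity claim presumably also requires skipping the corresponding gradient computation for inactive levels, which is beyond this illustration.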
Related papers
- Krylov Complexity as a Probe for Chaos [0.7373617024876725]
We show that the dynamics towards saturation precisely distinguish between chaotic and integrable systems.
For chaotic models, the saturation value of complexity reaches its infinite time average at a finite saturation time.
In integrable models, complexity approaches the infinite time average value from below at a much longer timescale.
arXiv Detail & Related papers (2024-08-19T17:52:42Z) - Concrete Dense Network for Long-Sequence Time Series Clustering [4.307648859471193]
Time series clustering is fundamental in data analysis for discovering temporal patterns.
Deep temporal clustering methods have been trying to integrate the canonical k-means into end-to-end training of neural networks.
LoSTer is a novel dense autoencoder architecture for the long-sequence time series clustering problem.
arXiv Detail & Related papers (2024-05-08T12:31:35Z) - Time-Parameterized Convolutional Neural Networks for Irregularly Sampled
Time Series [26.77596449192451]
Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, incompletely observed, and non-aligned observations.
Standard sequential neural networks (RNNs) and convolutional neural networks (CNNs) assume regular spacing between observation times, posing significant challenges to irregular time series modeling.
We parameterize convolutional layers with kernels that depend explicitly on the irregular observation times.
arXiv Detail & Related papers (2023-08-06T21:10:30Z) - Never a Dull Moment: Distributional Properties as a Baseline for
Time-Series Classification [0.0]
We evaluate the performance of an extremely simple classification approach.
We find that a simple linear model based on the mean and standard deviation performs better than more complex feature-based methods at classifying individuals with schizophrenia.
arXiv Detail & Related papers (2023-03-31T05:55:54Z) - Optimal Algorithms for Stochastic Complementary Composite Minimization [55.26935605535377]
Inspired by regularization techniques in statistics and machine learning, we study complementary composite minimization.
We provide novel excess risk bounds, both in expectation and with high probability.
Our algorithms are nearly optimal, which we prove via novel lower complexity bounds for this class of problems.
arXiv Detail & Related papers (2022-11-03T12:40:24Z) - Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in domains such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z) - Consistency of mechanistic causal discovery in continuous-time using
Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - A Temporal Kernel Approach for Deep Learning with Continuous-time
Information [18.204325860752768]
Sequential deep learning models such as RNNs, causal CNNs, and attention mechanisms do not readily consume continuous-time information.
Discretizing the temporal data, as we show, causes inconsistency even for simple continuous-time processes.
We provide a principled way to characterize continuous-time systems using deep learning tools.
arXiv Detail & Related papers (2021-03-28T20:13:53Z) - On Function Approximation in Reinforcement Learning: Optimism in the
Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z) - Neural Complexity Measures [96.06344259626127]
We propose Neural Complexity (NC), a meta-learning framework for predicting generalization.
Our model learns a scalar complexity measure through interactions with many heterogeneous tasks in a data-driven way.
arXiv Detail & Related papers (2020-08-07T02:12:10Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.