Exploring Scalability in Large-Scale Time Series in DeepVATS framework
- URL: http://arxiv.org/abs/2408.04692v1
- Date: Thu, 8 Aug 2024 15:30:48 GMT
- Title: Exploring Scalability in Large-Scale Time Series in DeepVATS framework
- Authors: Inmaculada Santamaria-Valenzuela, Victor Rodriguez-Fernandez, David Camacho
- Abstract summary: DeepVATS is a tool that merges Deep Learning (DL) with Visual Analytics (VA) for the analysis of large time series data (TS).
The Deep Learning module, developed in R, manages the loading of datasets and Deep Learning models from and to the Storage module.
This paper introduces the tool and examines its scalability through log analytics.
- Score: 3.8436076642278754
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Visual analytics is essential for studying large time series due to its ability to reveal trends, anomalies, and insights. DeepVATS is a tool that merges Deep Learning (DL) with Visual Analytics (VA) for the analysis of large time series data (TS). It has three interconnected modules. The Deep Learning module, developed in R, manages the loading of datasets and Deep Learning models from and to the Storage module. This module also supports model training and the acquisition of embeddings from the latent space of the trained model. The Storage module operates using the Weights and Biases system. Subsequently, these embeddings can be analyzed in the Visual Analytics module. This module, based on an R Shiny application, allows the adjustment of the parameters related to the projection and clustering of the embedding space. Once these parameters are set, interactive plots representing both the embeddings and the time series are shown. This paper introduces the tool and examines its scalability through log analytics. The evolution of execution time is examined as the length of the time series is varied. This is achieved by resampling a large data series into smaller subsets and logging the main execution and rendering times for later analysis of scalability.
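The scalability analysis described above boils down to a simple loop: resample the full series to several lengths, run the step under study, and log the elapsed time for each length. Below is a minimal base R sketch of that idea, not DeepVATS code; the synthetic series, the subset sizes, and the moving-average filter are placeholder assumptions standing in for the real embedding, projection, and rendering steps.

```r
# Minimal sketch: log how execution time evolves as a long series is
# resampled into progressively smaller subsets (placeholder workload).
set.seed(42)
full_series <- cumsum(rnorm(1e6))               # stand-in for a large time series

subset_sizes <- c(1e4, 5e4, 1e5, 5e5, 1e6)      # lengths to benchmark
timing_log <- data.frame(n = numeric(0), seconds = numeric(0))

for (n in subset_sizes) {
  # Resample the full series down to n points via strided subsampling
  idx <- round(seq(1, length(full_series), length.out = n))
  sub_series <- full_series[idx]

  t0 <- Sys.time()
  # Placeholder for the profiled step (embedding, projection, rendering, ...)
  smoothed <- stats::filter(sub_series, rep(1 / 50, 50), sides = 2)
  elapsed <- as.numeric(difftime(Sys.time(), t0, units = "secs"))

  timing_log <- rbind(timing_log, data.frame(n = n, seconds = elapsed))
}

print(timing_log)  # execution time vs. series length
```

Plotting `seconds` against `n` from such a log yields the execution-time-versus-length curves that the paper derives from its own execution and rendering logs.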
Related papers
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM)
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z)
- DeepVATS: Deep Visual Analytics for Time Series [7.822594828788055]
We present DeepVATS, an open-source tool that brings the field of Deep Visual Analytics into time series data.
DeepVATS trains, in a self-supervised way, a masked time series autoencoder that reconstructs patches of a time series.
We report on results that validate the utility of DeepVATS, running experiments on both synthetic and real datasets.
arXiv Detail & Related papers (2023-02-08T03:26:50Z)
- Multivariate Time Series Regression with Graph Neural Networks [0.6124773188525718]
Recent advances in adapting Deep Learning to graphs have shown promising potential in various graph-related tasks.
However, these methods have not been adapted for time series related tasks to a great extent.
In this work, we propose an architecture capable of processing these long sequences in a multivariate time series regression task.
arXiv Detail & Related papers (2022-01-03T16:11:46Z)
- Deeptime: a Python library for machine learning dynamical models from time series data [3.346668383314945]
Deeptime is a general purpose Python library offering various tools to estimate dynamical models based on time-series data.
In this paper we introduce the main features and structure of the deeptime software.
arXiv Detail & Related papers (2021-10-28T10:53:03Z)
- Multimodal Meta-Learning for Time Series Regression [3.135152720206844]
We will explore the idea of using meta-learning for quickly adapting model parameters to new short-history time series.
We show empirically that our proposed meta-learning method learns TSR quickly from few data and outperforms the baselines in 9 of the 12 experiments.
arXiv Detail & Related papers (2021-08-05T20:50:18Z)
- Demystifying Deep Learning in Predictive Spatio-Temporal Analytics: An Information-Theoretic Framework [20.28063653485698]
We provide a comprehensive framework for deep learning model design and information-theoretic analysis.
First, we develop and demonstrate a novel interactively-connected deep recurrent neural network (I$^2$DRNN) model.
Second, to theoretically prove that our designed model can learn multi-scale spatio-temporal dependencies in PSTA tasks, we provide an information-theoretic analysis.
arXiv Detail & Related papers (2020-09-14T10:05:14Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- TAM: Temporal Adaptive Module for Video Recognition [60.83208364110288]
The temporal adaptive module (TAM) generates video-specific temporal kernels based on its own feature map.
Experiments on Kinetics-400 and Something-Something datasets demonstrate that our TAM outperforms other temporal modeling methods consistently.
arXiv Detail & Related papers (2020-05-14T08:22:45Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)