Merlion: A Machine Learning Library for Time Series
- URL: http://arxiv.org/abs/2109.09265v1
- Date: Mon, 20 Sep 2021 02:03:43 GMT
- Title: Merlion: A Machine Learning Library for Time Series
- Authors: Aadyot Bhatnagar, Paul Kassianik, Chenghao Liu, Tian Lan, Wenzhuo
Yang, Rowan Cassius, Doyen Sahoo, Devansh Arpit, Sri Subramanian, Gerald Woo,
Amrita Saha, Arun Kumar Jagota, Gokulakrishnan Gopalakrishnan, Manpreet
Singh, K C Krithika, Sukumar Maddineni, Daeki Cho, Bo Zong, Yingbo Zhou,
Caiming Xiong, Silvio Savarese, Steven Hoi, Huan Wang
- Abstract summary: Merlion is an open-source machine learning library for time series.
It features a unified interface for models and datasets for anomaly detection and forecasting.
Merlion also provides a unique evaluation framework that simulates the live deployment and re-training of a model in production.
- Score: 73.46386700728577
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce Merlion, an open-source machine learning library for time
series. It features a unified interface for many commonly used models and
datasets for anomaly detection and forecasting on both univariate and
multivariate time series, along with standard pre/post-processing layers. It
has several modules to improve ease-of-use, including visualization, anomaly
score calibration to improve interpretability, AutoML for hyperparameter tuning
and model selection, and model ensembling. Merlion also provides a unique
evaluation framework that simulates the live deployment and re-training of a
model in production. This library aims to provide engineers and researchers a
one-stop solution to rapidly develop models for their specific time series
needs and benchmark them across multiple time series datasets. In this
technical report, we highlight Merlion's architecture and major
functionalities, and we report benchmark numbers across different baseline
models and ensembles.
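The evaluation framework described above can be illustrated with a small walk-forward simulation. This is a conceptual sketch only, not Merlion's actual API: the `naive_mean_forecaster` and `simulate_live_deployment` names are hypothetical stand-ins for a real model and for Merlion's evaluation loop, which periodically re-trains a model on the data seen so far and scores its live predictions.

```python
# Conceptual sketch (NOT Merlion's API) of evaluating a model the way the
# abstract describes: walk forward through a series, re-train on all data
# seen so far at a fixed cadence, and score each one-step-ahead prediction.

def naive_mean_forecaster(history):
    """Toy stand-in for a real model: predict the mean of the training window."""
    return sum(history) / len(history)

def simulate_live_deployment(series, train_len, retrain_every):
    """Simulate live deployment with periodic re-training.

    Starts forecasting after `train_len` points, re-fits the model every
    `retrain_every` steps on the full history, and returns the mean
    absolute error of the one-step-ahead forecasts.
    """
    prediction = None
    errors = []
    for t in range(train_len, len(series)):
        # Re-train on the full history at the configured cadence,
        # mimicking periodic re-training in production.
        if prediction is None or (t - train_len) % retrain_every == 0:
            prediction = naive_mean_forecaster(series[:t])
        errors.append(abs(series[t] - prediction))
    return sum(errors) / len(errors)

data = [1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0, 2.0, 1.0, 2.0]
mae = simulate_live_deployment(data, train_len=5, retrain_every=2)
print(f"walk-forward MAE: {mae:.3f}")
```

In the real library the forecaster and the re-training cadence would come from Merlion's model and evaluation configuration; the point here is only the walk-forward structure that separates this setup from a single static train/test split.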
Related papers
- TraM : Enhancing User Sleep Prediction with Transformer-based Multivariate Time Series Modeling and Machine Learning Ensembles [3.565151496245487]
This paper presents a novel approach that leverages Transformer-based multivariate time series model and Machine Learning Ensembles to predict the quality of human sleep, emotional states, and stress levels.
Time Series Transformer was used for labels where time series characteristics are crucial, while Machine Learning Ensembles were employed for labels requiring comprehensive daily activity statistics.
The proposed model, TraM, scored 6.10 out of 10 in experiments, demonstrating superior performance compared to other methodologies.
arXiv Detail & Related papers (2024-10-15T05:29:55Z)
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- Multiple-Resolution Tokenization for Time Series Forecasting with an Application to Pricing [41.94295877935867]
We propose a transformer architecture for time series forecasting with a focus on time series tokenization.
Our architecture aims to learn effective representations at many scales across all available data simultaneously.
We present an application of this model to a real world prediction problem faced by the markdown team at a very large retailer.
arXiv Detail & Related papers (2024-07-03T15:07:16Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- UNITS: A Unified Multi-Task Time Series Model [31.675845788410246]
We introduce UniTS, a multi-task time series model that uses task tokenization to express predictive and generative tasks within a single model.
Across 38 datasets spanning human activity sensors, healthcare, engineering, and finance domains, UniTS model performs favorably against 12 forecasting models, 20 classification models, 18 anomaly detection models, and 16 imputation models.
arXiv Detail & Related papers (2024-02-29T21:25:58Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper addresses the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- MTS-CycleGAN: An Adversarial-based Deep Mapping Learning Network for Multivariate Time Series Domain Adaptation Applied to the Ironmaking Industry [0.0]
This research focuses on translating the specific asset-based historical data (source domain) into data corresponding to one reference asset (target domain).
We propose MTS-CycleGAN, an algorithm for Multivariate Time Series data based on CycleGAN.
Our contribution is the integration in the CycleGAN architecture of a Long Short-Term Memory (LSTM)-based AutoEncoder (AE) for the generator and a stacked LSTM-based discriminator.
arXiv Detail & Related papers (2020-07-15T07:33:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.