Multi-Decoder RNN Autoencoder Based on Variational Bayes Method
- URL: http://arxiv.org/abs/2004.14016v1
- Date: Wed, 29 Apr 2020 08:25:07 GMT
- Title: Multi-Decoder RNN Autoencoder Based on Variational Bayes Method
- Authors: Daisuke Kaji, Kazuho Watanabe, Masahiro Kobayashi
- Abstract summary: We propose a new clustering algorithm focusing on the generative process of the signal with a recurrent neural network and the variational Bayes method.
Our experiments show that the proposed algorithm is not only robust against phase shift, amplitude, and signal length variations but also provides flexible clustering based on the properties of the variational Bayes method.
- Score: 0.3007949058551534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Clustering algorithms have wide applications and play an important role in
data analysis fields including time series data analysis. However, in time
series analysis, most existing algorithms rely on signal shape features or on the
initial value of a neural network's hidden variable. Little has been discussed
about methods based on a generative model of the time series. In this paper,
we propose a new clustering algorithm focusing on the generative process of the
signal with a recurrent neural network and the variational Bayes method. Our
experiments show that the proposed algorithm is not only robust against phase
shift, amplitude, and signal length variations but also provides flexible
clustering based on the properties of the variational Bayes method.
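A minimal PyTorch sketch of the mechanism the abstract describes: a shared RNN encoder, one decoder per cluster, and soft cluster responsibilities updated from per-decoder reconstruction errors in a variational-Bayes-like E-step/M-step alternation. Class names, hyperparameters, and the softmax responsibility update are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a shared GRU encoder, K GRU decoders
# (one per cluster), and soft responsibilities computed from per-decoder
# reconstruction errors, loosely mimicking a variational-Bayes E-step / M-step.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiDecoderRNNAE(nn.Module):
    def __init__(self, input_dim=1, hidden_dim=32, n_clusters=3):
        super().__init__()
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.decoders = nn.ModuleList(
            nn.GRU(input_dim, hidden_dim, batch_first=True) for _ in range(n_clusters)
        )
        self.heads = nn.ModuleList(
            nn.Linear(hidden_dim, input_dim) for _ in range(n_clusters)
        )

    def forward(self, x):                             # x: (batch, time, input_dim)
        _, h = self.encoder(x)                        # shared latent code
        recons = []
        for dec, head in zip(self.decoders, self.heads):
            y, _ = dec(torch.zeros_like(x), h)        # decode from the shared code
            recons.append(head(y))
        return torch.stack(recons, dim=1)             # (batch, K, time, input_dim)

def vb_like_step(model, x, temperature=1.0):
    """Soft cluster responsibilities from reconstruction errors (E-step-like),
    then a responsibility-weighted reconstruction loss (M-step-like)."""
    recons = model(x)
    err = ((recons - x.unsqueeze(1)) ** 2).mean(dim=(2, 3))   # (batch, K)
    with torch.no_grad():
        resp = F.softmax(-err / temperature, dim=1)           # soft assignments
    return (resp * err).sum(dim=1).mean(), resp

# Illustrative usage on random data.
x = torch.randn(16, 50, 1)
model = MultiDecoderRNNAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss, resp = vb_like_step(model, x)
opt.zero_grad(); loss.backward(); opt.step()
```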
Related papers
- Deep Learning and genetic algorithms for cosmological Bayesian inference speed-up [0.0]
We present a novel approach to accelerate the Bayesian inference process, focusing specifically on the nested sampling algorithms.
Our proposed method utilizes the power of deep learning, employing feedforward neural networks to approximate the likelihood function dynamically during the Bayesian inference process.
The implementation integrates with nested sampling algorithms and has been thoroughly evaluated using both simple cosmological dark energy models and diverse observational datasets.
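A toy sketch of the surrogate-likelihood idea described above (illustrative only; the stand-in Gaussian likelihood, network size, and screening step are assumptions, not the paper's integration with nested sampling).

```python
# Toy sketch: fit a feedforward network to (parameters, log-likelihood) pairs
# already evaluated, then use the cheap surrogate to screen new candidate points.
# The Gaussian "exact" likelihood is a stand-in for an expensive cosmological one.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def exact_loglike(theta):
    return -0.5 * np.sum((theta - 0.3) ** 2 / 0.05 ** 2, axis=-1)

train_theta = rng.uniform(0, 1, size=(500, 2))       # points already evaluated exactly
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(train_theta, exact_loglike(train_theta))

candidate = rng.uniform(0, 1, size=(1, 2))           # a proposal from the sampler
approx = surrogate.predict(candidate)[0]             # cheap approximate log-likelihood
print("surrogate log-likelihood:", round(approx, 2))
```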
arXiv Detail & Related papers (2024-05-06T09:14:58Z)
- Embedded feature selection in LSTM networks with multi-objective evolutionary ensemble learning for time series forecasting [49.1574468325115]
We present a novel feature selection method embedded in Long Short-Term Memory networks.
Our approach optimizes the weights and biases of the LSTM in a partitioned manner.
Experimental evaluations on air quality time series data from Italy and southeast Spain demonstrate that our method substantially improves the generalization ability of conventional LSTMs.
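A rough sketch of the embedded-feature-selection idea, with random search over binary input masks standing in for the paper's multi-objective evolutionary ensemble; all names and the toy data are assumptions.

```python
# Rough sketch: candidate binary masks gate the LSTM inputs; each candidate is
# trained briefly and scored, and the best mask is kept. Random search stands in
# for the multi-objective evolutionary ensemble.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_features, seq_len = 8, 24
X = torch.randn(256, seq_len, n_features)
y = X[:, -1, :2].sum(dim=1, keepdim=True)            # target uses only features 0 and 1

def fit_and_score(mask):
    lstm = nn.LSTM(n_features, 16, batch_first=True)
    head = nn.Linear(16, 1)
    opt = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()), lr=1e-2)
    for _ in range(30):                               # short training run per candidate
        out, _ = lstm(X * mask)                       # the mask gates the input features
        loss = ((head(out[:, -1]) - y) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
    return float(loss)                                # a real run would use held-out data

best_mask, best_score = None, float("inf")
for _ in range(5):                                    # random search over feature masks
    mask = (torch.rand(n_features) < 0.5).float()
    score = fit_and_score(mask)
    if score < best_score:
        best_mask, best_score = mask, score
print("selected features:", best_mask.tolist(), "score:", round(best_score, 4))
```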
arXiv Detail & Related papers (2023-12-29T08:42:10Z)
- Detection of Anomalies in Multivariate Time Series Using Ensemble Techniques [3.2422067155309806]
We propose an ensemble technique that combines the outputs of multiple base models to reach the final decision.
A semi-supervised approach using a Logistic Regressor to combine the base models' outputs is also proposed.
The performance improvement in terms of anomaly detection accuracy reaches 2% for the unsupervised and at least 10% for the semi-supervised models.
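A minimal sketch of the stacking idea, assuming two off-the-shelf unsupervised detectors and a small labelled subset for the logistic-regression combiner; the choice of base models and the toy data are illustrative, not the paper's setup.

```python
# Minimal sketch: two unsupervised detectors produce anomaly scores, which a
# logistic regressor trained on a small labelled subset combines into the final
# decision.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.svm import OneClassSVM
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (950, 5)), rng.normal(4, 1, (50, 5))])
y = np.r_[np.zeros(950), np.ones(50)]                          # 1 = anomaly

iso = IsolationForest(random_state=0).fit(X)
svm = OneClassSVM(nu=0.05).fit(X)
meta = np.column_stack([-iso.score_samples(X), -svm.score_samples(X)])

# Semi-supervised: labels are assumed known only for a small subset of points.
labelled = np.r_[rng.choice(950, 90, replace=False), 950 + rng.choice(50, 10, replace=False)]
combiner = LogisticRegression().fit(meta[labelled], y[labelled])

final = combiner.predict(meta)                                  # ensemble decision
print("flagged anomalies:", int(final.sum()))
```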
arXiv Detail & Related papers (2023-08-06T17:51:22Z)
- Custom DNN using Reward Modulated Inverted STDP Learning for Temporal Pattern Recognition [0.0]
Temporal spike recognition plays a crucial role in various domains, including anomaly detection, keyword spotting and neuroscience.
This paper presents a novel algorithm for efficient temporal spike pattern recognition on sparse event series data.
arXiv Detail & Related papers (2023-07-15T18:57:27Z)
- Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior precision matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z)
- Autoencoder Based Iterative Modeling and Multivariate Time-Series Subsequence Clustering Algorithm [0.0]
This paper introduces an algorithm for the detection of change-points and the identification of the corresponding subsequences in transient multivariate time-series data (MTSD).
We use a recurrent neural network (RNN) based Autoencoder (AE) which is iteratively trained on incoming data.
A model of the identified subsequence is saved and used for recognition of repeating subsequences as well as fast offline clustering.
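An illustrative sketch of using windowed reconstruction error for change-point detection; the smoothing function stands in for the trained RNN autoencoder, and all names and thresholds are hypothetical.

```python
# Illustrative sketch: reconstruction error over sliding windows; an error above
# a baseline calibrated on the initial windows marks a candidate change point.
# The smoothing lambda stands in for a trained RNN autoencoder.
import numpy as np

def window_errors(signal, reconstruct, width=50, step=10):
    starts = np.arange(0, len(signal) - width + 1, step)
    errs = np.array([np.mean((signal[s:s + width] - reconstruct(signal[s:s + width])) ** 2)
                     for s in starts])
    return starts, errs

reconstruct = lambda w: np.convolve(w, np.ones(5) / 5, mode="same")   # toy "autoencoder"

signal = np.r_[np.sin(np.linspace(0, 20, 500)),      # regime A: slow oscillation
               np.sin(np.linspace(0, 200, 500))]     # regime B: fast oscillation
starts, errs = window_errors(signal, reconstruct)

baseline = errs[:10]                                  # assume the stream starts "normal"
threshold = baseline.mean() + 5 * baseline.std()
crossings = starts[errs > threshold]
print("first candidate change point near sample:", crossings[0] if len(crossings) else None)
```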
arXiv Detail & Related papers (2022-09-09T09:59:56Z)
- Scalable computation of prediction intervals for neural networks via matrix sketching [79.44177623781043]
Existing algorithms for uncertainty estimation require modifying the model architecture and training procedure.
This work proposes a new algorithm that can be applied to a given trained neural network and produces approximate prediction intervals.
arXiv Detail & Related papers (2022-05-06T13:18:31Z)
- Connecting Weighted Automata, Tensor Networks and Recurrent Neural Networks through Spectral Learning [58.14930566993063]
We present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks.
We introduce the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors.
arXiv Detail & Related papers (2020-10-19T15:28:00Z)
- Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
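As a concrete illustration of the mean-field assumption only (not the paper's fusion algorithm), fusing independent diagonal-Gaussian posteriors over the same parameters reduces to a precision-weighted combination.

```python
# Concrete instance of the mean-field idea: fuse independent diagonal-Gaussian
# posteriors over the same parameters by precision-weighted averaging.
import numpy as np

def fuse_diagonal_gaussians(means, variances):
    """means, variances: arrays of shape (n_models, n_params)."""
    precisions = 1.0 / np.asarray(variances)
    fused_var = 1.0 / precisions.sum(axis=0)
    fused_mean = fused_var * (precisions * np.asarray(means)).sum(axis=0)
    return fused_mean, fused_var

# Two posteriors over the same three parameters, learned on different datasets.
means = [[0.9, -0.2, 1.5], [1.1, 0.0, 1.3]]
variances = [[0.04, 0.10, 0.05], [0.02, 0.08, 0.20]]
print(fuse_diagonal_gaussians(means, variances))
```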
arXiv Detail & Related papers (2020-07-13T03:27:45Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
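A hedged sketch of the forward/backward consistency ingredient: linear latent operators trained with prediction losses in both time directions plus a penalty that encourages them to invert each other. The architecture, loss weights, and toy data are assumptions, not the authors' model.

```python
# Hedged sketch: linear forward (C) and backward (D) operators in latent space,
# trained with prediction losses in both directions plus a consistency penalty
# pushing C and D to invert each other.
import torch
import torch.nn as nn

latent = 8
encoder = nn.Sequential(nn.Linear(3, 32), nn.Tanh(), nn.Linear(32, latent))
decoder = nn.Sequential(nn.Linear(latent, 32), nn.Tanh(), nn.Linear(32, 3))
C = nn.Linear(latent, latent, bias=False)             # forward latent dynamics
D = nn.Linear(latent, latent, bias=False)             # backward latent dynamics

def total_loss(x_t, x_next):
    z_t, z_next = encoder(x_t), encoder(x_next)
    fwd = ((decoder(C(z_t)) - x_next) ** 2).mean()     # predict one step forward
    bwd = ((decoder(D(z_next)) - x_t) ** 2).mean()     # predict one step backward
    consistency = ((C.weight @ D.weight - torch.eye(latent)) ** 2).mean()
    return fwd + bwd + 0.1 * consistency

x_t, x_next = torch.randn(64, 3), torch.randn(64, 3)   # toy state pairs
params = [*encoder.parameters(), *decoder.parameters(), *C.parameters(), *D.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
loss = total_loss(x_t, x_next)
opt.zero_grad(); loss.backward(); opt.step()
print(float(loss))
```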
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
- Generalization of Change-Point Detection in Time Series Data Based on Direct Density Ratio Estimation [1.929039244357139]
We show how existing algorithms can be generalized using various binary classification and regression models.
The algorithms are tested on several synthetic and real-world datasets.
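An illustrative sketch of the classifier-based density-ratio idea: a probabilistic classifier separating pre- and post-candidate samples yields a density-ratio estimate whose average log value serves as the change score. The Gaussian toy data and the logistic-regression choice are assumptions, not the paper's exact estimator.

```python
# Illustrative sketch: a classifier separating "before" from "after" samples
# gives a density-ratio estimate; its average log value is the change score
# (with equal sample sizes no prior-ratio correction is needed).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
reference = rng.normal(0.0, 1.0, size=(300, 1))        # samples before the candidate point
test = rng.normal(1.5, 1.0, size=(300, 1))             # samples after the candidate point

X = np.vstack([reference, test])
y = np.r_[np.zeros(len(reference)), np.ones(len(test))]
clf = LogisticRegression().fit(X, y)

proba = clf.predict_proba(test)                          # columns follow clf.classes_: p(before|x), p(after|x)
ratio = proba[:, 1] / np.clip(proba[:, 0], 1e-12, None)  # ~ p_after(x) / p_before(x)
score = np.log(ratio).mean()                             # large score -> change detected
print("change score:", round(score, 3))
```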
arXiv Detail & Related papers (2020-01-17T15:45:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.