Learning deep autoregressive models for hierarchical data
- URL: http://arxiv.org/abs/2104.13853v1
- Date: Wed, 28 Apr 2021 15:58:45 GMT
- Title: Learning deep autoregressive models for hierarchical data
- Authors: Carl R. Andersson, Niklas Wahlström, Thomas B. Schön
- Abstract summary: We propose a model for hierarchically structured data as an extension to the stochastic temporal convolutional network (STCN).
We evaluate the proposed model on two different types of sequential data: speech and handwritten text.
- Score: 0.6445605125467573
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a model for hierarchically structured data as an extension to the
stochastic temporal convolutional network (STCN). The proposed model combines
an autoregressive model with a hierarchical variational autoencoder and
downsampling to achieve superior computational complexity. We evaluate the
proposed model on two different types of sequential data: speech and
handwritten text. The results are promising with the proposed model achieving
state-of-the-art performance.
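The abstract combines three ingredients: autoregressive (causal) processing, a hierarchy of latent levels, and downsampling between levels to reduce computational cost. A minimal sketch of the downsampling idea, using hypothetical helper names and a simple causal moving average in place of the paper's learned convolutions:

```python
# Hypothetical sketch (not the paper's code): hierarchical downsampling
# in the spirit of a stochastic temporal convolutional network. Each
# level halves the temporal resolution, so higher latent levels see
# longer contexts while the per-level cost shrinks geometrically.

def causal_avg(x, kernel=2):
    """Causal moving average: y[t] depends only on x[<=t]."""
    out = []
    for t in range(len(x)):
        window = x[max(0, t - kernel + 1): t + 1]
        out.append(sum(window) / len(window))
    return out

def downsample(x, factor=2):
    """Keep every `factor`-th step, halving the sequence length."""
    return x[::factor]

def hierarchy(x, levels=3):
    """Build the multi-resolution stack a latent hierarchy would sit on."""
    reps = [x]
    for _ in range(levels):
        x = downsample(causal_avg(x))
        reps.append(x)
    return reps

signal = [float(i) for i in range(16)]
reps = hierarchy(signal)
print([len(r) for r in reps])  # → [16, 8, 4, 2]
```

In the actual model each level would apply learned causal convolutions and parameterize a conditional latent distribution rather than a plain average; the sketch only shows why downsampling yields the favorable computational complexity the abstract claims.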
Related papers
- Supervised Score-Based Modeling by Gradient Boosting [49.556736252628745]
We provide a theoretical analysis of learning and sampling for SSM to balance inference time and prediction accuracy.
We propose a Supervised Score-based Model (SSM), which can be viewed as a gradient boosting algorithm combined with score matching.
Our model outperforms existing models in both accuracy and inference time.
arXiv Detail & Related papers (2024-11-02T07:06:53Z) - Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) is a method for feature extraction of two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
arXiv Detail & Related papers (2022-03-23T12:52:49Z) - Deep Variational Models for Collaborative Filtering-based Recommender Systems [63.995130144110156]
Deep learning provides accurate collaborative filtering models to improve recommender system results.
Our proposed models apply the variational concept to inject stochasticity into the latent space of the deep architecture.
Results show the superiority of the proposed approach in scenarios where the variational enrichment exceeds the injected noise effect.
arXiv Detail & Related papers (2021-07-27T08:59:39Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Split Modeling for High-Dimensional Logistic Regression [0.2676349883103404]
A novel method is proposed for building an ensemble logistic classification model.
Our method learns how to exploit the bias-variance trade-off, resulting in excellent prediction accuracy.
An open-source software library implementing the proposed method is discussed.
arXiv Detail & Related papers (2021-02-17T05:57:26Z) - Analysis and tuning of hierarchical topic models based on Renyi entropy approach [5.487882744996213]
Tuning the parameters of hierarchical models, including the number of topics on each hierarchical level, remains a challenging task.
In this paper, we propose a Renyi entropy-based approach for a partial solution to the above problem.
arXiv Detail & Related papers (2021-01-19T12:54:47Z) - Hierarchical Representation via Message Propagation for Robust Model Fitting [28.03005930782681]
We propose a novel hierarchical representation via message propagation (HRMP) method for robust model fitting.
We formulate the consensus information and the preference information as a hierarchical representation to alleviate the sensitivity to gross outliers.
The proposed HRMP can not only accurately estimate the number and parameters of multiple model instances, but also handle multi-structural data contaminated with a large number of outliers.
arXiv Detail & Related papers (2020-12-29T04:14:19Z) - Predictive process mining by network of classifiers and clusterers: the PEDF model [0.0]
The PEDF model learns based on events' sequences, durations, and extra features.
The model requires extracting two sets of data from log files.
arXiv Detail & Related papers (2020-11-22T23:27:19Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - Robust Finite Mixture Regression for Heterogeneous Targets [70.19798470463378]
We propose an FMR model that finds sample clusters and jointly models multiple incomplete mixed-type targets.
We provide non-asymptotic oracle performance bounds for our model under a high-dimensional learning framework.
The results show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2020-10-12T03:27:07Z) - Predicting Multidimensional Data via Tensor Learning [0.0]
We develop a model that retains the intrinsic multidimensional structure of the dataset.
To estimate the model parameters, an Alternating Least Squares algorithm is developed.
The proposed model is able to outperform benchmark models present in the forecasting literature.
arXiv Detail & Related papers (2020-02-11T11:57:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.