Closed-form Continuous-Depth Models
- URL: http://arxiv.org/abs/2106.13898v1
- Date: Fri, 25 Jun 2021 22:08:51 GMT
- Title: Closed-form Continuous-Depth Models
- Authors: Ramin Hasani, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Max
Tschaikowski, Gerald Teschl, Daniela Rus
- Abstract summary: Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
- Score: 99.40335716948101
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Continuous-depth neural models, where the derivative of the model's hidden
state is defined by a neural network, have enabled strong sequential data
processing capabilities. However, these models rely on advanced numerical
differential equation (DE) solvers resulting in a significant overhead both in
terms of computational cost and model complexity. In this paper, we present a
new family of models, termed Closed-form Continuous-depth (CfC) networks, that
are simple to describe and at least one order of magnitude faster while
exhibiting equally strong modeling abilities compared to their ODE-based
counterparts. The models are hereby derived from the analytical closed-form
solution of an expressive subset of time-continuous models, thus alleviating
the need for complex DE solvers altogether. In our experimental evaluations,
we demonstrate that CfC networks outperform advanced recurrent models on a
diverse set of time-series prediction tasks, including those with long-term
dependencies and irregularly sampled data. We believe our findings open new
opportunities to train and deploy rich, continuous neural models in
resource-constrained settings, which demand both performance and efficiency.
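The core idea, replacing numerical integration with a closed-form, time-gated blend of candidate states, can be sketched as follows. This is a minimal illustration of the gated closed-form update reported in the paper, with the learned neural heads f, g, h reduced to single linear maps; all names, shapes, and weights here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfc_cell(x, u, t, Wf, Wg, Wh):
    """One closed-form continuous-depth (CfC-style) state update.

    Instead of integrating dx/dt = f(x, u) with a numerical ODE solver,
    the hidden state after elapsed time t is computed in closed form as
    a time-dependent gated blend of two candidate states:

        x(t) = sigmoid(-f * t) * g + (1 - sigmoid(-f * t)) * h

    where f, g, h are here simple linear heads over [x, u]; the paper
    learns neural networks for each.
    """
    z = np.concatenate([x, u])    # joint state/input features
    f = z @ Wf                    # time-constant head
    g = z @ Wg                    # candidate state dominating near t = 0
    h = z @ Wh                    # candidate state dominating for large t
    gate = sigmoid(-f * t)        # closed-form time gate, no solver steps
    return gate * g + (1.0 - gate) * h

# Toy usage: irregularly sampled inputs just change t; each update is
# a single closed-form evaluation rather than an adaptive solver call.
rng = np.random.default_rng(0)
dim, in_dim = 4, 3
Wf, Wg, Wh = (rng.normal(size=(dim + in_dim, dim)) for _ in range(3))
x = np.zeros(dim)
for t in [0.1, 0.5, 2.0]:         # arbitrary elapsed times between samples
    x = cfc_cell(x, rng.normal(size=in_dim), t, Wf, Wg, Wh)
```

Because the update is a fixed number of matrix products and a sigmoid, its cost per step is constant, which is the source of the order-of-magnitude speedup over solver-based continuous-depth models claimed in the abstract.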
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Neural Network-Based Piecewise Survival Models [0.3999851878220878]
A family of neural network-based survival models is presented.
The models can be seen as an extension of the commonly used discrete-time and piecewise exponential models.
arXiv Detail & Related papers (2024-03-27T15:08:00Z)
- Learning Space-Time Continuous Neural PDEs from Partially Observed States [13.01244901400942]
We introduce a grid-independent model learning partial differential equations (PDEs) from noisy and partial observations on irregular grids.
We propose a space-time continuous latent neural PDE model with an efficient probabilistic framework and a novel design encoder for improved data efficiency and grid independence.
arXiv Detail & Related papers (2023-07-09T06:53:59Z)
- Learning PDE Solution Operator for Continuous Modeling of Time-Series [1.39661494747879]
This work presents a partial differential equation (PDE) based framework which improves the dynamics modeling capability.
We propose a neural operator that can handle time continuously without requiring iterative operations or specific grids of temporal discretization.
Our framework opens up a new way for a continuous representation of neural networks that can be readily adopted for real-world applications.
arXiv Detail & Related papers (2023-02-02T03:47:52Z)
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% less parameters compared to the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Neural Closure Models for Dynamical Systems [35.000303827255024]
We develop a novel methodology to learn non-Markovian closure parameterizations for low-fidelity models.
New "neural closure models" augment low-fidelity models with neural delay differential equations (nDDEs).
We show that using non-Markovian over Markovian closures improves long-term accuracy and requires smaller networks.
arXiv Detail & Related papers (2020-12-27T05:55:33Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine learning inspired models and physics-based models.
We use such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.